Nov 28 06:42:57 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Nov 28 06:42:57 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 28 06:42:57 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 06:42:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 28 06:42:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 28 06:42:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 28 06:42:57 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 28 06:42:57 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Nov 28 06:42:57 localhost kernel: signal: max sigframe size: 1776
Nov 28 06:42:57 localhost kernel: BIOS-provided physical RAM map:
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 28 06:42:57 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Nov 28 06:42:57 localhost kernel: NX (Execute Disable) protection: active
Nov 28 06:42:57 localhost kernel: SMBIOS 2.8 present.
Nov 28 06:42:57 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 28 06:42:57 localhost kernel: Hypervisor detected: KVM
Nov 28 06:42:57 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 28 06:42:57 localhost kernel: kvm-clock: using sched offset of 1784718827 cycles
Nov 28 06:42:57 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 28 06:42:57 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 28 06:42:57 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 28 06:42:57 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 28 06:42:57 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Nov 28 06:42:57 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 28 06:42:57 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 28 06:42:57 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 28 06:42:57 localhost kernel: Using GB pages for direct mapping
Nov 28 06:42:57 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Nov 28 06:42:57 localhost kernel: ACPI: Early table checksum verification disabled
Nov 28 06:42:57 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 28 06:42:57 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:57 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:57 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:57 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 28 06:42:57 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:57 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 06:42:57 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 28 06:42:57 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 28 06:42:57 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 28 06:42:57 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 28 06:42:57 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 28 06:42:57 localhost kernel: No NUMA configuration found
Nov 28 06:42:57 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Nov 28 06:42:57 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Nov 28 06:42:57 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Nov 28 06:42:57 localhost kernel: Zone ranges:
Nov 28 06:42:57 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 28 06:42:57 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 28 06:42:57 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Nov 28 06:42:57 localhost kernel:   Device   empty
Nov 28 06:42:57 localhost kernel: Movable zone start for each node
Nov 28 06:42:57 localhost kernel: Early memory node ranges
Nov 28 06:42:57 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 28 06:42:57 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 28 06:42:57 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Nov 28 06:42:57 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Nov 28 06:42:57 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 28 06:42:57 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 28 06:42:57 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 28 06:42:57 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 28 06:42:57 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 28 06:42:57 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 28 06:42:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 28 06:42:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 28 06:42:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 28 06:42:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 28 06:42:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 28 06:42:57 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 28 06:42:57 localhost kernel: TSC deadline timer available
Nov 28 06:42:57 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 28 06:42:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 28 06:42:57 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 28 06:42:57 localhost kernel: Booting paravirtualized kernel on KVM
Nov 28 06:42:57 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 28 06:42:57 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 28 06:42:57 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Nov 28 06:42:57 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Nov 28 06:42:57 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 28 06:42:57 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 28 06:42:57 localhost kernel: Fallback order for Node 0: 0 
Nov 28 06:42:57 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Nov 28 06:42:57 localhost kernel: Policy zone: Normal
Nov 28 06:42:57 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 06:42:57 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Nov 28 06:42:57 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Nov 28 06:42:57 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 28 06:42:57 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 28 06:42:57 localhost kernel: software IO TLB: area num 8.
Nov 28 06:42:57 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Nov 28 06:42:57 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Nov 28 06:42:57 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 28 06:42:57 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Nov 28 06:42:57 localhost kernel: ftrace: allocated 176 pages with 3 groups
Nov 28 06:42:57 localhost kernel: Dynamic Preempt: voluntary
Nov 28 06:42:57 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 28 06:42:57 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 28 06:42:57 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 28 06:42:57 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 28 06:42:57 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 28 06:42:57 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 28 06:42:57 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 28 06:42:57 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 28 06:42:57 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 28 06:42:57 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 28 06:42:57 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Nov 28 06:42:57 localhost kernel: Console: colour VGA+ 80x25
Nov 28 06:42:57 localhost kernel: printk: console [tty0] enabled
Nov 28 06:42:57 localhost kernel: printk: console [ttyS0] enabled
Nov 28 06:42:57 localhost kernel: ACPI: Core revision 20211217
Nov 28 06:42:57 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 28 06:42:57 localhost kernel: x2apic enabled
Nov 28 06:42:57 localhost kernel: Switched APIC routing to physical x2apic.
Nov 28 06:42:57 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 28 06:42:57 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 28 06:42:57 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 28 06:42:57 localhost kernel: LSM: Security Framework initializing
Nov 28 06:42:57 localhost kernel: Yama: becoming mindful.
Nov 28 06:42:57 localhost kernel: SELinux:  Initializing.
Nov 28 06:42:57 localhost kernel: LSM support for eBPF active
Nov 28 06:42:57 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 28 06:42:57 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 28 06:42:57 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 28 06:42:57 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 28 06:42:57 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 28 06:42:57 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 28 06:42:57 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 28 06:42:57 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Nov 28 06:42:57 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Nov 28 06:42:57 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 28 06:42:57 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 28 06:42:57 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 28 06:42:57 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 28 06:42:57 localhost kernel: Freeing SMP alternatives memory: 36K
Nov 28 06:42:57 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 28 06:42:57 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Nov 28 06:42:57 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 06:42:57 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 06:42:57 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 06:42:57 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 28 06:42:57 localhost kernel: ... version:                0
Nov 28 06:42:57 localhost kernel: ... bit width:              48
Nov 28 06:42:57 localhost kernel: ... generic registers:      6
Nov 28 06:42:57 localhost kernel: ... value mask:             0000ffffffffffff
Nov 28 06:42:57 localhost kernel: ... max period:             00007fffffffffff
Nov 28 06:42:57 localhost kernel: ... fixed-purpose events:   0
Nov 28 06:42:57 localhost kernel: ... event mask:             000000000000003f
Nov 28 06:42:57 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 28 06:42:57 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 28 06:42:57 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 28 06:42:57 localhost kernel: x86: Booting SMP configuration:
Nov 28 06:42:57 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 28 06:42:57 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 28 06:42:57 localhost kernel: smpboot: Max logical packages: 8
Nov 28 06:42:57 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 28 06:42:57 localhost kernel: node 0 deferred pages initialised in 25ms
Nov 28 06:42:57 localhost kernel: devtmpfs: initialized
Nov 28 06:42:57 localhost kernel: x86/mm: Memory block size: 128MB
Nov 28 06:42:57 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 28 06:42:57 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 28 06:42:57 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 28 06:42:57 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 28 06:42:57 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Nov 28 06:42:57 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 28 06:42:57 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 28 06:42:57 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 28 06:42:57 localhost kernel: audit: type=2000 audit(1764312176.109:1): state=initialized audit_enabled=0 res=1
Nov 28 06:42:57 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 28 06:42:57 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 28 06:42:57 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 28 06:42:57 localhost kernel: cpuidle: using governor menu
Nov 28 06:42:57 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Nov 28 06:42:57 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 28 06:42:57 localhost kernel: PCI: Using configuration type 1 for base access
Nov 28 06:42:57 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 28 06:42:57 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 28 06:42:57 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Nov 28 06:42:57 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Nov 28 06:42:57 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Nov 28 06:42:57 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Nov 28 06:42:57 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Nov 28 06:42:57 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 28 06:42:57 localhost kernel: ACPI: Interpreter enabled
Nov 28 06:42:57 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 28 06:42:57 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 28 06:42:57 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 28 06:42:57 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 28 06:42:57 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 28 06:42:57 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 28 06:42:57 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [3] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [4] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [5] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [6] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [7] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [8] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [9] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [10] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [11] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [12] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [13] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [14] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [15] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [16] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [17] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [18] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [19] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [20] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [21] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [22] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [23] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [24] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [25] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [26] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [27] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [28] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [29] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [30] registered
Nov 28 06:42:57 localhost kernel: acpiphp: Slot [31] registered
Nov 28 06:42:57 localhost kernel: PCI host bridge to bus 0000:00
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Nov 28 06:42:57 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Nov 28 06:42:57 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Nov 28 06:42:57 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Nov 28 06:42:57 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Nov 28 06:42:57 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 28 06:42:57 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Nov 28 06:42:57 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Nov 28 06:42:57 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 28 06:42:57 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 28 06:42:57 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 28 06:42:57 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 28 06:42:57 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 28 06:42:57 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 28 06:42:57 localhost kernel: iommu: Default domain type: Translated 
Nov 28 06:42:57 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Nov 28 06:42:57 localhost kernel: SCSI subsystem initialized
Nov 28 06:42:57 localhost kernel: ACPI: bus type USB registered
Nov 28 06:42:57 localhost kernel: usbcore: registered new interface driver usbfs
Nov 28 06:42:57 localhost kernel: usbcore: registered new interface driver hub
Nov 28 06:42:57 localhost kernel: usbcore: registered new device driver usb
Nov 28 06:42:57 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 28 06:42:57 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 28 06:42:57 localhost kernel: PTP clock support registered
Nov 28 06:42:57 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 28 06:42:57 localhost kernel: NetLabel: Initializing
Nov 28 06:42:57 localhost kernel: NetLabel:  domain hash size = 128
Nov 28 06:42:57 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 28 06:42:57 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 28 06:42:57 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 28 06:42:57 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 28 06:42:57 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 28 06:42:57 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 28 06:42:57 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 28 06:42:57 localhost kernel: vgaarb: loaded
Nov 28 06:42:57 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 28 06:42:57 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 28 06:42:57 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 28 06:42:57 localhost kernel: pnp: PnP ACPI init
Nov 28 06:42:57 localhost kernel: pnp 00:03: [dma 2]
Nov 28 06:42:57 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 28 06:42:57 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 28 06:42:57 localhost kernel: NET: Registered PF_INET protocol family
Nov 28 06:42:57 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 28 06:42:57 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Nov 28 06:42:57 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 28 06:42:57 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 28 06:42:57 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 28 06:42:57 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Nov 28 06:42:57 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Nov 28 06:42:57 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 28 06:42:57 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 28 06:42:57 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 28 06:42:57 localhost kernel: NET: Registered PF_XDP protocol family
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 28 06:42:57 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 28 06:42:57 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 28 06:42:57 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 28 06:42:57 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 29825 usecs
Nov 28 06:42:57 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 28 06:42:57 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 28 06:42:57 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 28 06:42:57 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 28 06:42:57 localhost kernel: ACPI: bus type thunderbolt registered
Nov 28 06:42:57 localhost kernel: Initialise system trusted keyrings
Nov 28 06:42:57 localhost kernel: Key type blacklist registered
Nov 28 06:42:57 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Nov 28 06:42:57 localhost kernel: zbud: loaded
Nov 28 06:42:57 localhost kernel: integrity: Platform Keyring initialized
Nov 28 06:42:57 localhost kernel: NET: Registered PF_ALG protocol family
Nov 28 06:42:57 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 28 06:42:57 localhost kernel: Key type asymmetric registered
Nov 28 06:42:57 localhost kernel: Asymmetric key parser 'x509' registered
Nov 28 06:42:57 localhost kernel: Running certificate verification selftests
Nov 28 06:42:57 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 28 06:42:57 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 28 06:42:57 localhost kernel: io scheduler mq-deadline registered
Nov 28 06:42:57 localhost kernel: io scheduler kyber registered
Nov 28 06:42:57 localhost kernel: io scheduler bfq registered
Nov 28 06:42:57 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 28 06:42:57 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 28 06:42:57 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 28 06:42:57 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 28 06:42:57 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 28 06:42:57 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 28 06:42:57 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 28 06:42:57 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 28 06:42:57 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 28 06:42:57 localhost kernel: Non-volatile memory driver v1.3
Nov 28 06:42:57 localhost kernel: rdac: device handler registered
Nov 28 06:42:57 localhost kernel: hp_sw: device handler registered
Nov 28 06:42:57 localhost kernel: emc: device handler registered
Nov 28 06:42:57 localhost kernel: alua: device handler registered
Nov 28 06:42:57 localhost kernel: libphy: Fixed MDIO Bus: probed
Nov 28 06:42:57 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Nov 28 06:42:57 localhost kernel: ehci-pci: EHCI PCI platform driver
Nov 28 06:42:57 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Nov 28 06:42:57 localhost kernel: ohci-pci: OHCI PCI platform driver
Nov 28 06:42:57 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Nov 28 06:42:57 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 28 06:42:57 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 28 06:42:57 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 28 06:42:57 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 28 06:42:57 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 28 06:42:57 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 28 06:42:57 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 28 06:42:57 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Nov 28 06:42:57 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 28 06:42:57 localhost kernel: hub 1-0:1.0: USB hub found
Nov 28 06:42:57 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 28 06:42:57 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 28 06:42:57 localhost kernel: usbserial: USB Serial support registered for generic
Nov 28 06:42:57 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 28 06:42:57 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 28 06:42:57 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 28 06:42:57 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 28 06:42:57 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 28 06:42:57 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 28 06:42:57 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 28 06:42:57 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-28T06:42:56 UTC (1764312176)
Nov 28 06:42:57 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 28 06:42:57 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 28 06:42:57 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 28 06:42:57 localhost kernel: usbcore: registered new interface driver usbhid
Nov 28 06:42:57 localhost kernel: usbhid: USB HID core driver
Nov 28 06:42:57 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 28 06:42:57 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 28 06:42:57 localhost kernel: Initializing XFRM netlink socket
Nov 28 06:42:57 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 28 06:42:57 localhost kernel: Segment Routing with IPv6
Nov 28 06:42:57 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 28 06:42:57 localhost kernel: mpls_gso: MPLS GSO support
Nov 28 06:42:57 localhost kernel: IPI shorthand broadcast: enabled
Nov 28 06:42:57 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 28 06:42:57 localhost kernel: AES CTR mode by8 optimization enabled
Nov 28 06:42:57 localhost kernel: sched_clock: Marking stable (758975655, 180051824)->(1069299812, -130272333)
Nov 28 06:42:57 localhost kernel: registered taskstats version 1
Nov 28 06:42:57 localhost kernel: Loading compiled-in X.509 certificates
Nov 28 06:42:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 28 06:42:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 28 06:42:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 28 06:42:57 localhost kernel: zswap: loaded using pool lzo/zbud
Nov 28 06:42:57 localhost kernel: page_owner is disabled
Nov 28 06:42:57 localhost kernel: Key type big_key registered
Nov 28 06:42:57 localhost kernel: Freeing initrd memory: 74232K
Nov 28 06:42:57 localhost kernel: Key type encrypted registered
Nov 28 06:42:57 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 28 06:42:57 localhost kernel: Loading compiled-in module X.509 certificates
Nov 28 06:42:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 28 06:42:57 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 28 06:42:57 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 28 06:42:57 localhost kernel: ima: No architecture policies found
Nov 28 06:42:57 localhost kernel: evm: Initialising EVM extended attributes:
Nov 28 06:42:57 localhost kernel: evm: security.selinux
Nov 28 06:42:57 localhost kernel: evm: security.SMACK64 (disabled)
Nov 28 06:42:57 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 28 06:42:57 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 28 06:42:57 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 28 06:42:57 localhost kernel: evm: security.apparmor (disabled)
Nov 28 06:42:57 localhost kernel: evm: security.ima
Nov 28 06:42:57 localhost kernel: evm: security.capability
Nov 28 06:42:57 localhost kernel: evm: HMAC attrs: 0x1
Nov 28 06:42:57 localhost kernel: Freeing unused decrypted memory: 2036K
Nov 28 06:42:57 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Nov 28 06:42:57 localhost kernel: Write protecting the kernel read-only data: 26624k
Nov 28 06:42:57 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 28 06:42:57 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 28 06:42:57 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Nov 28 06:42:57 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 28 06:42:57 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Nov 28 06:42:57 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 28 06:42:57 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 28 06:42:57 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 28 06:42:57 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 28 06:42:57 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 28 06:42:57 localhost kernel: Run /init as init process
Nov 28 06:42:57 localhost kernel:   with arguments:
Nov 28 06:42:57 localhost kernel:     /init
Nov 28 06:42:57 localhost kernel:   with environment:
Nov 28 06:42:57 localhost kernel:     HOME=/
Nov 28 06:42:57 localhost kernel:     TERM=linux
Nov 28 06:42:57 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Nov 28 06:42:57 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 06:42:57 localhost systemd[1]: Detected virtualization kvm.
Nov 28 06:42:57 localhost systemd[1]: Detected architecture x86-64.
Nov 28 06:42:57 localhost systemd[1]: Running in initrd.
Nov 28 06:42:57 localhost systemd[1]: No hostname configured, using default hostname.
Nov 28 06:42:57 localhost systemd[1]: Hostname set to <localhost>.
Nov 28 06:42:57 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 28 06:42:57 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 28 06:42:57 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 06:42:57 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 28 06:42:57 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 28 06:42:57 localhost systemd[1]: Reached target Local File Systems.
Nov 28 06:42:57 localhost systemd[1]: Reached target Path Units.
Nov 28 06:42:57 localhost systemd[1]: Reached target Slice Units.
Nov 28 06:42:57 localhost systemd[1]: Reached target Swaps.
Nov 28 06:42:57 localhost systemd[1]: Reached target Timer Units.
Nov 28 06:42:57 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 06:42:57 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 28 06:42:57 localhost systemd[1]: Listening on Journal Socket.
Nov 28 06:42:57 localhost systemd[1]: Listening on udev Control Socket.
Nov 28 06:42:57 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 28 06:42:57 localhost systemd[1]: Reached target Socket Units.
Nov 28 06:42:57 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 28 06:42:57 localhost systemd[1]: Starting Journal Service...
Nov 28 06:42:57 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 06:42:57 localhost systemd[1]: Starting Create System Users...
Nov 28 06:42:57 localhost systemd[1]: Starting Setup Virtual Console...
Nov 28 06:42:57 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 06:42:57 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 06:42:57 localhost systemd-journald[284]: Journal started
Nov 28 06:42:57 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/4c358f0e7e1544e5bde2714780d05a92) is 8.0M, max 314.7M, 306.7M free.
Nov 28 06:42:57 localhost systemd-modules-load[285]: Module 'msr' is built in
Nov 28 06:42:57 localhost systemd[1]: Started Journal Service.
Nov 28 06:42:57 localhost systemd[1]: Finished Setup Virtual Console.
Nov 28 06:42:57 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 28 06:42:57 localhost systemd[1]: Starting dracut cmdline hook...
Nov 28 06:42:57 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 06:42:57 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Nov 28 06:42:57 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Nov 28 06:42:57 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Nov 28 06:42:57 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 28 06:42:57 localhost systemd[1]: Finished Create System Users.
Nov 28 06:42:57 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 06:42:57 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 06:42:57 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 06:42:57 localhost dracut-cmdline[291]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Nov 28 06:42:57 localhost dracut-cmdline[291]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 06:42:57 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 06:42:57 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 06:42:57 localhost systemd[1]: Finished dracut cmdline hook.
Nov 28 06:42:57 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 28 06:42:57 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 28 06:42:57 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 28 06:42:57 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Nov 28 06:42:57 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 28 06:42:57 localhost kernel: RPC: Registered udp transport module.
Nov 28 06:42:57 localhost kernel: RPC: Registered tcp transport module.
Nov 28 06:42:57 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 28 06:42:57 localhost rpc.statd[410]: Version 2.5.4 starting
Nov 28 06:42:57 localhost rpc.statd[410]: Initializing NSM state
Nov 28 06:42:57 localhost rpc.idmapd[415]: Setting log level to 0
Nov 28 06:42:58 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 28 06:42:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 06:42:58 localhost systemd-udevd[428]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 06:42:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 06:42:58 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 28 06:42:58 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 28 06:42:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 28 06:42:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 28 06:42:58 localhost systemd[1]: Reached target System Initialization.
Nov 28 06:42:58 localhost systemd[1]: Reached target Basic System.
Nov 28 06:42:58 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 06:42:58 localhost systemd[1]: Reached target Network.
Nov 28 06:42:58 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 06:42:58 localhost systemd[1]: Starting dracut initqueue hook...
Nov 28 06:42:58 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Nov 28 06:42:58 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Nov 28 06:42:58 localhost kernel: GPT:20971519 != 838860799
Nov 28 06:42:58 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Nov 28 06:42:58 localhost kernel: GPT:20971519 != 838860799
Nov 28 06:42:58 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Nov 28 06:42:58 localhost kernel:  vda: vda1 vda2 vda3 vda4
Nov 28 06:42:58 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 28 06:42:58 localhost kernel: libata version 3.00 loaded.
Nov 28 06:42:58 localhost systemd-udevd[460]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 06:42:58 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 28 06:42:58 localhost kernel: scsi host0: ata_piix
Nov 28 06:42:58 localhost kernel: scsi host1: ata_piix
Nov 28 06:42:58 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Nov 28 06:42:58 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Nov 28 06:42:58 localhost systemd[1]: Reached target Initrd Root Device.
Nov 28 06:42:58 localhost kernel: ata1: found unknown device (class 0)
Nov 28 06:42:58 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 28 06:42:58 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 28 06:42:58 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 28 06:42:58 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 28 06:42:58 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 28 06:42:58 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 28 06:42:58 localhost systemd[1]: Finished dracut initqueue hook.
Nov 28 06:42:58 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 06:42:58 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 28 06:42:58 localhost systemd[1]: Reached target Remote File Systems.
Nov 28 06:42:58 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 28 06:42:58 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 28 06:42:58 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Nov 28 06:42:58 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Nov 28 06:42:58 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 28 06:42:58 localhost systemd[1]: Mounting /sysroot...
Nov 28 06:42:58 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 28 06:42:58 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Nov 28 06:42:58 localhost kernel: XFS (vda4): Ending clean mount
Nov 28 06:42:58 localhost systemd[1]: Mounted /sysroot.
Nov 28 06:42:58 localhost systemd[1]: Reached target Initrd Root File System.
Nov 28 06:42:58 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 28 06:42:58 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 28 06:42:58 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 28 06:42:58 localhost systemd[1]: Reached target Initrd File Systems.
Nov 28 06:42:58 localhost systemd[1]: Reached target Initrd Default Target.
Nov 28 06:42:58 localhost systemd[1]: Starting dracut mount hook...
Nov 28 06:42:58 localhost systemd[1]: Finished dracut mount hook.
Nov 28 06:42:58 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 28 06:42:59 localhost rpc.idmapd[415]: exiting on signal 15
Nov 28 06:42:59 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 28 06:42:59 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 28 06:42:59 localhost systemd[1]: Stopped target Network.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Timer Units.
Nov 28 06:42:59 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 28 06:42:59 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Basic System.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Path Units.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Remote File Systems.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Slice Units.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Socket Units.
Nov 28 06:42:59 localhost systemd[1]: Stopped target System Initialization.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Local File Systems.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Swaps.
Nov 28 06:42:59 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut mount hook.
Nov 28 06:42:59 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 28 06:42:59 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 28 06:42:59 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 28 06:42:59 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 28 06:42:59 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 28 06:42:59 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 28 06:42:59 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 28 06:42:59 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 28 06:42:59 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 06:42:59 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 28 06:42:59 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 06:42:59 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 28 06:42:59 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Closed udev Control Socket.
Nov 28 06:42:59 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Closed udev Kernel Socket.
Nov 28 06:42:59 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 28 06:42:59 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 28 06:42:59 localhost systemd[1]: Starting Cleanup udev Database...
Nov 28 06:42:59 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 28 06:42:59 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 28 06:42:59 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Create System Users.
Nov 28 06:42:59 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Finished Cleanup udev Database.
Nov 28 06:42:59 localhost systemd[1]: Reached target Switch Root.
Nov 28 06:42:59 localhost systemd[1]: Starting Switch Root...
Nov 28 06:42:59 localhost systemd[1]: Switching root.
Nov 28 06:42:59 localhost systemd-journald[284]: Journal stopped
Nov 28 06:42:59 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Nov 28 06:42:59 localhost kernel: audit: type=1404 audit(1764312179.318:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability open_perms=1
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 06:42:59 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 06:42:59 localhost kernel: audit: type=1403 audit(1764312179.410:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 28 06:42:59 localhost systemd[1]: Successfully loaded SELinux policy in 95.699ms.
Nov 28 06:42:59 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 24.951ms.
Nov 28 06:42:59 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 06:42:59 localhost systemd[1]: Detected virtualization kvm.
Nov 28 06:42:59 localhost systemd[1]: Detected architecture x86-64.
Nov 28 06:42:59 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 06:42:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 06:42:59 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped Switch Root.
Nov 28 06:42:59 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 28 06:42:59 localhost systemd[1]: Created slice Slice /system/getty.
Nov 28 06:42:59 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 28 06:42:59 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 28 06:42:59 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 28 06:42:59 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Nov 28 06:42:59 localhost systemd[1]: Created slice User and Session Slice.
Nov 28 06:42:59 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 06:42:59 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 28 06:42:59 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 28 06:42:59 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Switch Root.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 28 06:42:59 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 28 06:42:59 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 28 06:42:59 localhost systemd[1]: Reached target Path Units.
Nov 28 06:42:59 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 28 06:42:59 localhost systemd[1]: Reached target Slice Units.
Nov 28 06:42:59 localhost systemd[1]: Reached target Swaps.
Nov 28 06:42:59 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 28 06:42:59 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 28 06:42:59 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 28 06:42:59 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 28 06:42:59 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 28 06:42:59 localhost systemd[1]: Listening on udev Control Socket.
Nov 28 06:42:59 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 28 06:42:59 localhost systemd[1]: Mounting Huge Pages File System...
Nov 28 06:42:59 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 28 06:42:59 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 28 06:42:59 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 28 06:42:59 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 06:42:59 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 28 06:42:59 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 06:42:59 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 28 06:42:59 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 28 06:42:59 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 28 06:42:59 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 28 06:42:59 localhost systemd[1]: Stopped Journal Service.
Nov 28 06:42:59 localhost systemd[1]: Starting Journal Service...
Nov 28 06:42:59 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 06:42:59 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 28 06:42:59 localhost kernel: fuse: init (API version 7.36)
Nov 28 06:42:59 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 28 06:42:59 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 28 06:42:59 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 28 06:42:59 localhost systemd-journald[618]: Journal started
Nov 28 06:42:59 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 8.0M, max 314.7M, 306.7M free.
Nov 28 06:42:59 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 28 06:42:59 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd-modules-load[619]: Module 'msr' is built in
Nov 28 06:42:59 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 28 06:42:59 localhost systemd[1]: Started Journal Service.
Nov 28 06:42:59 localhost systemd[1]: Mounted Huge Pages File System.
Nov 28 06:42:59 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 28 06:42:59 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 28 06:42:59 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 28 06:42:59 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 06:42:59 localhost kernel: ACPI: bus type drm_connector registered
Nov 28 06:42:59 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 06:42:59 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 06:43:00 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 28 06:43:00 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 28 06:43:00 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 28 06:43:00 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 28 06:43:00 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 28 06:43:00 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 06:43:00 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 28 06:43:00 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 28 06:43:00 localhost systemd[1]: Mounting FUSE Control File System...
Nov 28 06:43:00 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 28 06:43:00 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 06:43:00 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 28 06:43:00 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 28 06:43:00 localhost systemd[1]: Starting Load/Save Random Seed...
Nov 28 06:43:00 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 06:43:00 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 8.0M, max 314.7M, 306.7M free.
Nov 28 06:43:00 localhost systemd-journald[618]: Received client request to flush runtime journal.
Nov 28 06:43:00 localhost systemd[1]: Starting Create System Users...
Nov 28 06:43:00 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 28 06:43:00 localhost systemd[1]: Mounted FUSE Control File System.
Nov 28 06:43:00 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 28 06:43:00 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 28 06:43:00 localhost systemd[1]: Finished Load/Save Random Seed.
Nov 28 06:43:00 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 06:43:00 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Nov 28 06:43:00 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Nov 28 06:43:00 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Nov 28 06:43:00 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 06:43:00 localhost systemd[1]: Finished Create System Users.
Nov 28 06:43:00 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 06:43:00 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 06:43:00 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 28 06:43:00 localhost systemd[1]: Set up automount EFI System Partition Automount.
Nov 28 06:43:00 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 28 06:43:00 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 06:43:00 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 06:43:00 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 06:43:00 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 06:43:00 localhost systemd-udevd[637]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 06:43:00 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 06:43:00 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 06:43:00 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 28 06:43:00 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Nov 28 06:43:00 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Nov 28 06:43:00 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Nov 28 06:43:00 localhost systemd[1]: Mounting /boot...
Nov 28 06:43:00 localhost systemd-fsck[679]: fsck.fat 4.2 (2021-01-31)
Nov 28 06:43:00 localhost systemd-fsck[679]: /dev/vda2: 12 files, 1782/51145 clusters
Nov 28 06:43:00 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Nov 28 06:43:00 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Nov 28 06:43:00 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 28 06:43:00 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 28 06:43:00 localhost kernel: XFS (vda3): Ending clean mount
Nov 28 06:43:00 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Nov 28 06:43:00 localhost systemd[1]: Mounted /boot.
Nov 28 06:43:00 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 28 06:43:00 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 28 06:43:00 localhost kernel: Console: switching to colour dummy device 80x25
Nov 28 06:43:00 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 28 06:43:00 localhost kernel: [drm] features: -context_init
Nov 28 06:43:00 localhost kernel: [drm] number of scanouts: 1
Nov 28 06:43:00 localhost kernel: [drm] number of cap sets: 0
Nov 28 06:43:00 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Nov 28 06:43:00 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Nov 28 06:43:00 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 28 06:43:00 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 28 06:43:00 localhost kernel: SVM: TSC scaling supported
Nov 28 06:43:00 localhost kernel: kvm: Nested Virtualization enabled
Nov 28 06:43:00 localhost kernel: SVM: kvm: Nested Paging enabled
Nov 28 06:43:00 localhost kernel: SVM: LBR virtualization supported
Nov 28 06:43:00 localhost systemd[1]: Mounting /boot/efi...
Nov 28 06:43:01 localhost systemd[1]: Mounted /boot/efi.
Nov 28 06:43:01 localhost systemd[1]: Reached target Local File Systems.
Nov 28 06:43:01 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 28 06:43:01 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 28 06:43:01 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 28 06:43:01 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 06:43:01 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 28 06:43:01 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 28 06:43:01 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 06:43:01 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 702 (bootctl)
Nov 28 06:43:01 localhost systemd[1]: Starting File System Check on /dev/vda2...
Nov 28 06:43:01 localhost systemd[1]: Finished File System Check on /dev/vda2.
Nov 28 06:43:01 localhost systemd[1]: Mounting EFI System Partition Automount...
Nov 28 06:43:01 localhost systemd[1]: Mounted EFI System Partition Automount.
Nov 28 06:43:01 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 28 06:43:01 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 06:43:01 localhost systemd[1]: Starting Security Auditing Service...
Nov 28 06:43:01 localhost systemd[1]: Starting RPC Bind...
Nov 28 06:43:01 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 28 06:43:01 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 28 06:43:01 localhost auditd[719]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Nov 28 06:43:01 localhost auditd[719]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Nov 28 06:43:01 localhost systemd[1]: Started RPC Bind.
Nov 28 06:43:01 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 28 06:43:01 localhost systemd[1]: Starting Update is Completed...
Nov 28 06:43:01 localhost systemd[1]: Finished Update is Completed.
Nov 28 06:43:01 localhost augenrules[725]: /sbin/augenrules: No change
Nov 28 06:43:01 localhost augenrules[739]: No rules
Nov 28 06:43:01 localhost augenrules[739]: enabled 1
Nov 28 06:43:01 localhost augenrules[739]: failure 1
Nov 28 06:43:01 localhost augenrules[739]: pid 719
Nov 28 06:43:01 localhost augenrules[739]: rate_limit 0
Nov 28 06:43:01 localhost augenrules[739]: backlog_limit 8192
Nov 28 06:43:01 localhost augenrules[739]: lost 0
Nov 28 06:43:01 localhost augenrules[739]: backlog 0
Nov 28 06:43:01 localhost augenrules[739]: backlog_wait_time 60000
Nov 28 06:43:01 localhost augenrules[739]: backlog_wait_time_actual 0
Nov 28 06:43:01 localhost augenrules[739]: enabled 1
Nov 28 06:43:01 localhost augenrules[739]: failure 1
Nov 28 06:43:01 localhost augenrules[739]: pid 719
Nov 28 06:43:01 localhost augenrules[739]: rate_limit 0
Nov 28 06:43:01 localhost augenrules[739]: backlog_limit 8192
Nov 28 06:43:01 localhost augenrules[739]: lost 0
Nov 28 06:43:01 localhost augenrules[739]: backlog 0
Nov 28 06:43:01 localhost augenrules[739]: backlog_wait_time 60000
Nov 28 06:43:01 localhost augenrules[739]: backlog_wait_time_actual 0
Nov 28 06:43:01 localhost augenrules[739]: enabled 1
Nov 28 06:43:01 localhost augenrules[739]: failure 1
Nov 28 06:43:01 localhost augenrules[739]: pid 719
Nov 28 06:43:01 localhost augenrules[739]: rate_limit 0
Nov 28 06:43:01 localhost augenrules[739]: backlog_limit 8192
Nov 28 06:43:01 localhost augenrules[739]: lost 0
Nov 28 06:43:01 localhost augenrules[739]: backlog 0
Nov 28 06:43:01 localhost augenrules[739]: backlog_wait_time 60000
Nov 28 06:43:01 localhost augenrules[739]: backlog_wait_time_actual 0
Nov 28 06:43:01 localhost systemd[1]: Started Security Auditing Service.
Nov 28 06:43:01 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 28 06:43:01 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 28 06:43:01 localhost systemd[1]: Reached target System Initialization.
Nov 28 06:43:01 localhost systemd[1]: Started dnf makecache --timer.
Nov 28 06:43:01 localhost systemd[1]: Started Daily rotation of log files.
Nov 28 06:43:01 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 28 06:43:01 localhost systemd[1]: Reached target Timer Units.
Nov 28 06:43:01 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 06:43:01 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 28 06:43:01 localhost systemd[1]: Reached target Socket Units.
Nov 28 06:43:01 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Nov 28 06:43:01 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 28 06:43:01 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 06:43:01 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 28 06:43:01 localhost systemd[1]: Reached target Basic System.
Nov 28 06:43:01 localhost systemd[1]: Starting NTP client/server...
Nov 28 06:43:01 localhost dbus-broker-lau[750]: Ready
Nov 28 06:43:01 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 28 06:43:01 localhost systemd[1]: Started irqbalance daemon.
Nov 28 06:43:01 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 28 06:43:01 localhost systemd[1]: Starting System Logging Service...
Nov 28 06:43:01 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 06:43:01 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 06:43:01 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 06:43:01 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 28 06:43:01 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start
Nov 28 06:43:01 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Nov 28 06:43:01 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 28 06:43:01 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 28 06:43:01 localhost systemd[1]: Starting User Login Management...
Nov 28 06:43:01 localhost systemd[1]: Started System Logging Service.
Nov 28 06:43:01 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 28 06:43:01 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 06:43:01 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data
Nov 28 06:43:01 localhost chronyd[765]: Loaded seccomp filter (level 2)
Nov 28 06:43:01 localhost systemd[1]: Started NTP client/server.
Nov 28 06:43:01 localhost rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 06:43:01 localhost systemd-logind[763]: New seat seat0.
Nov 28 06:43:01 localhost systemd-logind[763]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 06:43:01 localhost systemd-logind[763]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 06:43:01 localhost systemd[1]: Started User Login Management.
Nov 28 06:43:01 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Fri, 28 Nov 2025 06:43:01 +0000. Up 5.72 seconds.
Nov 28 06:43:01 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 28 06:43:01 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 28 06:43:01 localhost systemd[1]: Starting Hostname Service...
Nov 28 06:43:01 localhost systemd[1]: Started Hostname Service.
Nov 28 06:43:01 np0005538515.novalocal systemd-hostnamed[783]: Hostname set to <np0005538515.novalocal> (static)
Nov 28 06:43:01 np0005538515.novalocal systemd[1]: run-cloud\x2dinit-tmp-tmpuu9n5t76.mount: Deactivated successfully.
Nov 28 06:43:01 np0005538515.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Nov 28 06:43:01 np0005538515.novalocal systemd[1]: Reached target Preparation for Network.
Nov 28 06:43:01 np0005538515.novalocal systemd[1]: Starting Network Manager...
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0092] NetworkManager (version 1.42.2-1.el9) is starting... (boot:a439187c-a774-4883-a00c-1a7b4e2aa22a)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0098] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Started Network Manager.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0122] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Reached target Network.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0187] manager[0x563361390020]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0233] hostname: hostname: using hostnamed
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0233] hostname: static hostname changed from (none) to "np0005538515.novalocal"
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0237] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0371] manager[0x563361390020]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0371] manager[0x563361390020]: rfkill: WWAN hardware radio set enabled
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0404] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0406] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0408] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0409] manager: Networking is enabled by state file
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0420] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0421] settings: Loaded settings plugin: keyfile (internal)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0440] dhcp: init: Using DHCP client 'internal'
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0442] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0453] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0456] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0462] device (lo): Activation: starting connection 'lo' (116e0581-bf2b-4791-a901-61d85cb9c212)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0468] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0470] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0502] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0504] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0505] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0507] device (eth0): carrier: link connected
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0509] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0513] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0517] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0521] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0522] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0525] manager: NetworkManager state is now CONNECTING
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0526] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0536] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0538] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Reached target NFS client services.
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0617] dhcp4 (eth0): state changed new lease, address=38.102.83.53
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Reached target Remote File Systems.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0623] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0653] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0870] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0873] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0881] device (lo): Activation: successful, device activated.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0890] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0893] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0903] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0908] device (eth0): Activation: successful, device activated.
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0916] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 06:43:02 np0005538515.novalocal NetworkManager[788]: <info>  [1764312182.0922] manager: startup complete
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: Cloud-init v. 22.1-9.el9 running 'init' at Fri, 28 Nov 2025 06:43:02 +0000. Up 6.54 seconds.
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |  eth0  | True |         38.102.83.53         | 255.255.255.0 | global | fa:16:3e:93:ca:2d |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |  eth0  | True | fe80::f816:3eff:fe93:ca2d/64 |       .       |  link  | fa:16:3e:93:ca:2d |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 28 06:43:02 np0005538515.novalocal cloud-init[995]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 06:43:02 np0005538515.novalocal polkitd[1033]: Started polkitd version 0.117
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Starting Authorization Manager...
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 06:43:02 np0005538515.novalocal polkitd[1033]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 06:43:02 np0005538515.novalocal polkitd[1033]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 06:43:02 np0005538515.novalocal polkitd[1033]: Finished loading, compiling and executing 4 rules
Nov 28 06:43:02 np0005538515.novalocal systemd[1]: Started Authorization Manager.
Nov 28 06:43:02 np0005538515.novalocal polkitd[1033]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 28 06:43:03 np0005538515.novalocal useradd[1114]: new group: name=cloud-user, GID=1001
Nov 28 06:43:03 np0005538515.novalocal useradd[1114]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 28 06:43:03 np0005538515.novalocal useradd[1114]: add 'cloud-user' to group 'adm'
Nov 28 06:43:03 np0005538515.novalocal useradd[1114]: add 'cloud-user' to group 'systemd-journal'
Nov 28 06:43:03 np0005538515.novalocal useradd[1114]: add 'cloud-user' to shadow group 'adm'
Nov 28 06:43:03 np0005538515.novalocal useradd[1114]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Generating public/private rsa key pair.
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: The key fingerprint is:
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: SHA256:d9rXMPbh9bnKjfp2LxEfqsw+3ON+ysSYxfJ3/uvflfs root@np0005538515.novalocal
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: The key's randomart image is:
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: +---[RSA 3072]----+
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |                 |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |                 |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |                 |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |            . .. |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |        S ...o=+o|
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |         . +Boo=B|
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |          =o++.**|
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |           *+==+*|
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |          .oBXBBE|
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: +----[SHA256]-----+
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Generating public/private ecdsa key pair.
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: The key fingerprint is:
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: SHA256:U4ZOXeHErZ4n6bWc0xDI5MESnYKy7XJodqSqxRRSX5w root@np0005538515.novalocal
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: The key's randomart image is:
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: +---[ECDSA 256]---+
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |    .  ..o.=++   |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |   . ...Eoo+B .  |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |  . . .+o +*.+   |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |   . ..ooo  = .  |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |    .  =S  . o . |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |   o  * +.  = +  |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |    o+ +   . = = |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |   ..       . = .|
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |  ..           . |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: +----[SHA256]-----+
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Generating public/private ed25519 key pair.
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: The key fingerprint is:
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: SHA256:REHx2gIqRu/twAcPou5ct8uLzF9aMkAibgMpUuh9OgI root@np0005538515.novalocal
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: The key's randomart image is:
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: +--[ED25519 256]--+
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: | ..    .=o       |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |.o     . .       |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |B.o.  . . .      |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |Eooo o o o       |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |.++.B   S .      |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |.+.B.=   .       |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |. ..=++o         |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |o + ++B          |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: |.+ +.B+          |
Nov 28 06:43:04 np0005538515.novalocal cloud-init[995]: +----[SHA256]-----+
Nov 28 06:43:04 np0005538515.novalocal sm-notify[1127]: Version 2.5.4 starting
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Nov 28 06:43:04 np0005538515.novalocal sshd[1128]: Server listening on 0.0.0.0 port 22.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 28 06:43:04 np0005538515.novalocal sshd[1128]: Server listening on :: port 22.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Reached target Network is Online.
Nov 28 06:43:04 np0005538515.novalocal crond[1137]: (CRON) STARTUP (1.5.7)
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Nov 28 06:43:04 np0005538515.novalocal crond[1137]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 28 06:43:04 np0005538515.novalocal sshd[1128]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Nov 28 06:43:04 np0005538515.novalocal crond[1137]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 25% if used.)
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 28 06:43:04 np0005538515.novalocal crond[1137]: (CRON) INFO (running with inotify support)
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting Permit User Sessions...
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Finished Permit User Sessions.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Started Command Scheduler.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Started Getty on tty1.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Reached target Login Prompts.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Reached target Multi-User System.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 28 06:43:04 np0005538515.novalocal kdumpctl[1131]: kdump: No kdump initial ramdisk found.
Nov 28 06:43:04 np0005538515.novalocal kdumpctl[1131]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Nov 28 06:43:04 np0005538515.novalocal cloud-init[1247]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Fri, 28 Nov 2025 06:43:04 +0000. Up 9.00 seconds.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Nov 28 06:43:04 np0005538515.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Nov 28 06:43:05 np0005538515.novalocal sshd[1327]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1327]: Connection reset by 38.102.83.114 port 57040 [preauth]
Nov 28 06:43:05 np0005538515.novalocal sshd[1348]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1348]: Unable to negotiate with 38.102.83.114 port 57056: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 28 06:43:05 np0005538515.novalocal sshd[1365]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1376]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1376]: Unable to negotiate with 38.102.83.114 port 57068: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 28 06:43:05 np0005538515.novalocal sshd[1390]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1390]: Unable to negotiate with 38.102.83.114 port 57084: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 28 06:43:05 np0005538515.novalocal sshd[1399]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1421]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1421]: Connection reset by 38.102.83.114 port 57108 [preauth]
Nov 28 06:43:05 np0005538515.novalocal dracut[1426]: dracut-057-21.git20230214.el9
Nov 28 06:43:05 np0005538515.novalocal sshd[1365]: Connection closed by 38.102.83.114 port 57058 [preauth]
Nov 28 06:43:05 np0005538515.novalocal sshd[1428]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1428]: fatal: mm_answer_sign: sign: error in libcrypto
Nov 28 06:43:05 np0005538515.novalocal sshd[1444]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:05 np0005538515.novalocal sshd[1444]: Unable to negotiate with 38.102.83.114 port 57130: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1448]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Fri, 28 Nov 2025 06:43:05 +0000. Up 9.37 seconds.
Nov 28 06:43:05 np0005538515.novalocal sshd[1399]: Connection closed by 38.102.83.114 port 57098 [preauth]
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1451]: #############################################################
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1453]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1457]: 256 SHA256:U4ZOXeHErZ4n6bWc0xDI5MESnYKy7XJodqSqxRRSX5w root@np0005538515.novalocal (ECDSA)
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1465]: 256 SHA256:REHx2gIqRu/twAcPou5ct8uLzF9aMkAibgMpUuh9OgI root@np0005538515.novalocal (ED25519)
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1473]: 3072 SHA256:d9rXMPbh9bnKjfp2LxEfqsw+3ON+ysSYxfJ3/uvflfs root@np0005538515.novalocal (RSA)
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1475]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1477]: #############################################################
Nov 28 06:43:05 np0005538515.novalocal cloud-init[1448]: Cloud-init v. 22.1-9.el9 finished at Fri, 28 Nov 2025 06:43:05 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.61 seconds
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 06:43:05 np0005538515.novalocal systemd[1]: Reloading Network Manager...
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 06:43:05 np0005538515.novalocal NetworkManager[788]: <info>  [1764312185.5332] audit: op="reload" arg="0" pid=1595 uid=0 result="success"
Nov 28 06:43:05 np0005538515.novalocal NetworkManager[788]: <info>  [1764312185.5343] config: signal: SIGHUP (no changes from disk)
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 06:43:05 np0005538515.novalocal systemd[1]: Reloaded Network Manager.
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 06:43:05 np0005538515.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Nov 28 06:43:05 np0005538515.novalocal systemd[1]: Reached target Cloud-init target.
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 28 06:43:05 np0005538515.novalocal dracut[1430]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: memstrack is not available
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: memstrack is not available
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: *** Including module: systemd ***
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: *** Including module: systemd-initrd ***
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: *** Including module: i18n ***
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: No KEYMAP configured.
Nov 28 06:43:06 np0005538515.novalocal dracut[1430]: *** Including module: drm ***
Nov 28 06:43:07 np0005538515.novalocal chronyd[765]: Selected source 206.108.0.131 (2.rhel.pool.ntp.org)
Nov 28 06:43:07 np0005538515.novalocal chronyd[765]: System clock TAI offset set to 37 seconds
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]: *** Including module: prefixdevname ***
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]: *** Including module: kernel-modules ***
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]: *** Including module: kernel-modules-extra ***
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 28 06:43:07 np0005538515.novalocal dracut[1430]: *** Including module: qemu ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: fstab-sys ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: rootfs-block ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: terminfo ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: udev-rules ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: Skipping udev rule: 91-permissions.rules
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: virtiofs ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: dracut-systemd ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: usrmount ***
Nov 28 06:43:08 np0005538515.novalocal dracut[1430]: *** Including module: base ***
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]: *** Including module: fs-lib ***
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]: *** Including module: kdumpbase ***
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:   microcode_ctl module: mangling fw_dir
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]: *** Including module: shutdown ***
Nov 28 06:43:09 np0005538515.novalocal dracut[1430]: *** Including module: squash ***
Nov 28 06:43:10 np0005538515.novalocal dracut[1430]: *** Including modules done ***
Nov 28 06:43:10 np0005538515.novalocal dracut[1430]: *** Installing kernel module dependencies ***
Nov 28 06:43:10 np0005538515.novalocal dracut[1430]: *** Installing kernel module dependencies done ***
Nov 28 06:43:10 np0005538515.novalocal dracut[1430]: *** Resolving executable dependencies ***
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: *** Resolving executable dependencies done ***
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: *** Hardlinking files ***
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: Mode:           real
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: Files:          1099
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: Linked:         3 files
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: Compared:       0 xattrs
Nov 28 06:43:11 np0005538515.novalocal dracut[1430]: Compared:       373 files
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: Saved:          61.04 KiB
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: Duration:       0.018742 seconds
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: *** Hardlinking files done ***
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: Could not find 'strip'. Not stripping the initramfs.
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: *** Generating early-microcode cpio image ***
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: *** Constructing AuthenticAMD.bin ***
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: *** Store current command line parameters ***
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: Stored kernel commandline:
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: No dracut internal kernel commandline stored in the initramfs
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: *** Install squash loader ***
Nov 28 06:43:12 np0005538515.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 06:43:12 np0005538515.novalocal dracut[1430]: *** Squashing the files inside the initramfs ***
Nov 28 06:43:13 np0005538515.novalocal dracut[1430]: *** Squashing the files inside the initramfs done ***
Nov 28 06:43:13 np0005538515.novalocal dracut[1430]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Nov 28 06:43:14 np0005538515.novalocal dracut[1430]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Nov 28 06:43:14 np0005538515.novalocal kdumpctl[1131]: kdump: kexec: loaded kdump kernel
Nov 28 06:43:14 np0005538515.novalocal kdumpctl[1131]: kdump: Starting kdump: [OK]
Nov 28 06:43:14 np0005538515.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 28 06:43:14 np0005538515.novalocal systemd[1]: Startup finished in 1.525s (kernel) + 2.012s (initrd) + 15.244s (userspace) = 18.782s.
Nov 28 06:43:32 np0005538515.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 06:43:40 np0005538515.novalocal sshd[4170]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:43:40 np0005538515.novalocal sshd[4170]: Accepted publickey for zuul from 38.102.83.114 port 46744 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 28 06:43:40 np0005538515.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 28 06:43:40 np0005538515.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 28 06:43:40 np0005538515.novalocal systemd-logind[763]: New session 1 of user zuul.
Nov 28 06:43:41 np0005538515.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 28 06:43:41 np0005538515.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Queued start job for default target Main User Target.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Created slice User Application Slice.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Reached target Paths.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Reached target Timers.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Starting D-Bus User Message Bus Socket...
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Starting Create User's Volatile Files and Directories...
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Finished Create User's Volatile Files and Directories.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Listening on D-Bus User Message Bus Socket.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Reached target Sockets.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Reached target Basic System.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Reached target Main User Target.
Nov 28 06:43:41 np0005538515.novalocal systemd[4174]: Startup finished in 113ms.
Nov 28 06:43:41 np0005538515.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 28 06:43:41 np0005538515.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 28 06:43:41 np0005538515.novalocal sshd[4170]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:43:41 np0005538515.novalocal python3[4227]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:43:51 np0005538515.novalocal python3[4245]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:43:56 np0005538515.novalocal python3[4298]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:43:58 np0005538515.novalocal python3[4328]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 28 06:44:01 np0005538515.novalocal python3[4344]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:01 np0005538515.novalocal python3[4358]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:03 np0005538515.novalocal python3[4417]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:03 np0005538515.novalocal python3[4458]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312242.7710705-395-6155547921769/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa follow=False checksum=47f6a2f8fa426c1f34aad346f88073a22928af4e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:04 np0005538515.novalocal python3[4531]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:05 np0005538515.novalocal python3[4572]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312244.4699037-496-47380771635991/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa.pub follow=False checksum=d1f12d852c72cfefab089d88337552962cfbc93d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:06 np0005538515.novalocal python3[4600]: ansible-ping Invoked with data=pong
Nov 28 06:44:09 np0005538515.novalocal python3[4614]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 06:44:13 np0005538515.novalocal python3[4667]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 28 06:44:13 np0005538515.novalocal chronyd[765]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Nov 28 06:44:15 np0005538515.novalocal python3[4689]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:16 np0005538515.novalocal python3[4703]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:16 np0005538515.novalocal python3[4717]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:17 np0005538515.novalocal python3[4731]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:17 np0005538515.novalocal python3[4745]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:18 np0005538515.novalocal python3[4759]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:20 np0005538515.novalocal sudo[4773]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyrixhkrhklipkydimyafntsanyidkjc ; /usr/bin/python3
Nov 28 06:44:20 np0005538515.novalocal sudo[4773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:20 np0005538515.novalocal python3[4775]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:20 np0005538515.novalocal sudo[4773]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:22 np0005538515.novalocal sudo[4822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sccsifaofzpbxfnbjudzrsrtqxywxzzr ; /usr/bin/python3
Nov 28 06:44:22 np0005538515.novalocal sudo[4822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:22 np0005538515.novalocal python3[4824]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:22 np0005538515.novalocal sudo[4822]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:22 np0005538515.novalocal sudo[4865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hleshrzbnbjdpdujczfzirzztgellicu ; /usr/bin/python3
Nov 28 06:44:22 np0005538515.novalocal sudo[4865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:22 np0005538515.novalocal python3[4867]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312262.2043922-106-134507180075167/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:22 np0005538515.novalocal sudo[4865]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:31 np0005538515.novalocal python3[4895]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538515.novalocal python3[4909]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538515.novalocal python3[4923]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:31 np0005538515.novalocal python3[4937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538515.novalocal python3[4951]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538515.novalocal python3[4965]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538515.novalocal python3[4979]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:32 np0005538515.novalocal python3[4993]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538515.novalocal python3[5007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538515.novalocal python3[5021]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538515.novalocal python3[5035]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:33 np0005538515.novalocal python3[5049]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538515.novalocal python3[5063]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538515.novalocal python3[5077]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538515.novalocal python3[5091]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:34 np0005538515.novalocal python3[5105]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538515.novalocal python3[5119]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538515.novalocal python3[5133]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538515.novalocal python3[5147]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:35 np0005538515.novalocal python3[5161]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538515.novalocal python3[5175]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538515.novalocal python3[5189]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538515.novalocal python3[5203]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:36 np0005538515.novalocal python3[5217]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:37 np0005538515.novalocal python3[5231]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:37 np0005538515.novalocal python3[5245]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 06:44:38 np0005538515.novalocal sudo[5259]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiyhcebufbtczewbmpemmkbxcgnuxmqb ; /usr/bin/python3
Nov 28 06:44:38 np0005538515.novalocal sudo[5259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:38 np0005538515.novalocal python3[5261]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 06:44:38 np0005538515.novalocal systemd[1]: Starting Time & Date Service...
Nov 28 06:44:39 np0005538515.novalocal systemd[1]: Started Time & Date Service.
Nov 28 06:44:39 np0005538515.novalocal systemd-timedated[5263]: Changed time zone to 'UTC' (UTC).
Nov 28 06:44:39 np0005538515.novalocal sudo[5259]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:39 np0005538515.novalocal sudo[5280]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlsbgpboxfuxvlhyeqvuhtrbsiwdqglj ; /usr/bin/python3
Nov 28 06:44:39 np0005538515.novalocal sudo[5280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:39 np0005538515.novalocal python3[5282]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:39 np0005538515.novalocal sudo[5280]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:40 np0005538515.novalocal python3[5328]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:41 np0005538515.novalocal python3[5369]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764312280.7429726-497-121646648876435/source _original_basename=tmplqmmkecs follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:42 np0005538515.novalocal python3[5429]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:42 np0005538515.novalocal python3[5470]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764312282.2725708-588-225791012218072/source _original_basename=tmp_40x8ufp follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:44 np0005538515.novalocal sudo[5530]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spnpjjhqzovxuvyypjkvueaqwcrejtmn ; /usr/bin/python3
Nov 28 06:44:44 np0005538515.novalocal sudo[5530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:44 np0005538515.novalocal python3[5532]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:44 np0005538515.novalocal sudo[5530]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:44 np0005538515.novalocal sudo[5573]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mikqxucxkfooegwidsrjwmhtgxbkupzk ; /usr/bin/python3
Nov 28 06:44:44 np0005538515.novalocal sudo[5573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:45 np0005538515.novalocal python3[5575]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764312284.38717-732-123433027138430/source _original_basename=tmpteo97krb follow=False checksum=1cc2ea2b76967ada2d4710a35e138c3751da2100 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:45 np0005538515.novalocal sudo[5573]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:46 np0005538515.novalocal python3[5603]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:44:46 np0005538515.novalocal python3[5619]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:44:47 np0005538515.novalocal sudo[5667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwfdgycpstmckknlfwlexkfccvyqyelg ; /usr/bin/python3
Nov 28 06:44:47 np0005538515.novalocal sudo[5667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:47 np0005538515.novalocal python3[5669]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:44:47 np0005538515.novalocal sudo[5667]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:47 np0005538515.novalocal sudo[5710]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vimetfctpcldmjvttkhnjiestcmtfzrv ; /usr/bin/python3
Nov 28 06:44:47 np0005538515.novalocal sudo[5710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:47 np0005538515.novalocal python3[5712]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312287.4328728-857-10663037658759/source _original_basename=tmpo3n9mue_ follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:44:48 np0005538515.novalocal sudo[5710]: pam_unix(sudo:session): session closed for user root
Nov 28 06:44:59 np0005538515.novalocal sudo[5741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eseeltoyeoktxiwctklsqkfzerhucdol ; /usr/bin/python3
Nov 28 06:44:59 np0005538515.novalocal sudo[5741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:44:59 np0005538515.novalocal python3[5743]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-161e-20ee-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:44:59 np0005538515.novalocal sudo[5741]: pam_unix(sudo:session): session closed for user root
Nov 28 06:45:09 np0005538515.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 06:45:10 np0005538515.novalocal python3[5765]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-161e-20ee-000000000024-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 28 06:45:12 np0005538515.novalocal python3[5783]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:45:31 np0005538515.novalocal sudo[5797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbwvyrmyurobcywgbsidzpoibfhqthqa ; /usr/bin/python3
Nov 28 06:45:31 np0005538515.novalocal sudo[5797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:45:31 np0005538515.novalocal python3[5799]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:45:31 np0005538515.novalocal sudo[5797]: pam_unix(sudo:session): session closed for user root
Nov 28 06:45:53 np0005538515.novalocal systemd[4174]: Starting Mark boot as successful...
Nov 28 06:45:53 np0005538515.novalocal systemd[4174]: Finished Mark boot as successful.
Nov 28 06:46:31 np0005538515.novalocal sshd[4184]: Received disconnect from 38.102.83.114 port 46744:11: disconnected by user
Nov 28 06:46:31 np0005538515.novalocal sshd[4184]: Disconnected from user zuul 38.102.83.114 port 46744
Nov 28 06:46:31 np0005538515.novalocal sshd[4170]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:46:31 np0005538515.novalocal systemd-logind[763]: Session 1 logged out. Waiting for processes to exit.
Nov 28 06:46:47 np0005538515.novalocal sshd[5803]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:46:48 np0005538515.novalocal sshd[5803]: Invalid user support from 78.128.112.74 port 44324
Nov 28 06:46:48 np0005538515.novalocal sshd[5803]: Connection closed by invalid user support 78.128.112.74 port 44324 [preauth]
Nov 28 06:47:02 np0005538515.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Nov 28 06:47:02 np0005538515.novalocal systemd[1]: efi.mount: Deactivated successfully.
Nov 28 06:47:02 np0005538515.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Nov 28 06:48:53 np0005538515.novalocal systemd[4174]: Created slice User Background Tasks Slice.
Nov 28 06:48:53 np0005538515.novalocal systemd[4174]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 06:48:53 np0005538515.novalocal systemd[4174]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Nov 28 06:49:48 np0005538515.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Nov 28 06:49:48 np0005538515.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2147] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 06:49:48 np0005538515.novalocal systemd-udevd[5809]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2288] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2309] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2312] device (eth1): carrier: link connected
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2314] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2318] policy: auto-activating connection 'Wired connection 1' (3f6b1ecd-3b33-3888-bbb5-7c383df6ee7e)
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2322] device (eth1): Activation: starting connection 'Wired connection 1' (3f6b1ecd-3b33-3888-bbb5-7c383df6ee7e)
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2323] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2326] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2330] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 28 06:49:48 np0005538515.novalocal NetworkManager[788]: <info>  [1764312588.2332] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:49:48 np0005538515.novalocal sshd[5813]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:49:48 np0005538515.novalocal sshd[5813]: Accepted publickey for zuul from 38.102.83.114 port 41852 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 06:49:48 np0005538515.novalocal systemd-logind[763]: New session 3 of user zuul.
Nov 28 06:49:48 np0005538515.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 28 06:49:48 np0005538515.novalocal sshd[5813]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:49:49 np0005538515.novalocal python3[5830]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-b1a9-fc65-000000000475-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:49:49 np0005538515.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Nov 28 06:50:02 np0005538515.novalocal sudo[5878]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uywnwvgownqrabbynkbqsugbcjsiqnti ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:50:02 np0005538515.novalocal sudo[5878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:50:02 np0005538515.novalocal python3[5880]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:50:02 np0005538515.novalocal sudo[5878]: pam_unix(sudo:session): session closed for user root
Nov 28 06:50:02 np0005538515.novalocal sudo[5921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqxjoktvbzvtoidpfaaawqpaskkoeykk ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:50:02 np0005538515.novalocal sudo[5921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:50:02 np0005538515.novalocal python3[5923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312602.1217651-537-163600387487933/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=45f8b686f86847d91097e2a4a4bdd4c78853fa25 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:50:02 np0005538515.novalocal sudo[5921]: pam_unix(sudo:session): session closed for user root
Nov 28 06:50:03 np0005538515.novalocal sudo[5951]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjibtnxxyivcjztvvdffxygjyoarmlaz ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:50:03 np0005538515.novalocal sudo[5951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:50:03 np0005538515.novalocal python3[5953]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Stopping Network Manager...
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4550] caught SIGTERM, shutting down normally.
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4681] dhcp4 (eth0): canceled DHCP transaction
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4682] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4682] dhcp4 (eth0): state changed no lease
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4686] manager: NetworkManager state is now CONNECTING
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4772] dhcp4 (eth1): canceled DHCP transaction
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4772] dhcp4 (eth1): state changed no lease
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[788]: <info>  [1764312603.4842] exiting (success)
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Stopped Network Manager.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: NetworkManager.service: Consumed 2.630s CPU time.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Starting Network Manager...
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.5417] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:a439187c-a774-4883-a00c-1a7b4e2aa22a)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.5420] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.5446] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Started Network Manager.
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.5495] manager[0x561c475d8090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Starting Hostname Service...
Nov 28 06:50:03 np0005538515.novalocal sudo[5951]: pam_unix(sudo:session): session closed for user root
Nov 28 06:50:03 np0005538515.novalocal systemd[1]: Started Hostname Service.
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6316] hostname: hostname: using hostnamed
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6316] hostname: static hostname changed from (none) to "np0005538515.novalocal"
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6325] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6333] manager[0x561c475d8090]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6334] manager[0x561c475d8090]: rfkill: WWAN hardware radio set enabled
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6385] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6387] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6389] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6389] manager: Networking is enabled by state file
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6400] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6401] settings: Loaded settings plugin: keyfile (internal)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6467] dhcp: init: Using DHCP client 'internal'
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6471] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6490] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6508] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6528] device (lo): Activation: starting connection 'lo' (116e0581-bf2b-4791-a901-61d85cb9c212)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6537] device (eth0): carrier: link connected
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6543] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6549] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6549] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6556] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6565] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6572] device (eth1): carrier: link connected
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6578] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6585] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (3f6b1ecd-3b33-3888-bbb5-7c383df6ee7e) (indicated)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6586] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6593] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6603] device (eth1): Activation: starting connection 'Wired connection 1' (3f6b1ecd-3b33-3888-bbb5-7c383df6ee7e)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6629] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6634] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6638] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6640] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6647] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6650] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6654] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6658] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6682] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6689] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6701] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6705] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6748] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6756] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6765] device (lo): Activation: successful, device activated.
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6775] dhcp4 (eth0): state changed new lease, address=38.102.83.53
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6780] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6902] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6948] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6951] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6956] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6960] device (eth0): Activation: successful, device activated.
Nov 28 06:50:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312603.6967] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 06:50:03 np0005538515.novalocal python3[6026]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-b1a9-fc65-000000000136-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:50:13 np0005538515.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 06:50:33 np0005538515.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 06:50:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312648.7792] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:48 np0005538515.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 06:50:48 np0005538515.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 06:50:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312648.8026] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312648.8033] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 28 06:50:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312648.8052] device (eth1): Activation: successful, device activated.
Nov 28 06:50:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764312648.8063] manager: startup complete
Nov 28 06:50:48 np0005538515.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 28 06:50:58 np0005538515.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 06:51:03 np0005538515.novalocal sshd[5816]: Received disconnect from 38.102.83.114 port 41852:11: disconnected by user
Nov 28 06:51:03 np0005538515.novalocal sshd[5816]: Disconnected from user zuul 38.102.83.114 port 41852
Nov 28 06:51:03 np0005538515.novalocal sshd[5813]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:51:03 np0005538515.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 28 06:51:03 np0005538515.novalocal systemd[1]: session-3.scope: Consumed 1.515s CPU time.
Nov 28 06:51:03 np0005538515.novalocal systemd-logind[763]: Session 3 logged out. Waiting for processes to exit.
Nov 28 06:51:04 np0005538515.novalocal systemd-logind[763]: Removed session 3.
Nov 28 06:51:25 np0005538515.novalocal sshd[6053]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:51:25 np0005538515.novalocal sshd[6053]: Accepted publickey for zuul from 38.102.83.114 port 35072 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 06:51:25 np0005538515.novalocal systemd-logind[763]: New session 4 of user zuul.
Nov 28 06:51:25 np0005538515.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 28 06:51:25 np0005538515.novalocal sshd[6053]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:51:25 np0005538515.novalocal sudo[6102]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tultzfhxkdjdcfypblvaflbnhcpcdcyi ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:51:25 np0005538515.novalocal sudo[6102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:51:25 np0005538515.novalocal python3[6104]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:51:25 np0005538515.novalocal sudo[6102]: pam_unix(sudo:session): session closed for user root
Nov 28 06:51:25 np0005538515.novalocal sudo[6145]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjdzpyivapraehfxvmxjspjacnfrponc ; OS_CLOUD=vexxhost /usr/bin/python3
Nov 28 06:51:25 np0005538515.novalocal sudo[6145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:51:25 np0005538515.novalocal python3[6147]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312685.3060608-628-204986376716689/source _original_basename=tmpq83qr24o follow=False checksum=10225105ecbcb8380becb3ed8e03293c5f034347 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:51:25 np0005538515.novalocal sudo[6145]: pam_unix(sudo:session): session closed for user root
Nov 28 06:51:28 np0005538515.novalocal sshd[6053]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:51:28 np0005538515.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 28 06:51:28 np0005538515.novalocal systemd-logind[763]: Session 4 logged out. Waiting for processes to exit.
Nov 28 06:51:28 np0005538515.novalocal systemd-logind[763]: Removed session 4.
Nov 28 06:54:41 np0005538515.novalocal sshd[6163]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:54:41 np0005538515.novalocal sshd[6163]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 06:54:41 np0005538515.novalocal sshd[6163]: Connection closed by 134.209.154.70 port 43818
Nov 28 06:56:53 np0005538515.novalocal systemd[1]: Starting dnf makecache...
Nov 28 06:56:54 np0005538515.novalocal dnf[6164]: Failed determining last makecache time.
Nov 28 06:56:54 np0005538515.novalocal dnf[6164]: There are no enabled repositories in "/etc/yum.repos.d", "/etc/yum/repos.d", "/etc/distro.repos.d".
Nov 28 06:56:54 np0005538515.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 06:56:54 np0005538515.novalocal systemd[1]: Finished dnf makecache.
Nov 28 06:58:43 np0005538515.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Nov 28 06:58:43 np0005538515.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 28 06:58:43 np0005538515.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Nov 28 06:58:43 np0005538515.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 28 06:58:59 np0005538515.novalocal sshd[6169]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 06:58:59 np0005538515.novalocal sshd[6169]: Accepted publickey for zuul from 38.102.83.114 port 51054 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 06:58:59 np0005538515.novalocal systemd-logind[763]: New session 5 of user zuul.
Nov 28 06:58:59 np0005538515.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 28 06:58:59 np0005538515.novalocal sshd[6169]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 06:58:59 np0005538515.novalocal sudo[6186]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhrhhfjhuyclrlrfiitfqgezfzwaqpzt ; /usr/bin/python3
Nov 28 06:58:59 np0005538515.novalocal sudo[6186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:58:59 np0005538515.novalocal python3[6188]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-0218-9e4d-000000001d10-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:58:59 np0005538515.novalocal sudo[6186]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:00 np0005538515.novalocal sudo[6205]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxwbdfijoftcwjxaywintutznitjlrvp ; /usr/bin/python3
Nov 28 06:59:00 np0005538515.novalocal sudo[6205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538515.novalocal python3[6207]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538515.novalocal sudo[6205]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:01 np0005538515.novalocal sudo[6221]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjujueoulcuunisnjumnlzhgecxznhws ; /usr/bin/python3
Nov 28 06:59:01 np0005538515.novalocal sudo[6221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538515.novalocal python3[6223]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538515.novalocal sudo[6221]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:01 np0005538515.novalocal sudo[6237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhjvniaexoekruorhxttunpvvficeuyd ; /usr/bin/python3
Nov 28 06:59:01 np0005538515.novalocal sudo[6237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538515.novalocal python3[6239]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538515.novalocal sudo[6237]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:01 np0005538515.novalocal sudo[6253]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnsmylexnyijcbuiqcjxbbifswpawkda ; /usr/bin/python3
Nov 28 06:59:01 np0005538515.novalocal sudo[6253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:01 np0005538515.novalocal python3[6255]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:01 np0005538515.novalocal sudo[6253]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:02 np0005538515.novalocal sudo[6269]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbimnrcqguiygbvkbcfotleozzgzeygs ; /usr/bin/python3
Nov 28 06:59:02 np0005538515.novalocal sudo[6269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:02 np0005538515.novalocal python3[6271]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:02 np0005538515.novalocal sudo[6269]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:03 np0005538515.novalocal sudo[6317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szkwmgwduwqmvsgajwiehymtlgxfodkg ; /usr/bin/python3
Nov 28 06:59:03 np0005538515.novalocal sudo[6317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:03 np0005538515.novalocal python3[6319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 06:59:03 np0005538515.novalocal sudo[6317]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:04 np0005538515.novalocal sudo[6360]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mylclmigrvmlasxopdeavitegbzflfth ; /usr/bin/python3
Nov 28 06:59:04 np0005538515.novalocal sudo[6360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:04 np0005538515.novalocal python3[6362]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764313143.556926-652-35063446022834/source _original_basename=tmpj7q725jz follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 06:59:04 np0005538515.novalocal sudo[6360]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:05 np0005538515.novalocal sudo[6390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zahnlcycxzxsudnmvlrpvyuhnzdjaunj ; /usr/bin/python3
Nov 28 06:59:05 np0005538515.novalocal sudo[6390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:05 np0005538515.novalocal python3[6392]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 06:59:05 np0005538515.novalocal systemd[1]: Reloading.
Nov 28 06:59:05 np0005538515.novalocal systemd-rc-local-generator[6410]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 06:59:05 np0005538515.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 06:59:05 np0005538515.novalocal sudo[6390]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:07 np0005538515.novalocal sudo[6437]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfqzyusortsgczxknqkwzspdxbepgnzq ; /usr/bin/python3
Nov 28 06:59:07 np0005538515.novalocal sudo[6437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:07 np0005538515.novalocal python3[6439]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 28 06:59:07 np0005538515.novalocal sudo[6437]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:08 np0005538515.novalocal sudo[6453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dghswapufhdmofbhbwuqjwophplfqawz ; /usr/bin/python3
Nov 28 06:59:08 np0005538515.novalocal sudo[6453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:08 np0005538515.novalocal python3[6455]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:08 np0005538515.novalocal sudo[6453]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:08 np0005538515.novalocal sudo[6471]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgwqrntistyiuhhtdqagglmzocwtzyeh ; /usr/bin/python3
Nov 28 06:59:08 np0005538515.novalocal sudo[6471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:09 np0005538515.novalocal python3[6473]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:09 np0005538515.novalocal sudo[6471]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:09 np0005538515.novalocal sudo[6489]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmwzinmsotenfqdpreqogdvceestaoev ; /usr/bin/python3
Nov 28 06:59:09 np0005538515.novalocal sudo[6489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:09 np0005538515.novalocal python3[6491]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:09 np0005538515.novalocal sudo[6489]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:09 np0005538515.novalocal sudo[6507]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkvpwodmpavnlcpznozltsstqkvpkdew ; /usr/bin/python3
Nov 28 06:59:09 np0005538515.novalocal sudo[6507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 06:59:09 np0005538515.novalocal python3[6509]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:09 np0005538515.novalocal sudo[6507]: pam_unix(sudo:session): session closed for user root
Nov 28 06:59:20 np0005538515.novalocal python3[6526]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-0218-9e4d-000000001d17-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 06:59:21 np0005538515.novalocal python3[6546]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 06:59:24 np0005538515.novalocal sshd[6169]: pam_unix(sshd:session): session closed for user zuul
Nov 28 06:59:24 np0005538515.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 28 06:59:24 np0005538515.novalocal systemd[1]: session-5.scope: Consumed 3.997s CPU time.
Nov 28 06:59:24 np0005538515.novalocal systemd-logind[763]: Session 5 logged out. Waiting for processes to exit.
Nov 28 06:59:24 np0005538515.novalocal systemd-logind[763]: Removed session 5.
Nov 28 07:00:50 np0005538515.novalocal sshd[6552]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:00:50 np0005538515.novalocal sshd[6552]: Accepted publickey for zuul from 38.102.83.114 port 57236 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:00:50 np0005538515.novalocal systemd-logind[763]: New session 6 of user zuul.
Nov 28 07:00:50 np0005538515.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 28 07:00:50 np0005538515.novalocal sshd[6552]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:00:50 np0005538515.novalocal sudo[6569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utbsnpjgkholxohatdutzmydecbcjudh ; /usr/bin/python3
Nov 28 07:00:50 np0005538515.novalocal sudo[6569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:00:51 np0005538515.novalocal systemd[1]: Starting RHSM dbus service...
Nov 28 07:00:51 np0005538515.novalocal systemd[1]: Started RHSM dbus service.
Nov 28 07:00:51 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:51 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:51 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:51 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005538515.novalocal (c20224ed-ba86-41a6-a487-b9546587a93c)
Nov 28 07:00:55 np0005538515.novalocal subscription-manager[6576]: Registered system with identity: c20224ed-ba86-41a6-a487-b9546587a93c
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.entcertlib:131] certs updated:
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]: Total updates: 1
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]: Found (local) serial# []
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]: Expected (UEP) serial# [7824755758168854409]
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]: Added (new)
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]:   [sn:7824755758168854409 ( Content Access,) @ /etc/pki/entitlement/7824755758168854409.pem]
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]: Deleted (rogue):
Nov 28 07:00:55 np0005538515.novalocal rhsm-service[6576]:   <NONE>
Nov 28 07:00:55 np0005538515.novalocal subscription-manager[6576]: Added subscription for 'Content Access' contract 'None'
Nov 28 07:00:55 np0005538515.novalocal subscription-manager[6576]: Added subscription for product ' Content Access'
Nov 28 07:00:57 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:57 np0005538515.novalocal rhsm-service[6576]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 28 07:00:57 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:57 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:57 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:57 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:58 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:00:58 np0005538515.novalocal sudo[6569]: pam_unix(sudo:session): session closed for user root
Nov 28 07:00:59 np0005538515.novalocal python3[6667]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-cf29-7b10-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:01:01 np0005538515.novalocal CROND[6672]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 07:01:01 np0005538515.novalocal run-parts[6675]: (/etc/cron.hourly) starting 0anacron
Nov 28 07:01:01 np0005538515.novalocal anacron[6683]: Anacron started on 2025-11-28
Nov 28 07:01:01 np0005538515.novalocal anacron[6683]: Will run job `cron.daily' in 5 min.
Nov 28 07:01:01 np0005538515.novalocal anacron[6683]: Will run job `cron.weekly' in 25 min.
Nov 28 07:01:01 np0005538515.novalocal anacron[6683]: Will run job `cron.monthly' in 45 min.
Nov 28 07:01:01 np0005538515.novalocal anacron[6683]: Jobs will be executed sequentially
Nov 28 07:01:01 np0005538515.novalocal run-parts[6685]: (/etc/cron.hourly) finished 0anacron
Nov 28 07:01:01 np0005538515.novalocal CROND[6671]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 07:01:52 np0005538515.novalocal sudo[6699]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkcvfetfatganonbobrdauaoufsnglqn ; /usr/bin/python3
Nov 28 07:01:52 np0005538515.novalocal sudo[6699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:01:52 np0005538515.novalocal python3[6701]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:02:22 np0005538515.novalocal setsebool[6776]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 28 07:02:22 np0005538515.novalocal setsebool[6776]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  Converting 410 SID table entries...
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:02:33 np0005538515.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:02:45 np0005538515.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Nov 28 07:02:45 np0005538515.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:02:45 np0005538515.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:02:45 np0005538515.novalocal systemd[1]: Reloading.
Nov 28 07:02:45 np0005538515.novalocal systemd-rc-local-generator[7646]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:02:45 np0005538515.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:02:45 np0005538515.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:02:47 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:02:47 np0005538515.novalocal sudo[6699]: pam_unix(sudo:session): session closed for user root
Nov 28 07:02:47 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:02:49 np0005538515.novalocal sudo[12845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlsnpqldpcgriofrdhmclpafdtxagzuk ; /usr/bin/python3
Nov 28 07:02:49 np0005538515.novalocal sudo[12845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:02:49 np0005538515.novalocal systemd[1]: var-lib-containers-storage-overlay-compat2983557930-merged.mount: Deactivated successfully.
Nov 28 07:02:49 np0005538515.novalocal podman[13101]: 2025-11-28 07:02:49.621734953 +0000 UTC m=+0.090962412 system refresh
Nov 28 07:02:50 np0005538515.novalocal sudo[12845]: pam_unix(sudo:session): session closed for user root
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: Starting D-Bus User Message Bus...
Nov 28 07:02:50 np0005538515.novalocal dbus-broker-launch[14507]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 07:02:50 np0005538515.novalocal dbus-broker-launch[14507]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: Started D-Bus User Message Bus.
Nov 28 07:02:50 np0005538515.novalocal dbus-broker-lau[14507]: Ready
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: Created slice Slice /user.
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: podman-14354.scope: unit configures an IP firewall, but not running as root.
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: (This warning is only shown for the first unit using IP firewalling.)
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: Started podman-14354.scope.
Nov 28 07:02:50 np0005538515.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:02:50 np0005538515.novalocal systemd[4174]: Started podman-pause-ee9564e6.scope.
Nov 28 07:02:51 np0005538515.novalocal sshd[6552]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:02:51 np0005538515.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Nov 28 07:02:51 np0005538515.novalocal systemd[1]: session-6.scope: Consumed 52.581s CPU time.
Nov 28 07:02:51 np0005538515.novalocal systemd-logind[763]: Session 6 logged out. Waiting for processes to exit.
Nov 28 07:02:51 np0005538515.novalocal systemd-logind[763]: Removed session 6.
Nov 28 07:02:53 np0005538515.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:02:53 np0005538515.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:02:53 np0005538515.novalocal systemd[1]: man-db-cache-update.service: Consumed 9.319s CPU time.
Nov 28 07:02:53 np0005538515.novalocal systemd[1]: run-r866cffe61bf245e8b4b329f1c339457e.service: Deactivated successfully.
Nov 28 07:03:07 np0005538515.novalocal sshd[18432]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:07 np0005538515.novalocal sshd[18433]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:07 np0005538515.novalocal sshd[18431]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:07 np0005538515.novalocal sshd[18434]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:07 np0005538515.novalocal sshd[18435]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:07 np0005538515.novalocal sshd[18431]: Connection closed by 38.102.83.32 port 34524 [preauth]
Nov 28 07:03:07 np0005538515.novalocal sshd[18433]: Unable to negotiate with 38.102.83.32 port 34536: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 28 07:03:07 np0005538515.novalocal sshd[18432]: Connection closed by 38.102.83.32 port 34534 [preauth]
Nov 28 07:03:07 np0005538515.novalocal sshd[18434]: Unable to negotiate with 38.102.83.32 port 34544: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 28 07:03:07 np0005538515.novalocal sshd[18435]: Unable to negotiate with 38.102.83.32 port 34552: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 28 07:03:11 np0005538515.novalocal sshd[18441]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:03:11 np0005538515.novalocal sshd[18441]: Accepted publickey for zuul from 38.102.83.114 port 44396 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:03:11 np0005538515.novalocal systemd-logind[763]: New session 7 of user zuul.
Nov 28 07:03:11 np0005538515.novalocal systemd[1]: Started Session 7 of User zuul.
Nov 28 07:03:11 np0005538515.novalocal sshd[18441]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:03:11 np0005538515.novalocal python3[18458]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCwbdeJV6VDDXadsSf2RG5X7kz/GTOF493/FPhPlXmY8LaEjIgaNVgahbrG06qkZx72vk0TqexyzHBymiNAuWIc= zuul@np0005538507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:03:12 np0005538515.novalocal sudo[18472]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozupfavrxqguicfuwuknvmyqcmejntnp ; /usr/bin/python3
Nov 28 07:03:12 np0005538515.novalocal sudo[18472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:03:12 np0005538515.novalocal python3[18474]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCwbdeJV6VDDXadsSf2RG5X7kz/GTOF493/FPhPlXmY8LaEjIgaNVgahbrG06qkZx72vk0TqexyzHBymiNAuWIc= zuul@np0005538507.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:03:12 np0005538515.novalocal sudo[18472]: pam_unix(sudo:session): session closed for user root
Nov 28 07:03:14 np0005538515.novalocal sshd[18441]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:03:14 np0005538515.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Nov 28 07:03:14 np0005538515.novalocal systemd-logind[763]: Session 7 logged out. Waiting for processes to exit.
Nov 28 07:03:14 np0005538515.novalocal systemd-logind[763]: Removed session 7.
Nov 28 07:04:48 np0005538515.novalocal sshd[18476]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:04:49 np0005538515.novalocal sshd[18476]: Accepted publickey for zuul from 38.102.83.114 port 52374 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:04:49 np0005538515.novalocal systemd-logind[763]: New session 8 of user zuul.
Nov 28 07:04:49 np0005538515.novalocal systemd[1]: Started Session 8 of User zuul.
Nov 28 07:04:49 np0005538515.novalocal sshd[18476]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:04:49 np0005538515.novalocal sudo[18493]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wshbyugezgypnpuzncqnfsohvnincxrx ; /usr/bin/python3
Nov 28 07:04:49 np0005538515.novalocal sudo[18493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:49 np0005538515.novalocal python3[18495]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:04:49 np0005538515.novalocal sudo[18493]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:50 np0005538515.novalocal sudo[18509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylhizbmxqepoxierntugwlixrswiuwsb ; /usr/bin/python3
Nov 28 07:04:50 np0005538515.novalocal sudo[18509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:50 np0005538515.novalocal python3[18511]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538515.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 07:04:50 np0005538515.novalocal sudo[18509]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:51 np0005538515.novalocal sudo[18559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxhiogmxaftjtrfgnmsnoaskgcruoqcq ; /usr/bin/python3
Nov 28 07:04:51 np0005538515.novalocal sudo[18559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:52 np0005538515.novalocal python3[18561]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:52 np0005538515.novalocal sudo[18559]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:52 np0005538515.novalocal sudo[18602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzpzyxykaywttzbxqbenpbmxbidytcpw ; /usr/bin/python3
Nov 28 07:04:52 np0005538515.novalocal sudo[18602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:52 np0005538515.novalocal python3[18604]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764313491.736329-139-40109814909018/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa follow=False checksum=47f6a2f8fa426c1f34aad346f88073a22928af4e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:52 np0005538515.novalocal sudo[18602]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:53 np0005538515.novalocal sudo[18665]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzarkleitmntbqmfhpdzvdyskuhzzqkd ; /usr/bin/python3
Nov 28 07:04:53 np0005538515.novalocal sudo[18665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:53 np0005538515.novalocal python3[18667]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:53 np0005538515.novalocal sudo[18665]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:53 np0005538515.novalocal sudo[18708]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwoneuawnsnqcobmujpyekaylpzcyemb ; /usr/bin/python3
Nov 28 07:04:53 np0005538515.novalocal sudo[18708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:53 np0005538515.novalocal python3[18710]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764313493.405782-229-146659391301234/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa.pub follow=False checksum=d1f12d852c72cfefab089d88337552962cfbc93d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:54 np0005538515.novalocal sudo[18708]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:55 np0005538515.novalocal sudo[18738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufopbszoehppxjlhnypucqtyybwatthp ; /usr/bin/python3
Nov 28 07:04:55 np0005538515.novalocal sudo[18738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:04:56 np0005538515.novalocal python3[18740]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:56 np0005538515.novalocal sudo[18738]: pam_unix(sudo:session): session closed for user root
Nov 28 07:04:57 np0005538515.novalocal python3[18786]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:57 np0005538515.novalocal python3[18802]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpjahlwwbq recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:04:58 np0005538515.novalocal python3[18862]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:04:59 np0005538515.novalocal python3[18878]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpvrm7mxis recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:05:00 np0005538515.novalocal python3[18938]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:05:00 np0005538515.novalocal python3[18954]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpzbuw0znv recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:05:01 np0005538515.novalocal sshd[18476]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:05:01 np0005538515.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Nov 28 07:05:01 np0005538515.novalocal systemd[1]: session-8.scope: Consumed 3.519s CPU time.
Nov 28 07:05:01 np0005538515.novalocal systemd-logind[763]: Session 8 logged out. Waiting for processes to exit.
Nov 28 07:05:01 np0005538515.novalocal systemd-logind[763]: Removed session 8.
Nov 28 07:06:01 np0005538515.novalocal anacron[6683]: Job `cron.daily' started
Nov 28 07:06:01 np0005538515.novalocal anacron[6683]: Job `cron.daily' terminated
Nov 28 07:07:24 np0005538515.novalocal sshd[18971]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:07:24 np0005538515.novalocal sshd[18971]: Accepted publickey for zuul from 38.102.83.32 port 52908 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:07:24 np0005538515.novalocal systemd-logind[763]: New session 9 of user zuul.
Nov 28 07:07:24 np0005538515.novalocal systemd[1]: Started Session 9 of User zuul.
Nov 28 07:07:24 np0005538515.novalocal sshd[18971]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:07:25 np0005538515.novalocal python3[19017]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:11:22 np0005538515.novalocal sshd[19020]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:24 np0005538515.novalocal sshd[19020]: Connection closed by authenticating user root 69.5.7.210 port 44646 [preauth]
Nov 28 07:11:24 np0005538515.novalocal sshd[19022]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:25 np0005538515.novalocal sshd[19022]: Connection closed by authenticating user root 69.5.7.210 port 44650 [preauth]
Nov 28 07:11:26 np0005538515.novalocal sshd[19024]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:27 np0005538515.novalocal sshd[19024]: Connection closed by authenticating user root 69.5.7.210 port 44656 [preauth]
Nov 28 07:11:27 np0005538515.novalocal sshd[19026]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:28 np0005538515.novalocal sshd[19026]: Connection closed by authenticating user root 69.5.7.210 port 44662 [preauth]
Nov 28 07:11:29 np0005538515.novalocal sshd[19028]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:30 np0005538515.novalocal sshd[19028]: Connection closed by authenticating user root 69.5.7.210 port 43086 [preauth]
Nov 28 07:11:30 np0005538515.novalocal sshd[19030]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:32 np0005538515.novalocal sshd[19030]: Connection closed by authenticating user root 69.5.7.210 port 43098 [preauth]
Nov 28 07:11:32 np0005538515.novalocal sshd[19032]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:33 np0005538515.novalocal sshd[19032]: Connection closed by authenticating user root 69.5.7.210 port 43106 [preauth]
Nov 28 07:11:33 np0005538515.novalocal sshd[19034]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:35 np0005538515.novalocal sshd[19034]: Connection closed by authenticating user root 69.5.7.210 port 43118 [preauth]
Nov 28 07:11:35 np0005538515.novalocal sshd[19037]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:36 np0005538515.novalocal sshd[19037]: Connection closed by authenticating user root 69.5.7.210 port 43132 [preauth]
Nov 28 07:11:37 np0005538515.novalocal sshd[19039]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:38 np0005538515.novalocal sshd[19039]: Connection closed by authenticating user root 69.5.7.210 port 43138 [preauth]
Nov 28 07:11:38 np0005538515.novalocal sshd[19041]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:40 np0005538515.novalocal sshd[19041]: Connection closed by authenticating user root 69.5.7.210 port 43144 [preauth]
Nov 28 07:11:40 np0005538515.novalocal sshd[19043]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:41 np0005538515.novalocal sshd[19043]: Connection closed by authenticating user root 69.5.7.210 port 38812 [preauth]
Nov 28 07:11:42 np0005538515.novalocal sshd[19045]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:43 np0005538515.novalocal sshd[19045]: Connection closed by authenticating user root 69.5.7.210 port 38818 [preauth]
Nov 28 07:11:43 np0005538515.novalocal sshd[19047]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:44 np0005538515.novalocal sshd[19047]: Connection closed by authenticating user root 69.5.7.210 port 38832 [preauth]
Nov 28 07:11:45 np0005538515.novalocal sshd[19049]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:46 np0005538515.novalocal sshd[19049]: Connection closed by authenticating user root 69.5.7.210 port 38846 [preauth]
Nov 28 07:11:46 np0005538515.novalocal sshd[19051]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:48 np0005538515.novalocal sshd[19051]: Connection closed by authenticating user root 69.5.7.210 port 38854 [preauth]
Nov 28 07:11:48 np0005538515.novalocal sshd[19053]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:49 np0005538515.novalocal sshd[19053]: Connection closed by authenticating user root 69.5.7.210 port 38862 [preauth]
Nov 28 07:11:49 np0005538515.novalocal sshd[19055]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:51 np0005538515.novalocal sshd[19055]: Connection closed by authenticating user root 69.5.7.210 port 49766 [preauth]
Nov 28 07:11:51 np0005538515.novalocal sshd[19057]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:52 np0005538515.novalocal sshd[19057]: Connection closed by authenticating user root 69.5.7.210 port 49774 [preauth]
Nov 28 07:11:53 np0005538515.novalocal sshd[19059]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:54 np0005538515.novalocal sshd[19059]: Connection closed by authenticating user root 69.5.7.210 port 49782 [preauth]
Nov 28 07:11:54 np0005538515.novalocal sshd[19061]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:55 np0005538515.novalocal sshd[19061]: Connection closed by authenticating user root 69.5.7.210 port 49788 [preauth]
Nov 28 07:11:56 np0005538515.novalocal sshd[19063]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:57 np0005538515.novalocal sshd[19063]: Connection closed by authenticating user root 69.5.7.210 port 49804 [preauth]
Nov 28 07:11:57 np0005538515.novalocal sshd[19065]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:11:59 np0005538515.novalocal sshd[19065]: Connection closed by authenticating user root 69.5.7.210 port 49812 [preauth]
Nov 28 07:12:00 np0005538515.novalocal sshd[19067]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:01 np0005538515.novalocal sshd[19067]: Connection closed by authenticating user root 69.5.7.210 port 55900 [preauth]
Nov 28 07:12:01 np0005538515.novalocal sshd[19069]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:03 np0005538515.novalocal sshd[19069]: Connection closed by authenticating user root 69.5.7.210 port 55914 [preauth]
Nov 28 07:12:03 np0005538515.novalocal sshd[19071]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:04 np0005538515.novalocal sshd[19071]: Connection closed by authenticating user root 69.5.7.210 port 55926 [preauth]
Nov 28 07:12:05 np0005538515.novalocal sshd[19073]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:06 np0005538515.novalocal sshd[19073]: Connection closed by authenticating user root 69.5.7.210 port 55928 [preauth]
Nov 28 07:12:06 np0005538515.novalocal sshd[19075]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:07 np0005538515.novalocal sshd[19075]: Connection closed by authenticating user root 69.5.7.210 port 55938 [preauth]
Nov 28 07:12:08 np0005538515.novalocal sshd[19077]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:09 np0005538515.novalocal sshd[19077]: Connection closed by authenticating user root 69.5.7.210 port 55940 [preauth]
Nov 28 07:12:09 np0005538515.novalocal sshd[19079]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:11 np0005538515.novalocal sshd[19079]: Connection closed by authenticating user root 69.5.7.210 port 43172 [preauth]
Nov 28 07:12:11 np0005538515.novalocal sshd[19081]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:12 np0005538515.novalocal sshd[19081]: Connection closed by authenticating user root 69.5.7.210 port 43186 [preauth]
Nov 28 07:12:13 np0005538515.novalocal sshd[19083]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:14 np0005538515.novalocal sshd[19083]: Connection closed by authenticating user root 69.5.7.210 port 43190 [preauth]
Nov 28 07:12:14 np0005538515.novalocal sshd[19085]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:15 np0005538515.novalocal sshd[19085]: Connection closed by authenticating user root 69.5.7.210 port 43206 [preauth]
Nov 28 07:12:16 np0005538515.novalocal sshd[19087]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:17 np0005538515.novalocal sshd[19087]: Connection closed by authenticating user root 69.5.7.210 port 43214 [preauth]
Nov 28 07:12:17 np0005538515.novalocal sshd[19089]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:19 np0005538515.novalocal sshd[19089]: Connection closed by authenticating user root 69.5.7.210 port 43222 [preauth]
Nov 28 07:12:19 np0005538515.novalocal sshd[19091]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:20 np0005538515.novalocal sshd[19091]: Connection closed by authenticating user root 69.5.7.210 port 55698 [preauth]
Nov 28 07:12:20 np0005538515.novalocal sshd[19093]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:22 np0005538515.novalocal sshd[19093]: Connection closed by authenticating user root 69.5.7.210 port 55710 [preauth]
Nov 28 07:12:22 np0005538515.novalocal sshd[19095]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:23 np0005538515.novalocal sshd[19095]: Connection closed by authenticating user root 69.5.7.210 port 55712 [preauth]
Nov 28 07:12:24 np0005538515.novalocal sshd[19097]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:24 np0005538515.novalocal sshd[18974]: Received disconnect from 38.102.83.32 port 52908:11: disconnected by user
Nov 28 07:12:24 np0005538515.novalocal sshd[18974]: Disconnected from user zuul 38.102.83.32 port 52908
Nov 28 07:12:24 np0005538515.novalocal sshd[18971]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:12:24 np0005538515.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Nov 28 07:12:24 np0005538515.novalocal systemd-logind[763]: Session 9 logged out. Waiting for processes to exit.
Nov 28 07:12:24 np0005538515.novalocal systemd-logind[763]: Removed session 9.
Nov 28 07:12:25 np0005538515.novalocal sshd[19097]: Connection closed by authenticating user root 69.5.7.210 port 55728 [preauth]
Nov 28 07:12:25 np0005538515.novalocal sshd[19100]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:27 np0005538515.novalocal sshd[19100]: Connection closed by authenticating user root 69.5.7.210 port 55740 [preauth]
Nov 28 07:12:27 np0005538515.novalocal sshd[19102]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:28 np0005538515.novalocal sshd[19102]: Connection closed by authenticating user root 69.5.7.210 port 55744 [preauth]
Nov 28 07:12:28 np0005538515.novalocal sshd[19104]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:30 np0005538515.novalocal sshd[19104]: Connection closed by authenticating user root 69.5.7.210 port 55752 [preauth]
Nov 28 07:12:30 np0005538515.novalocal sshd[19106]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:31 np0005538515.novalocal sshd[19106]: Connection closed by authenticating user root 69.5.7.210 port 35228 [preauth]
Nov 28 07:12:32 np0005538515.novalocal sshd[19108]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:33 np0005538515.novalocal sshd[19108]: Connection closed by authenticating user root 69.5.7.210 port 35232 [preauth]
Nov 28 07:12:33 np0005538515.novalocal sshd[19110]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:35 np0005538515.novalocal sshd[19110]: Connection closed by authenticating user root 69.5.7.210 port 35244 [preauth]
Nov 28 07:12:35 np0005538515.novalocal sshd[19112]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:36 np0005538515.novalocal sshd[19112]: Connection closed by authenticating user root 69.5.7.210 port 35250 [preauth]
Nov 28 07:12:37 np0005538515.novalocal sshd[19114]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:38 np0005538515.novalocal sshd[19114]: Connection closed by authenticating user root 69.5.7.210 port 35252 [preauth]
Nov 28 07:12:38 np0005538515.novalocal sshd[19116]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:40 np0005538515.novalocal sshd[19116]: Connection closed by authenticating user root 69.5.7.210 port 35260 [preauth]
Nov 28 07:12:40 np0005538515.novalocal sshd[19118]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:41 np0005538515.novalocal sshd[19118]: Connection closed by authenticating user root 69.5.7.210 port 55716 [preauth]
Nov 28 07:12:41 np0005538515.novalocal sshd[19120]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:43 np0005538515.novalocal sshd[19120]: Connection closed by authenticating user root 69.5.7.210 port 55726 [preauth]
Nov 28 07:12:43 np0005538515.novalocal sshd[19122]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:44 np0005538515.novalocal sshd[19122]: Connection closed by authenticating user root 69.5.7.210 port 55730 [preauth]
Nov 28 07:12:44 np0005538515.novalocal sshd[19124]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:46 np0005538515.novalocal sshd[19124]: Connection closed by authenticating user root 69.5.7.210 port 55732 [preauth]
Nov 28 07:12:46 np0005538515.novalocal sshd[19126]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:47 np0005538515.novalocal sshd[19126]: Connection closed by authenticating user root 69.5.7.210 port 55748 [preauth]
Nov 28 07:12:48 np0005538515.novalocal sshd[19128]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:49 np0005538515.novalocal sshd[19128]: Connection closed by authenticating user root 69.5.7.210 port 55750 [preauth]
Nov 28 07:12:49 np0005538515.novalocal sshd[19130]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:51 np0005538515.novalocal sshd[19130]: Connection closed by authenticating user root 69.5.7.210 port 49252 [preauth]
Nov 28 07:12:51 np0005538515.novalocal sshd[19132]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:52 np0005538515.novalocal sshd[19132]: Connection closed by authenticating user root 69.5.7.210 port 49268 [preauth]
Nov 28 07:12:52 np0005538515.novalocal sshd[19134]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:54 np0005538515.novalocal sshd[19134]: Connection closed by authenticating user root 69.5.7.210 port 49284 [preauth]
Nov 28 07:12:54 np0005538515.novalocal sshd[19136]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:55 np0005538515.novalocal sshd[19136]: Connection closed by authenticating user root 69.5.7.210 port 49290 [preauth]
Nov 28 07:12:56 np0005538515.novalocal sshd[19138]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:57 np0005538515.novalocal sshd[19138]: Connection closed by authenticating user root 69.5.7.210 port 49300 [preauth]
Nov 28 07:12:57 np0005538515.novalocal sshd[19140]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:12:58 np0005538515.novalocal sshd[19140]: Connection closed by authenticating user root 69.5.7.210 port 49306 [preauth]
Nov 28 07:12:59 np0005538515.novalocal sshd[19142]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:00 np0005538515.novalocal sshd[19142]: Connection closed by authenticating user root 69.5.7.210 port 60768 [preauth]
Nov 28 07:13:00 np0005538515.novalocal sshd[19144]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:02 np0005538515.novalocal sshd[19144]: Connection closed by authenticating user root 69.5.7.210 port 60772 [preauth]
Nov 28 07:13:02 np0005538515.novalocal sshd[19146]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:03 np0005538515.novalocal sshd[19146]: Connection closed by authenticating user root 69.5.7.210 port 60788 [preauth]
Nov 28 07:13:03 np0005538515.novalocal sshd[19148]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:05 np0005538515.novalocal sshd[19148]: Connection closed by authenticating user root 69.5.7.210 port 60792 [preauth]
Nov 28 07:13:05 np0005538515.novalocal sshd[19150]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:06 np0005538515.novalocal sshd[19150]: Connection closed by authenticating user root 69.5.7.210 port 60804 [preauth]
Nov 28 07:13:07 np0005538515.novalocal sshd[19152]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:08 np0005538515.novalocal sshd[19152]: Connection closed by authenticating user root 69.5.7.210 port 60820 [preauth]
Nov 28 07:13:08 np0005538515.novalocal sshd[19154]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:09 np0005538515.novalocal sshd[19154]: Connection closed by authenticating user root 69.5.7.210 port 60826 [preauth]
Nov 28 07:13:10 np0005538515.novalocal sshd[19156]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:11 np0005538515.novalocal sshd[19156]: Connection closed by authenticating user root 69.5.7.210 port 50978 [preauth]
Nov 28 07:13:11 np0005538515.novalocal sshd[19158]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:13 np0005538515.novalocal sshd[19158]: Connection closed by authenticating user root 69.5.7.210 port 50986 [preauth]
Nov 28 07:13:13 np0005538515.novalocal sshd[19160]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:14 np0005538515.novalocal sshd[19160]: Connection closed by authenticating user root 69.5.7.210 port 50996 [preauth]
Nov 28 07:13:14 np0005538515.novalocal sshd[19162]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:16 np0005538515.novalocal sshd[19162]: Connection closed by authenticating user root 69.5.7.210 port 51000 [preauth]
Nov 28 07:13:16 np0005538515.novalocal sshd[19164]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:17 np0005538515.novalocal sshd[19164]: Connection closed by authenticating user root 69.5.7.210 port 51010 [preauth]
Nov 28 07:13:18 np0005538515.novalocal sshd[19166]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:19 np0005538515.novalocal sshd[19166]: Connection closed by authenticating user root 69.5.7.210 port 51018 [preauth]
Nov 28 07:13:19 np0005538515.novalocal sshd[19168]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:21 np0005538515.novalocal sshd[19168]: Connection closed by authenticating user root 69.5.7.210 port 33126 [preauth]
Nov 28 07:13:21 np0005538515.novalocal sshd[19170]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:22 np0005538515.novalocal sshd[19170]: Connection closed by authenticating user root 69.5.7.210 port 33136 [preauth]
Nov 28 07:13:22 np0005538515.novalocal sshd[19172]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:24 np0005538515.novalocal sshd[19172]: Connection closed by authenticating user root 69.5.7.210 port 33140 [preauth]
Nov 28 07:13:24 np0005538515.novalocal sshd[19174]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:25 np0005538515.novalocal sshd[19174]: Connection closed by authenticating user root 69.5.7.210 port 33156 [preauth]
Nov 28 07:13:26 np0005538515.novalocal sshd[19176]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:27 np0005538515.novalocal sshd[19176]: Connection closed by authenticating user root 69.5.7.210 port 33158 [preauth]
Nov 28 07:13:27 np0005538515.novalocal sshd[19178]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:28 np0005538515.novalocal sshd[19178]: Connection closed by authenticating user root 69.5.7.210 port 33160 [preauth]
Nov 28 07:13:29 np0005538515.novalocal sshd[19180]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:30 np0005538515.novalocal sshd[19180]: Connection closed by authenticating user root 69.5.7.210 port 47116 [preauth]
Nov 28 07:13:30 np0005538515.novalocal sshd[19182]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:32 np0005538515.novalocal sshd[19182]: Connection closed by authenticating user root 69.5.7.210 port 47120 [preauth]
Nov 28 07:13:32 np0005538515.novalocal sshd[19184]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:33 np0005538515.novalocal sshd[19184]: Connection closed by authenticating user root 69.5.7.210 port 47130 [preauth]
Nov 28 07:13:34 np0005538515.novalocal sshd[19186]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:35 np0005538515.novalocal sshd[19186]: Invalid user user from 69.5.7.210 port 47140
Nov 28 07:13:36 np0005538515.novalocal sshd[19186]: Connection closed by invalid user user 69.5.7.210 port 47140 [preauth]
Nov 28 07:13:36 np0005538515.novalocal sshd[19188]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:37 np0005538515.novalocal sshd[19188]: Invalid user user from 69.5.7.210 port 47142
Nov 28 07:13:37 np0005538515.novalocal sshd[19188]: Connection closed by invalid user user 69.5.7.210 port 47142 [preauth]
Nov 28 07:13:38 np0005538515.novalocal sshd[19190]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:39 np0005538515.novalocal sshd[19190]: Invalid user user from 69.5.7.210 port 47150
Nov 28 07:13:39 np0005538515.novalocal sshd[19190]: Connection closed by invalid user user 69.5.7.210 port 47150 [preauth]
Nov 28 07:13:39 np0005538515.novalocal sshd[19192]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:40 np0005538515.novalocal sshd[19192]: Invalid user user from 69.5.7.210 port 59548
Nov 28 07:13:40 np0005538515.novalocal sshd[19192]: Connection closed by invalid user user 69.5.7.210 port 59548 [preauth]
Nov 28 07:13:41 np0005538515.novalocal sshd[19194]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:42 np0005538515.novalocal sshd[19194]: Invalid user user from 69.5.7.210 port 59558
Nov 28 07:13:42 np0005538515.novalocal sshd[19194]: Connection closed by invalid user user 69.5.7.210 port 59558 [preauth]
Nov 28 07:13:42 np0005538515.novalocal sshd[19196]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:43 np0005538515.novalocal sshd[19196]: Invalid user user from 69.5.7.210 port 59570
Nov 28 07:13:43 np0005538515.novalocal sshd[19196]: Connection closed by invalid user user 69.5.7.210 port 59570 [preauth]
Nov 28 07:13:44 np0005538515.novalocal sshd[19198]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:45 np0005538515.novalocal sshd[19198]: Invalid user user from 69.5.7.210 port 59572
Nov 28 07:13:45 np0005538515.novalocal sshd[19198]: Connection closed by invalid user user 69.5.7.210 port 59572 [preauth]
Nov 28 07:13:45 np0005538515.novalocal sshd[19200]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:46 np0005538515.novalocal sshd[19200]: Invalid user user from 69.5.7.210 port 59586
Nov 28 07:13:47 np0005538515.novalocal sshd[19200]: Connection closed by invalid user user 69.5.7.210 port 59586 [preauth]
Nov 28 07:13:47 np0005538515.novalocal sshd[19202]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:48 np0005538515.novalocal sshd[19202]: Invalid user user from 69.5.7.210 port 59596
Nov 28 07:13:48 np0005538515.novalocal sshd[19202]: Connection closed by invalid user user 69.5.7.210 port 59596 [preauth]
Nov 28 07:13:48 np0005538515.novalocal sshd[19204]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:50 np0005538515.novalocal sshd[19204]: Invalid user user from 69.5.7.210 port 59600
Nov 28 07:13:50 np0005538515.novalocal sshd[19204]: Connection closed by invalid user user 69.5.7.210 port 59600 [preauth]
Nov 28 07:13:50 np0005538515.novalocal sshd[19206]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:51 np0005538515.novalocal sshd[19206]: Invalid user user from 69.5.7.210 port 57452
Nov 28 07:13:51 np0005538515.novalocal sshd[19206]: Connection closed by invalid user user 69.5.7.210 port 57452 [preauth]
Nov 28 07:13:52 np0005538515.novalocal sshd[19208]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:53 np0005538515.novalocal sshd[19208]: Invalid user user from 69.5.7.210 port 57460
Nov 28 07:13:53 np0005538515.novalocal sshd[19208]: Connection closed by invalid user user 69.5.7.210 port 57460 [preauth]
Nov 28 07:13:53 np0005538515.novalocal sshd[19210]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:54 np0005538515.novalocal sshd[19210]: Invalid user user from 69.5.7.210 port 57474
Nov 28 07:13:55 np0005538515.novalocal sshd[19210]: Connection closed by invalid user user 69.5.7.210 port 57474 [preauth]
Nov 28 07:13:55 np0005538515.novalocal sshd[19212]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:56 np0005538515.novalocal sshd[19212]: Invalid user user from 69.5.7.210 port 57482
Nov 28 07:13:56 np0005538515.novalocal sshd[19212]: Connection closed by invalid user user 69.5.7.210 port 57482 [preauth]
Nov 28 07:13:56 np0005538515.novalocal sshd[19214]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:58 np0005538515.novalocal sshd[19214]: Invalid user user from 69.5.7.210 port 57484
Nov 28 07:13:58 np0005538515.novalocal sshd[19214]: Connection closed by invalid user user 69.5.7.210 port 57484 [preauth]
Nov 28 07:13:58 np0005538515.novalocal sshd[19216]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:13:59 np0005538515.novalocal sshd[19216]: Invalid user user from 69.5.7.210 port 57496
Nov 28 07:13:59 np0005538515.novalocal sshd[19216]: Connection closed by invalid user user 69.5.7.210 port 57496 [preauth]
Nov 28 07:14:00 np0005538515.novalocal sshd[19218]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:01 np0005538515.novalocal sshd[19218]: Invalid user user from 69.5.7.210 port 42236
Nov 28 07:14:01 np0005538515.novalocal sshd[19218]: Connection closed by invalid user user 69.5.7.210 port 42236 [preauth]
Nov 28 07:14:01 np0005538515.novalocal sshd[19220]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:02 np0005538515.novalocal sshd[19220]: Invalid user user from 69.5.7.210 port 42240
Nov 28 07:14:03 np0005538515.novalocal sshd[19220]: Connection closed by invalid user user 69.5.7.210 port 42240 [preauth]
Nov 28 07:14:03 np0005538515.novalocal sshd[19222]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:04 np0005538515.novalocal sshd[19222]: Invalid user user from 69.5.7.210 port 42256
Nov 28 07:14:04 np0005538515.novalocal sshd[19222]: Connection closed by invalid user user 69.5.7.210 port 42256 [preauth]
Nov 28 07:14:04 np0005538515.novalocal sshd[19224]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:06 np0005538515.novalocal sshd[19224]: Invalid user user from 69.5.7.210 port 42258
Nov 28 07:14:06 np0005538515.novalocal sshd[19224]: Connection closed by invalid user user 69.5.7.210 port 42258 [preauth]
Nov 28 07:14:06 np0005538515.novalocal sshd[19226]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:07 np0005538515.novalocal sshd[19226]: Invalid user user from 69.5.7.210 port 42264
Nov 28 07:14:07 np0005538515.novalocal sshd[19226]: Connection closed by invalid user user 69.5.7.210 port 42264 [preauth]
Nov 28 07:14:08 np0005538515.novalocal sshd[19228]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:09 np0005538515.novalocal sshd[19228]: Invalid user user from 69.5.7.210 port 42266
Nov 28 07:14:09 np0005538515.novalocal sshd[19228]: Connection closed by invalid user user 69.5.7.210 port 42266 [preauth]
Nov 28 07:14:09 np0005538515.novalocal sshd[19230]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:10 np0005538515.novalocal sshd[19230]: Invalid user user from 69.5.7.210 port 49134
Nov 28 07:14:11 np0005538515.novalocal sshd[19230]: Connection closed by invalid user user 69.5.7.210 port 49134 [preauth]
Nov 28 07:14:11 np0005538515.novalocal sshd[19232]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:12 np0005538515.novalocal sshd[19232]: Invalid user user from 69.5.7.210 port 49144
Nov 28 07:14:12 np0005538515.novalocal sshd[19232]: Connection closed by invalid user user 69.5.7.210 port 49144 [preauth]
Nov 28 07:14:12 np0005538515.novalocal sshd[19234]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:13 np0005538515.novalocal sshd[19234]: Invalid user user from 69.5.7.210 port 49152
Nov 28 07:14:14 np0005538515.novalocal sshd[19234]: Connection closed by invalid user user 69.5.7.210 port 49152 [preauth]
Nov 28 07:14:14 np0005538515.novalocal sshd[19236]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:15 np0005538515.novalocal sshd[19236]: Invalid user user from 69.5.7.210 port 49158
Nov 28 07:14:15 np0005538515.novalocal sshd[19236]: Connection closed by invalid user user 69.5.7.210 port 49158 [preauth]
Nov 28 07:14:16 np0005538515.novalocal sshd[19238]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:17 np0005538515.novalocal sshd[19238]: Invalid user user from 69.5.7.210 port 49172
Nov 28 07:14:17 np0005538515.novalocal sshd[19238]: Connection closed by invalid user user 69.5.7.210 port 49172 [preauth]
Nov 28 07:14:17 np0005538515.novalocal sshd[19240]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:18 np0005538515.novalocal sshd[19240]: Invalid user user from 69.5.7.210 port 49186
Nov 28 07:14:18 np0005538515.novalocal sshd[19240]: Connection closed by invalid user user 69.5.7.210 port 49186 [preauth]
Nov 28 07:14:19 np0005538515.novalocal sshd[19242]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:20 np0005538515.novalocal sshd[19242]: Invalid user user from 69.5.7.210 port 50928
Nov 28 07:14:20 np0005538515.novalocal sshd[19242]: Connection closed by invalid user user 69.5.7.210 port 50928 [preauth]
Nov 28 07:14:20 np0005538515.novalocal sshd[19244]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:21 np0005538515.novalocal sshd[19244]: Invalid user user from 69.5.7.210 port 50942
Nov 28 07:14:22 np0005538515.novalocal sshd[19244]: Connection closed by invalid user user 69.5.7.210 port 50942 [preauth]
Nov 28 07:14:22 np0005538515.novalocal sshd[19246]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:23 np0005538515.novalocal sshd[19246]: Invalid user user from 69.5.7.210 port 50954
Nov 28 07:14:23 np0005538515.novalocal sshd[19246]: Connection closed by invalid user user 69.5.7.210 port 50954 [preauth]
Nov 28 07:14:23 np0005538515.novalocal sshd[19248]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:25 np0005538515.novalocal sshd[19248]: Invalid user user from 69.5.7.210 port 50960
Nov 28 07:14:25 np0005538515.novalocal sshd[19248]: Connection closed by invalid user user 69.5.7.210 port 50960 [preauth]
Nov 28 07:14:25 np0005538515.novalocal sshd[19250]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:26 np0005538515.novalocal sshd[19250]: Invalid user user from 69.5.7.210 port 50964
Nov 28 07:14:26 np0005538515.novalocal sshd[19250]: Connection closed by invalid user user 69.5.7.210 port 50964 [preauth]
Nov 28 07:14:27 np0005538515.novalocal sshd[19252]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:28 np0005538515.novalocal sshd[19252]: Invalid user user from 69.5.7.210 port 50966
Nov 28 07:14:28 np0005538515.novalocal sshd[19252]: Connection closed by invalid user user 69.5.7.210 port 50966 [preauth]
Nov 28 07:14:28 np0005538515.novalocal sshd[19254]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:29 np0005538515.novalocal sshd[19254]: Invalid user user from 69.5.7.210 port 50970
Nov 28 07:14:30 np0005538515.novalocal sshd[19254]: Connection closed by invalid user user 69.5.7.210 port 50970 [preauth]
Nov 28 07:14:30 np0005538515.novalocal sshd[19256]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:31 np0005538515.novalocal sshd[19256]: Invalid user user from 69.5.7.210 port 38994
Nov 28 07:14:31 np0005538515.novalocal sshd[19256]: Connection closed by invalid user user 69.5.7.210 port 38994 [preauth]
Nov 28 07:14:31 np0005538515.novalocal sshd[19258]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:32 np0005538515.novalocal sshd[19258]: Invalid user user from 69.5.7.210 port 39004
Nov 28 07:14:33 np0005538515.novalocal sshd[19258]: Connection closed by invalid user user 69.5.7.210 port 39004 [preauth]
Nov 28 07:14:33 np0005538515.novalocal sshd[19260]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:34 np0005538515.novalocal sshd[19260]: Invalid user user from 69.5.7.210 port 39016
Nov 28 07:14:34 np0005538515.novalocal sshd[19260]: Connection closed by invalid user user 69.5.7.210 port 39016 [preauth]
Nov 28 07:14:35 np0005538515.novalocal sshd[19262]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:36 np0005538515.novalocal sshd[19262]: Invalid user user from 69.5.7.210 port 39026
Nov 28 07:14:36 np0005538515.novalocal sshd[19262]: Connection closed by invalid user user 69.5.7.210 port 39026 [preauth]
Nov 28 07:14:36 np0005538515.novalocal sshd[19264]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:37 np0005538515.novalocal sshd[19264]: Invalid user user from 69.5.7.210 port 39040
Nov 28 07:14:38 np0005538515.novalocal sshd[19264]: Connection closed by invalid user user 69.5.7.210 port 39040 [preauth]
Nov 28 07:14:38 np0005538515.novalocal sshd[19266]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:39 np0005538515.novalocal sshd[19266]: Invalid user user from 69.5.7.210 port 39042
Nov 28 07:14:39 np0005538515.novalocal sshd[19266]: Connection closed by invalid user user 69.5.7.210 port 39042 [preauth]
Nov 28 07:14:39 np0005538515.novalocal sshd[19268]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:41 np0005538515.novalocal sshd[19268]: Invalid user user from 69.5.7.210 port 55422
Nov 28 07:14:41 np0005538515.novalocal sshd[19268]: Connection closed by invalid user user 69.5.7.210 port 55422 [preauth]
Nov 28 07:14:41 np0005538515.novalocal sshd[19270]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:42 np0005538515.novalocal sshd[19270]: Invalid user user from 69.5.7.210 port 55424
Nov 28 07:14:42 np0005538515.novalocal sshd[19270]: Connection closed by invalid user user 69.5.7.210 port 55424 [preauth]
Nov 28 07:14:43 np0005538515.novalocal sshd[19272]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:44 np0005538515.novalocal sshd[19272]: Invalid user user from 69.5.7.210 port 55430
Nov 28 07:14:44 np0005538515.novalocal sshd[19272]: Connection closed by invalid user user 69.5.7.210 port 55430 [preauth]
Nov 28 07:14:44 np0005538515.novalocal sshd[19274]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:45 np0005538515.novalocal sshd[19274]: Invalid user user from 69.5.7.210 port 55446
Nov 28 07:14:46 np0005538515.novalocal sshd[19274]: Connection closed by invalid user user 69.5.7.210 port 55446 [preauth]
Nov 28 07:14:46 np0005538515.novalocal sshd[19276]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:47 np0005538515.novalocal sshd[19276]: Invalid user user from 69.5.7.210 port 55462
Nov 28 07:14:47 np0005538515.novalocal sshd[19276]: Connection closed by invalid user user 69.5.7.210 port 55462 [preauth]
Nov 28 07:14:47 np0005538515.novalocal sshd[19278]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:49 np0005538515.novalocal sshd[19278]: Invalid user user from 69.5.7.210 port 55472
Nov 28 07:14:49 np0005538515.novalocal sshd[19278]: Connection closed by invalid user user 69.5.7.210 port 55472 [preauth]
Nov 28 07:14:49 np0005538515.novalocal sshd[19280]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:50 np0005538515.novalocal sshd[19280]: Invalid user user from 69.5.7.210 port 48312
Nov 28 07:14:50 np0005538515.novalocal sshd[19280]: Connection closed by invalid user user 69.5.7.210 port 48312 [preauth]
Nov 28 07:14:51 np0005538515.novalocal sshd[19282]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:52 np0005538515.novalocal sshd[19282]: Invalid user user from 69.5.7.210 port 48320
Nov 28 07:14:52 np0005538515.novalocal sshd[19282]: Connection closed by invalid user user 69.5.7.210 port 48320 [preauth]
Nov 28 07:14:52 np0005538515.novalocal sshd[19284]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:53 np0005538515.novalocal sshd[19284]: Invalid user user from 69.5.7.210 port 48326
Nov 28 07:14:54 np0005538515.novalocal sshd[19284]: Connection closed by invalid user user 69.5.7.210 port 48326 [preauth]
Nov 28 07:14:54 np0005538515.novalocal sshd[19286]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:55 np0005538515.novalocal sshd[19286]: Invalid user user from 69.5.7.210 port 48332
Nov 28 07:14:55 np0005538515.novalocal sshd[19286]: Connection closed by invalid user user 69.5.7.210 port 48332 [preauth]
Nov 28 07:14:56 np0005538515.novalocal sshd[19288]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:57 np0005538515.novalocal sshd[19288]: Invalid user user from 69.5.7.210 port 48346
Nov 28 07:14:57 np0005538515.novalocal sshd[19288]: Connection closed by invalid user user 69.5.7.210 port 48346 [preauth]
Nov 28 07:14:57 np0005538515.novalocal sshd[19290]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:14:58 np0005538515.novalocal sshd[19290]: Invalid user user from 69.5.7.210 port 48356
Nov 28 07:14:59 np0005538515.novalocal sshd[19290]: Connection closed by invalid user user 69.5.7.210 port 48356 [preauth]
Nov 28 07:14:59 np0005538515.novalocal sshd[19292]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:00 np0005538515.novalocal sshd[19292]: Invalid user user from 69.5.7.210 port 51914
Nov 28 07:15:00 np0005538515.novalocal sshd[19292]: Connection closed by invalid user user 69.5.7.210 port 51914 [preauth]
Nov 28 07:15:00 np0005538515.novalocal sshd[19294]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:01 np0005538515.novalocal sshd[19294]: Invalid user user from 69.5.7.210 port 51926
Nov 28 07:15:02 np0005538515.novalocal sshd[19294]: Connection closed by invalid user user 69.5.7.210 port 51926 [preauth]
Nov 28 07:15:02 np0005538515.novalocal sshd[19296]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:03 np0005538515.novalocal sshd[19296]: Invalid user user from 69.5.7.210 port 51936
Nov 28 07:15:03 np0005538515.novalocal sshd[19296]: Connection closed by invalid user user 69.5.7.210 port 51936 [preauth]
Nov 28 07:15:04 np0005538515.novalocal sshd[19298]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:05 np0005538515.novalocal sshd[19298]: Invalid user user from 69.5.7.210 port 51950
Nov 28 07:15:05 np0005538515.novalocal sshd[19298]: Connection closed by invalid user user 69.5.7.210 port 51950 [preauth]
Nov 28 07:15:05 np0005538515.novalocal sshd[19300]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:06 np0005538515.novalocal sshd[19300]: Invalid user user from 69.5.7.210 port 51966
Nov 28 07:15:06 np0005538515.novalocal sshd[19300]: Connection closed by invalid user user 69.5.7.210 port 51966 [preauth]
Nov 28 07:15:07 np0005538515.novalocal sshd[19302]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:08 np0005538515.novalocal sshd[19302]: Invalid user user from 69.5.7.210 port 51980
Nov 28 07:15:08 np0005538515.novalocal sshd[19302]: Connection closed by invalid user user 69.5.7.210 port 51980 [preauth]
Nov 28 07:15:08 np0005538515.novalocal sshd[19304]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:09 np0005538515.novalocal sshd[19304]: Invalid user user from 69.5.7.210 port 51986
Nov 28 07:15:10 np0005538515.novalocal sshd[19304]: Connection closed by invalid user user 69.5.7.210 port 51986 [preauth]
Nov 28 07:15:10 np0005538515.novalocal sshd[19306]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:11 np0005538515.novalocal sshd[19306]: Invalid user user from 69.5.7.210 port 36896
Nov 28 07:15:11 np0005538515.novalocal sshd[19306]: Connection closed by invalid user user 69.5.7.210 port 36896 [preauth]
Nov 28 07:15:12 np0005538515.novalocal sshd[19308]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:13 np0005538515.novalocal sshd[19308]: Invalid user user from 69.5.7.210 port 36910
Nov 28 07:15:13 np0005538515.novalocal sshd[19308]: Connection closed by invalid user user 69.5.7.210 port 36910 [preauth]
Nov 28 07:15:13 np0005538515.novalocal sshd[19310]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:14 np0005538515.novalocal sshd[19310]: Invalid user user from 69.5.7.210 port 36916
Nov 28 07:15:15 np0005538515.novalocal sshd[19310]: Connection closed by invalid user user 69.5.7.210 port 36916 [preauth]
Nov 28 07:15:15 np0005538515.novalocal sshd[19312]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:16 np0005538515.novalocal sshd[19312]: Invalid user user from 69.5.7.210 port 36926
Nov 28 07:15:16 np0005538515.novalocal sshd[19312]: Connection closed by invalid user user 69.5.7.210 port 36926 [preauth]
Nov 28 07:15:16 np0005538515.novalocal sshd[19314]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:17 np0005538515.novalocal sshd[19314]: Invalid user user from 69.5.7.210 port 36934
Nov 28 07:15:18 np0005538515.novalocal sshd[19314]: Connection closed by invalid user user 69.5.7.210 port 36934 [preauth]
Nov 28 07:15:18 np0005538515.novalocal sshd[19316]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:19 np0005538515.novalocal sshd[19316]: Invalid user user from 69.5.7.210 port 36948
Nov 28 07:15:19 np0005538515.novalocal sshd[19316]: Connection closed by invalid user user 69.5.7.210 port 36948 [preauth]
Nov 28 07:15:20 np0005538515.novalocal sshd[19318]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:21 np0005538515.novalocal sshd[19318]: Invalid user user from 69.5.7.210 port 51226
Nov 28 07:15:21 np0005538515.novalocal sshd[19318]: Connection closed by invalid user user 69.5.7.210 port 51226 [preauth]
Nov 28 07:15:21 np0005538515.novalocal sshd[19320]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:22 np0005538515.novalocal sshd[19320]: Invalid user user from 69.5.7.210 port 51232
Nov 28 07:15:22 np0005538515.novalocal sshd[19320]: Connection closed by invalid user user 69.5.7.210 port 51232 [preauth]
Nov 28 07:15:23 np0005538515.novalocal sshd[19322]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:24 np0005538515.novalocal sshd[19322]: Invalid user user from 69.5.7.210 port 51236
Nov 28 07:15:24 np0005538515.novalocal sshd[19322]: Connection closed by invalid user user 69.5.7.210 port 51236 [preauth]
Nov 28 07:15:24 np0005538515.novalocal sshd[19324]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:25 np0005538515.novalocal sshd[19324]: Invalid user user from 69.5.7.210 port 51244
Nov 28 07:15:26 np0005538515.novalocal sshd[19324]: Connection closed by invalid user user 69.5.7.210 port 51244 [preauth]
Nov 28 07:15:26 np0005538515.novalocal sshd[19326]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:27 np0005538515.novalocal sshd[19326]: Invalid user user from 69.5.7.210 port 51250
Nov 28 07:15:27 np0005538515.novalocal sshd[19326]: Connection closed by invalid user user 69.5.7.210 port 51250 [preauth]
Nov 28 07:15:27 np0005538515.novalocal sshd[19328]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:28 np0005538515.novalocal sshd[19328]: Invalid user user from 69.5.7.210 port 51266
Nov 28 07:15:29 np0005538515.novalocal sshd[19328]: Connection closed by invalid user user 69.5.7.210 port 51266 [preauth]
Nov 28 07:15:29 np0005538515.novalocal sshd[19330]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:30 np0005538515.novalocal sshd[19330]: Invalid user user from 69.5.7.210 port 36806
Nov 28 07:15:30 np0005538515.novalocal sshd[19330]: Connection closed by invalid user user 69.5.7.210 port 36806 [preauth]
Nov 28 07:15:31 np0005538515.novalocal sshd[19332]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:32 np0005538515.novalocal sshd[19332]: Invalid user user from 69.5.7.210 port 36812
Nov 28 07:15:32 np0005538515.novalocal sshd[19332]: Connection closed by invalid user user 69.5.7.210 port 36812 [preauth]
Nov 28 07:15:32 np0005538515.novalocal sshd[19334]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:33 np0005538515.novalocal sshd[19334]: Invalid user user from 69.5.7.210 port 36824
Nov 28 07:15:34 np0005538515.novalocal sshd[19334]: Connection closed by invalid user user 69.5.7.210 port 36824 [preauth]
Nov 28 07:15:34 np0005538515.novalocal sshd[19336]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:35 np0005538515.novalocal sshd[19336]: Invalid user user from 69.5.7.210 port 36834
Nov 28 07:15:35 np0005538515.novalocal sshd[19336]: Connection closed by invalid user user 69.5.7.210 port 36834 [preauth]
Nov 28 07:15:35 np0005538515.novalocal sshd[19338]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:37 np0005538515.novalocal sshd[19338]: Invalid user user from 69.5.7.210 port 36844
Nov 28 07:15:37 np0005538515.novalocal sshd[19338]: Connection closed by invalid user user 69.5.7.210 port 36844 [preauth]
Nov 28 07:15:37 np0005538515.novalocal sshd[19340]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:38 np0005538515.novalocal sshd[19340]: Invalid user user from 69.5.7.210 port 36856
Nov 28 07:15:38 np0005538515.novalocal sshd[19340]: Connection closed by invalid user user 69.5.7.210 port 36856 [preauth]
Nov 28 07:15:39 np0005538515.novalocal sshd[19342]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:40 np0005538515.novalocal sshd[19342]: Invalid user user from 69.5.7.210 port 37092
Nov 28 07:15:40 np0005538515.novalocal sshd[19342]: Connection closed by invalid user user 69.5.7.210 port 37092 [preauth]
Nov 28 07:15:40 np0005538515.novalocal sshd[19344]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:41 np0005538515.novalocal sshd[19344]: Invalid user user from 69.5.7.210 port 37096
Nov 28 07:15:41 np0005538515.novalocal sshd[19344]: Connection closed by invalid user user 69.5.7.210 port 37096 [preauth]
Nov 28 07:15:42 np0005538515.novalocal sshd[19346]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:43 np0005538515.novalocal sshd[19346]: Invalid user user from 69.5.7.210 port 37108
Nov 28 07:15:43 np0005538515.novalocal sshd[19346]: Connection closed by invalid user user 69.5.7.210 port 37108 [preauth]
Nov 28 07:15:43 np0005538515.novalocal sshd[19348]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:44 np0005538515.novalocal sshd[19348]: Invalid user user from 69.5.7.210 port 37120
Nov 28 07:15:45 np0005538515.novalocal sshd[19348]: Connection closed by invalid user user 69.5.7.210 port 37120 [preauth]
Nov 28 07:15:45 np0005538515.novalocal sshd[19350]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:46 np0005538515.novalocal sshd[19350]: Invalid user ubuntu from 69.5.7.210 port 37132
Nov 28 07:15:46 np0005538515.novalocal sshd[19350]: Connection closed by invalid user ubuntu 69.5.7.210 port 37132 [preauth]
Nov 28 07:15:47 np0005538515.novalocal sshd[19352]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:48 np0005538515.novalocal sshd[19352]: Invalid user ubuntu from 69.5.7.210 port 37144
Nov 28 07:15:48 np0005538515.novalocal sshd[19352]: Connection closed by invalid user ubuntu 69.5.7.210 port 37144 [preauth]
Nov 28 07:15:48 np0005538515.novalocal sshd[19355]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:49 np0005538515.novalocal sshd[19355]: Invalid user ubuntu from 69.5.7.210 port 37160
Nov 28 07:15:50 np0005538515.novalocal sshd[19355]: Connection closed by invalid user ubuntu 69.5.7.210 port 37160 [preauth]
Nov 28 07:15:50 np0005538515.novalocal sshd[19357]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:51 np0005538515.novalocal sshd[19357]: Invalid user ubuntu from 69.5.7.210 port 60568
Nov 28 07:15:51 np0005538515.novalocal sshd[19357]: Connection closed by invalid user ubuntu 69.5.7.210 port 60568 [preauth]
Nov 28 07:15:52 np0005538515.novalocal sshd[19359]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:53 np0005538515.novalocal sshd[19359]: Invalid user ubuntu from 69.5.7.210 port 60576
Nov 28 07:15:53 np0005538515.novalocal sshd[19359]: Connection closed by invalid user ubuntu 69.5.7.210 port 60576 [preauth]
Nov 28 07:15:53 np0005538515.novalocal sshd[19361]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:54 np0005538515.novalocal sshd[19361]: Invalid user ubuntu from 69.5.7.210 port 60584
Nov 28 07:15:54 np0005538515.novalocal sshd[19361]: Connection closed by invalid user ubuntu 69.5.7.210 port 60584 [preauth]
Nov 28 07:15:55 np0005538515.novalocal sshd[19363]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:56 np0005538515.novalocal sshd[19363]: Invalid user ubuntu from 69.5.7.210 port 60592
Nov 28 07:15:56 np0005538515.novalocal sshd[19363]: Connection closed by invalid user ubuntu 69.5.7.210 port 60592 [preauth]
Nov 28 07:15:56 np0005538515.novalocal sshd[19365]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:57 np0005538515.novalocal sshd[19365]: Invalid user ubuntu from 69.5.7.210 port 60600
Nov 28 07:15:58 np0005538515.novalocal sshd[19365]: Connection closed by invalid user ubuntu 69.5.7.210 port 60600 [preauth]
Nov 28 07:15:58 np0005538515.novalocal sshd[19367]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:15:59 np0005538515.novalocal sshd[19367]: Invalid user ubuntu from 69.5.7.210 port 60614
Nov 28 07:15:59 np0005538515.novalocal sshd[19367]: Connection closed by invalid user ubuntu 69.5.7.210 port 60614 [preauth]
Nov 28 07:16:00 np0005538515.novalocal sshd[19369]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:01 np0005538515.novalocal sshd[19369]: Invalid user ubuntu from 69.5.7.210 port 38598
Nov 28 07:16:01 np0005538515.novalocal sshd[19369]: Connection closed by invalid user ubuntu 69.5.7.210 port 38598 [preauth]
Nov 28 07:16:01 np0005538515.novalocal sshd[19371]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:02 np0005538515.novalocal sshd[19371]: Invalid user ubuntu from 69.5.7.210 port 38614
Nov 28 07:16:03 np0005538515.novalocal sshd[19371]: Connection closed by invalid user ubuntu 69.5.7.210 port 38614 [preauth]
Nov 28 07:16:03 np0005538515.novalocal sshd[19373]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:04 np0005538515.novalocal sshd[19373]: Invalid user ubuntu from 69.5.7.210 port 38624
Nov 28 07:16:04 np0005538515.novalocal sshd[19373]: Connection closed by invalid user ubuntu 69.5.7.210 port 38624 [preauth]
Nov 28 07:16:04 np0005538515.novalocal sshd[19375]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:05 np0005538515.novalocal sshd[19375]: Invalid user ubuntu from 69.5.7.210 port 38638
Nov 28 07:16:06 np0005538515.novalocal sshd[19375]: Connection closed by invalid user ubuntu 69.5.7.210 port 38638 [preauth]
Nov 28 07:16:06 np0005538515.novalocal sshd[19377]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:07 np0005538515.novalocal sshd[19377]: Invalid user ubuntu from 69.5.7.210 port 38646
Nov 28 07:16:07 np0005538515.novalocal sshd[19377]: Connection closed by invalid user ubuntu 69.5.7.210 port 38646 [preauth]
Nov 28 07:16:08 np0005538515.novalocal sshd[19379]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:09 np0005538515.novalocal sshd[19379]: Invalid user ubuntu from 69.5.7.210 port 38654
Nov 28 07:16:09 np0005538515.novalocal sshd[19379]: Connection closed by invalid user ubuntu 69.5.7.210 port 38654 [preauth]
Nov 28 07:16:09 np0005538515.novalocal sshd[19381]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:10 np0005538515.novalocal sshd[19381]: Invalid user ubuntu from 69.5.7.210 port 40396
Nov 28 07:16:11 np0005538515.novalocal sshd[19381]: Connection closed by invalid user ubuntu 69.5.7.210 port 40396 [preauth]
Nov 28 07:16:11 np0005538515.novalocal sshd[19383]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:12 np0005538515.novalocal sshd[19383]: Invalid user ubuntu from 69.5.7.210 port 40412
Nov 28 07:16:12 np0005538515.novalocal sshd[19383]: Connection closed by invalid user ubuntu 69.5.7.210 port 40412 [preauth]
Nov 28 07:16:12 np0005538515.novalocal sshd[19385]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:13 np0005538515.novalocal sshd[19385]: Invalid user ubuntu from 69.5.7.210 port 40416
Nov 28 07:16:14 np0005538515.novalocal sshd[19385]: Connection closed by invalid user ubuntu 69.5.7.210 port 40416 [preauth]
Nov 28 07:16:14 np0005538515.novalocal sshd[19387]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:15 np0005538515.novalocal sshd[19387]: Invalid user ubuntu from 69.5.7.210 port 40430
Nov 28 07:16:15 np0005538515.novalocal sshd[19387]: Connection closed by invalid user ubuntu 69.5.7.210 port 40430 [preauth]
Nov 28 07:16:16 np0005538515.novalocal sshd[19389]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:17 np0005538515.novalocal sshd[19389]: Invalid user ubuntu from 69.5.7.210 port 40432
Nov 28 07:16:17 np0005538515.novalocal sshd[19389]: Connection closed by invalid user ubuntu 69.5.7.210 port 40432 [preauth]
Nov 28 07:16:17 np0005538515.novalocal sshd[19391]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:18 np0005538515.novalocal sshd[19391]: Invalid user ubuntu from 69.5.7.210 port 40434
Nov 28 07:16:18 np0005538515.novalocal sshd[19391]: Connection closed by invalid user ubuntu 69.5.7.210 port 40434 [preauth]
Nov 28 07:16:19 np0005538515.novalocal sshd[19393]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:20 np0005538515.novalocal sshd[19393]: Invalid user ubuntu from 69.5.7.210 port 58244
Nov 28 07:16:20 np0005538515.novalocal sshd[19393]: Connection closed by invalid user ubuntu 69.5.7.210 port 58244 [preauth]
Nov 28 07:16:20 np0005538515.novalocal sshd[19395]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:21 np0005538515.novalocal sshd[19395]: Invalid user ubuntu from 69.5.7.210 port 58252
Nov 28 07:16:22 np0005538515.novalocal sshd[19395]: Connection closed by invalid user ubuntu 69.5.7.210 port 58252 [preauth]
Nov 28 07:16:22 np0005538515.novalocal sshd[19397]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:23 np0005538515.novalocal sshd[19397]: Invalid user ubuntu from 69.5.7.210 port 58262
Nov 28 07:16:23 np0005538515.novalocal sshd[19397]: Connection closed by invalid user ubuntu 69.5.7.210 port 58262 [preauth]
Nov 28 07:16:23 np0005538515.novalocal sshd[19399]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:24 np0005538515.novalocal sshd[19399]: Invalid user ubuntu from 69.5.7.210 port 58274
Nov 28 07:16:25 np0005538515.novalocal sshd[19399]: Connection closed by invalid user ubuntu 69.5.7.210 port 58274 [preauth]
Nov 28 07:16:25 np0005538515.novalocal sshd[19401]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:26 np0005538515.novalocal sshd[19401]: Invalid user ubuntu from 69.5.7.210 port 58278
Nov 28 07:16:26 np0005538515.novalocal sshd[19401]: Connection closed by invalid user ubuntu 69.5.7.210 port 58278 [preauth]
Nov 28 07:16:27 np0005538515.novalocal sshd[19403]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:28 np0005538515.novalocal sshd[19403]: Invalid user ubuntu from 69.5.7.210 port 58280
Nov 28 07:16:28 np0005538515.novalocal sshd[19403]: Connection closed by invalid user ubuntu 69.5.7.210 port 58280 [preauth]
Nov 28 07:16:28 np0005538515.novalocal sshd[19405]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:29 np0005538515.novalocal sshd[19405]: Invalid user ubuntu from 69.5.7.210 port 58290
Nov 28 07:16:29 np0005538515.novalocal sshd[19405]: Connection closed by invalid user ubuntu 69.5.7.210 port 58290 [preauth]
Nov 28 07:16:30 np0005538515.novalocal sshd[19407]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:31 np0005538515.novalocal sshd[19407]: Invalid user ubuntu from 69.5.7.210 port 53784
Nov 28 07:16:31 np0005538515.novalocal sshd[19407]: Connection closed by invalid user ubuntu 69.5.7.210 port 53784 [preauth]
Nov 28 07:16:31 np0005538515.novalocal sshd[19409]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:32 np0005538515.novalocal sshd[19409]: Invalid user ubuntu from 69.5.7.210 port 53792
Nov 28 07:16:33 np0005538515.novalocal sshd[19409]: Connection closed by invalid user ubuntu 69.5.7.210 port 53792 [preauth]
Nov 28 07:16:33 np0005538515.novalocal sshd[19411]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:34 np0005538515.novalocal sshd[19411]: Invalid user ubuntu from 69.5.7.210 port 53806
Nov 28 07:16:34 np0005538515.novalocal sshd[19411]: Connection closed by invalid user ubuntu 69.5.7.210 port 53806 [preauth]
Nov 28 07:16:34 np0005538515.novalocal sshd[19413]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:36 np0005538515.novalocal sshd[19413]: Invalid user ubuntu from 69.5.7.210 port 53816
Nov 28 07:16:36 np0005538515.novalocal sshd[19413]: Connection closed by invalid user ubuntu 69.5.7.210 port 53816 [preauth]
Nov 28 07:16:36 np0005538515.novalocal sshd[19415]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:37 np0005538515.novalocal sshd[19415]: Invalid user ubuntu from 69.5.7.210 port 53828
Nov 28 07:16:37 np0005538515.novalocal sshd[19415]: Connection closed by invalid user ubuntu 69.5.7.210 port 53828 [preauth]
Nov 28 07:16:38 np0005538515.novalocal sshd[19417]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:39 np0005538515.novalocal sshd[19417]: Invalid user ubuntu from 69.5.7.210 port 53838
Nov 28 07:16:39 np0005538515.novalocal sshd[19417]: Connection closed by invalid user ubuntu 69.5.7.210 port 53838 [preauth]
Nov 28 07:16:39 np0005538515.novalocal sshd[19419]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:40 np0005538515.novalocal sshd[19419]: Invalid user ubuntu from 69.5.7.210 port 37596
Nov 28 07:16:41 np0005538515.novalocal sshd[19419]: Connection closed by invalid user ubuntu 69.5.7.210 port 37596 [preauth]
Nov 28 07:16:41 np0005538515.novalocal sshd[19421]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:42 np0005538515.novalocal sshd[19421]: Invalid user ubuntu from 69.5.7.210 port 37606
Nov 28 07:16:42 np0005538515.novalocal sshd[19421]: Connection closed by invalid user ubuntu 69.5.7.210 port 37606 [preauth]
Nov 28 07:16:42 np0005538515.novalocal sshd[19423]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:43 np0005538515.novalocal sshd[19423]: Invalid user ubuntu from 69.5.7.210 port 37608
Nov 28 07:16:44 np0005538515.novalocal sshd[19423]: Connection closed by invalid user ubuntu 69.5.7.210 port 37608 [preauth]
Nov 28 07:16:44 np0005538515.novalocal sshd[19425]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:45 np0005538515.novalocal sshd[19425]: Invalid user ubuntu from 69.5.7.210 port 37624
Nov 28 07:16:45 np0005538515.novalocal sshd[19425]: Connection closed by invalid user ubuntu 69.5.7.210 port 37624 [preauth]
Nov 28 07:16:46 np0005538515.novalocal sshd[19427]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:47 np0005538515.novalocal sshd[19427]: Invalid user ubuntu from 69.5.7.210 port 37626
Nov 28 07:16:47 np0005538515.novalocal sshd[19427]: Connection closed by invalid user ubuntu 69.5.7.210 port 37626 [preauth]
Nov 28 07:16:47 np0005538515.novalocal sshd[19429]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:48 np0005538515.novalocal sshd[19429]: Invalid user ubuntu from 69.5.7.210 port 37636
Nov 28 07:16:48 np0005538515.novalocal sshd[19429]: Connection closed by invalid user ubuntu 69.5.7.210 port 37636 [preauth]
Nov 28 07:16:49 np0005538515.novalocal sshd[19431]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:50 np0005538515.novalocal sshd[19431]: Invalid user ubuntu from 69.5.7.210 port 44054
Nov 28 07:16:50 np0005538515.novalocal sshd[19431]: Connection closed by invalid user ubuntu 69.5.7.210 port 44054 [preauth]
Nov 28 07:16:50 np0005538515.novalocal sshd[19433]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:51 np0005538515.novalocal sshd[19433]: Invalid user ubuntu from 69.5.7.210 port 44056
Nov 28 07:16:52 np0005538515.novalocal sshd[19433]: Connection closed by invalid user ubuntu 69.5.7.210 port 44056 [preauth]
Nov 28 07:16:52 np0005538515.novalocal sshd[19435]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:53 np0005538515.novalocal sshd[19435]: Invalid user ubuntu from 69.5.7.210 port 44066
Nov 28 07:16:53 np0005538515.novalocal sshd[19435]: Connection closed by invalid user ubuntu 69.5.7.210 port 44066 [preauth]
Nov 28 07:16:54 np0005538515.novalocal sshd[19437]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:55 np0005538515.novalocal sshd[19437]: Invalid user ubuntu from 69.5.7.210 port 44074
Nov 28 07:16:55 np0005538515.novalocal sshd[19437]: Connection closed by invalid user ubuntu 69.5.7.210 port 44074 [preauth]
Nov 28 07:16:55 np0005538515.novalocal sshd[19439]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:57 np0005538515.novalocal sshd[19439]: Invalid user ubuntu from 69.5.7.210 port 44082
Nov 28 07:16:57 np0005538515.novalocal sshd[19439]: Connection closed by invalid user ubuntu 69.5.7.210 port 44082 [preauth]
Nov 28 07:16:57 np0005538515.novalocal sshd[19441]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:16:58 np0005538515.novalocal sshd[19441]: Invalid user ubuntu from 69.5.7.210 port 44092
Nov 28 07:16:59 np0005538515.novalocal sshd[19441]: Connection closed by invalid user ubuntu 69.5.7.210 port 44092 [preauth]
Nov 28 07:16:59 np0005538515.novalocal sshd[19443]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:00 np0005538515.novalocal sshd[19443]: Invalid user ubuntu from 69.5.7.210 port 41582
Nov 28 07:17:00 np0005538515.novalocal sshd[19443]: Connection closed by invalid user ubuntu 69.5.7.210 port 41582 [preauth]
Nov 28 07:17:01 np0005538515.novalocal sshd[19445]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:02 np0005538515.novalocal sshd[19445]: Invalid user ubuntu from 69.5.7.210 port 41596
Nov 28 07:17:02 np0005538515.novalocal sshd[19445]: Connection closed by invalid user ubuntu 69.5.7.210 port 41596 [preauth]
Nov 28 07:17:02 np0005538515.novalocal sshd[19447]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:03 np0005538515.novalocal sshd[19447]: Invalid user ubuntu from 69.5.7.210 port 41602
Nov 28 07:17:04 np0005538515.novalocal sshd[19447]: Connection closed by invalid user ubuntu 69.5.7.210 port 41602 [preauth]
Nov 28 07:17:04 np0005538515.novalocal sshd[19449]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:05 np0005538515.novalocal sshd[19449]: Invalid user ubuntu from 69.5.7.210 port 41614
Nov 28 07:17:05 np0005538515.novalocal sshd[19449]: Connection closed by invalid user ubuntu 69.5.7.210 port 41614 [preauth]
Nov 28 07:17:05 np0005538515.novalocal sshd[19451]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:06 np0005538515.novalocal sshd[19451]: Invalid user ubuntu from 69.5.7.210 port 41616
Nov 28 07:17:07 np0005538515.novalocal sshd[19451]: Connection closed by invalid user ubuntu 69.5.7.210 port 41616 [preauth]
Nov 28 07:17:07 np0005538515.novalocal sshd[19453]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:08 np0005538515.novalocal sshd[19453]: Invalid user ubuntu from 69.5.7.210 port 41624
Nov 28 07:17:08 np0005538515.novalocal sshd[19453]: Connection closed by invalid user ubuntu 69.5.7.210 port 41624 [preauth]
Nov 28 07:17:09 np0005538515.novalocal sshd[19455]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:10 np0005538515.novalocal sshd[19455]: Invalid user ubuntu from 69.5.7.210 port 41638
Nov 28 07:17:10 np0005538515.novalocal sshd[19455]: Connection closed by invalid user ubuntu 69.5.7.210 port 41638 [preauth]
Nov 28 07:17:10 np0005538515.novalocal sshd[19457]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:11 np0005538515.novalocal sshd[19457]: Invalid user ubuntu from 69.5.7.210 port 60414
Nov 28 07:17:11 np0005538515.novalocal sshd[19457]: Connection closed by invalid user ubuntu 69.5.7.210 port 60414 [preauth]
Nov 28 07:17:12 np0005538515.novalocal sshd[19459]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:13 np0005538515.novalocal sshd[19459]: Invalid user ubuntu from 69.5.7.210 port 60422
Nov 28 07:17:13 np0005538515.novalocal sshd[19459]: Connection closed by invalid user ubuntu 69.5.7.210 port 60422 [preauth]
Nov 28 07:17:13 np0005538515.novalocal sshd[19461]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:14 np0005538515.novalocal sshd[19461]: Invalid user ubuntu from 69.5.7.210 port 60432
Nov 28 07:17:15 np0005538515.novalocal sshd[19461]: Connection closed by invalid user ubuntu 69.5.7.210 port 60432 [preauth]
Nov 28 07:17:15 np0005538515.novalocal sshd[19463]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:16 np0005538515.novalocal sshd[19463]: Invalid user ubuntu from 69.5.7.210 port 60446
Nov 28 07:17:16 np0005538515.novalocal sshd[19463]: Connection closed by invalid user ubuntu 69.5.7.210 port 60446 [preauth]
Nov 28 07:17:16 np0005538515.novalocal sshd[19465]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:17 np0005538515.novalocal sshd[19465]: Invalid user ubuntu from 69.5.7.210 port 60448
Nov 28 07:17:18 np0005538515.novalocal sshd[19465]: Connection closed by invalid user ubuntu 69.5.7.210 port 60448 [preauth]
Nov 28 07:17:18 np0005538515.novalocal sshd[19467]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:19 np0005538515.novalocal sshd[19467]: Invalid user ubuntu from 69.5.7.210 port 60464
Nov 28 07:17:19 np0005538515.novalocal sshd[19467]: Connection closed by invalid user ubuntu 69.5.7.210 port 60464 [preauth]
Nov 28 07:17:20 np0005538515.novalocal sshd[19469]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:21 np0005538515.novalocal sshd[19469]: Invalid user ubuntu from 69.5.7.210 port 44826
Nov 28 07:17:21 np0005538515.novalocal sshd[19469]: Connection closed by invalid user ubuntu 69.5.7.210 port 44826 [preauth]
Nov 28 07:17:21 np0005538515.novalocal sshd[19471]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:22 np0005538515.novalocal sshd[19471]: Invalid user ubuntu from 69.5.7.210 port 44834
Nov 28 07:17:23 np0005538515.novalocal sshd[19471]: Connection closed by invalid user ubuntu 69.5.7.210 port 44834 [preauth]
Nov 28 07:17:23 np0005538515.novalocal sshd[19473]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:24 np0005538515.novalocal sshd[19473]: Invalid user ubuntu from 69.5.7.210 port 44842
Nov 28 07:17:24 np0005538515.novalocal sshd[19473]: Connection closed by invalid user ubuntu 69.5.7.210 port 44842 [preauth]
Nov 28 07:17:24 np0005538515.novalocal sshd[19475]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:26 np0005538515.novalocal sshd[19475]: Invalid user ubuntu from 69.5.7.210 port 44858
Nov 28 07:17:26 np0005538515.novalocal sshd[19475]: Connection closed by invalid user ubuntu 69.5.7.210 port 44858 [preauth]
Nov 28 07:17:26 np0005538515.novalocal sshd[19477]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:27 np0005538515.novalocal sshd[19477]: Invalid user ubuntu from 69.5.7.210 port 44874
Nov 28 07:17:27 np0005538515.novalocal sshd[19477]: Connection closed by invalid user ubuntu 69.5.7.210 port 44874 [preauth]
Nov 28 07:17:28 np0005538515.novalocal sshd[19479]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:29 np0005538515.novalocal sshd[19479]: Invalid user ubuntu from 69.5.7.210 port 44888
Nov 28 07:17:29 np0005538515.novalocal sshd[19479]: Connection closed by invalid user ubuntu 69.5.7.210 port 44888 [preauth]
Nov 28 07:17:29 np0005538515.novalocal sshd[19481]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:30 np0005538515.novalocal sshd[19481]: Invalid user ubuntu from 69.5.7.210 port 55584
Nov 28 07:17:31 np0005538515.novalocal sshd[19481]: Connection closed by invalid user ubuntu 69.5.7.210 port 55584 [preauth]
Nov 28 07:17:31 np0005538515.novalocal sshd[19483]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:32 np0005538515.novalocal sshd[19483]: Invalid user ubuntu from 69.5.7.210 port 55592
Nov 28 07:17:32 np0005538515.novalocal sshd[19483]: Connection closed by invalid user ubuntu 69.5.7.210 port 55592 [preauth]
Nov 28 07:17:32 np0005538515.novalocal sshd[19485]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:33 np0005538515.novalocal sshd[19485]: Invalid user ubuntu from 69.5.7.210 port 55602
Nov 28 07:17:34 np0005538515.novalocal sshd[19485]: Connection closed by invalid user ubuntu 69.5.7.210 port 55602 [preauth]
Nov 28 07:17:34 np0005538515.novalocal sshd[19487]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:35 np0005538515.novalocal sshd[19487]: Invalid user ubuntu from 69.5.7.210 port 55612
Nov 28 07:17:35 np0005538515.novalocal sshd[19487]: Connection closed by invalid user ubuntu 69.5.7.210 port 55612 [preauth]
Nov 28 07:17:36 np0005538515.novalocal sshd[19489]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:37 np0005538515.novalocal sshd[19489]: Invalid user ubuntu from 69.5.7.210 port 55624
Nov 28 07:17:37 np0005538515.novalocal sshd[19489]: Connection closed by invalid user ubuntu 69.5.7.210 port 55624 [preauth]
Nov 28 07:17:37 np0005538515.novalocal sshd[19491]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:38 np0005538515.novalocal sshd[19491]: Invalid user ubuntu from 69.5.7.210 port 55628
Nov 28 07:17:38 np0005538515.novalocal sshd[19491]: Connection closed by invalid user ubuntu 69.5.7.210 port 55628 [preauth]
Nov 28 07:17:39 np0005538515.novalocal sshd[19493]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:40 np0005538515.novalocal sshd[19493]: Invalid user ubuntu from 69.5.7.210 port 60198
Nov 28 07:17:40 np0005538515.novalocal sshd[19493]: Connection closed by invalid user ubuntu 69.5.7.210 port 60198 [preauth]
Nov 28 07:17:40 np0005538515.novalocal sshd[19495]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:41 np0005538515.novalocal sshd[19495]: Invalid user ubuntu from 69.5.7.210 port 60206
Nov 28 07:17:42 np0005538515.novalocal sshd[19495]: Connection closed by invalid user ubuntu 69.5.7.210 port 60206 [preauth]
Nov 28 07:17:42 np0005538515.novalocal sshd[19497]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:43 np0005538515.novalocal sshd[19497]: Invalid user ubuntu from 69.5.7.210 port 60212
Nov 28 07:17:43 np0005538515.novalocal sshd[19497]: Connection closed by invalid user ubuntu 69.5.7.210 port 60212 [preauth]
Nov 28 07:17:43 np0005538515.novalocal sshd[19499]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:45 np0005538515.novalocal sshd[19499]: Invalid user ubuntu from 69.5.7.210 port 60218
Nov 28 07:17:45 np0005538515.novalocal sshd[19499]: Connection closed by invalid user ubuntu 69.5.7.210 port 60218 [preauth]
Nov 28 07:17:45 np0005538515.novalocal sshd[19501]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:46 np0005538515.novalocal sshd[19501]: Invalid user ubuntu from 69.5.7.210 port 60232
Nov 28 07:17:46 np0005538515.novalocal sshd[19501]: Connection closed by invalid user ubuntu 69.5.7.210 port 60232 [preauth]
Nov 28 07:17:47 np0005538515.novalocal sshd[19503]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:48 np0005538515.novalocal sshd[19503]: Invalid user ubuntu from 69.5.7.210 port 60240
Nov 28 07:17:48 np0005538515.novalocal sshd[19503]: Connection closed by invalid user ubuntu 69.5.7.210 port 60240 [preauth]
Nov 28 07:17:48 np0005538515.novalocal sshd[19505]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:49 np0005538515.novalocal sshd[19505]: Invalid user ubuntu from 69.5.7.210 port 60256
Nov 28 07:17:49 np0005538515.novalocal sshd[19505]: Connection closed by invalid user ubuntu 69.5.7.210 port 60256 [preauth]
Nov 28 07:17:50 np0005538515.novalocal sshd[19507]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:51 np0005538515.novalocal sshd[19507]: Invalid user ubuntu from 69.5.7.210 port 52240
Nov 28 07:17:51 np0005538515.novalocal sshd[19507]: Connection closed by invalid user ubuntu 69.5.7.210 port 52240 [preauth]
Nov 28 07:17:51 np0005538515.novalocal sshd[19509]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:52 np0005538515.novalocal sshd[19509]: Invalid user ubuntu from 69.5.7.210 port 52250
Nov 28 07:17:53 np0005538515.novalocal sshd[19509]: Connection closed by invalid user ubuntu 69.5.7.210 port 52250 [preauth]
Nov 28 07:17:53 np0005538515.novalocal sshd[19511]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:54 np0005538515.novalocal sshd[19511]: Invalid user ubuntu from 69.5.7.210 port 52258
Nov 28 07:17:54 np0005538515.novalocal sshd[19511]: Connection closed by invalid user ubuntu 69.5.7.210 port 52258 [preauth]
Nov 28 07:17:54 np0005538515.novalocal sshd[19513]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:55 np0005538515.novalocal sshd[19513]: Invalid user ubuntu from 69.5.7.210 port 52260
Nov 28 07:17:56 np0005538515.novalocal sshd[19513]: Connection closed by invalid user ubuntu 69.5.7.210 port 52260 [preauth]
Nov 28 07:17:56 np0005538515.novalocal sshd[19515]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:57 np0005538515.novalocal sshd[19515]: Invalid user debian from 69.5.7.210 port 52264
Nov 28 07:17:57 np0005538515.novalocal sshd[19515]: Connection closed by invalid user debian 69.5.7.210 port 52264 [preauth]
Nov 28 07:17:58 np0005538515.novalocal sshd[19517]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:17:59 np0005538515.novalocal sshd[19517]: Invalid user debian from 69.5.7.210 port 52268
Nov 28 07:17:59 np0005538515.novalocal sshd[19517]: Connection closed by invalid user debian 69.5.7.210 port 52268 [preauth]
Nov 28 07:17:59 np0005538515.novalocal sshd[19519]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:01 np0005538515.novalocal sshd[19519]: Invalid user debian from 69.5.7.210 port 38856
Nov 28 07:18:01 np0005538515.novalocal sshd[19519]: Connection closed by invalid user debian 69.5.7.210 port 38856 [preauth]
Nov 28 07:18:01 np0005538515.novalocal sshd[19521]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:02 np0005538515.novalocal sshd[19521]: Invalid user debian from 69.5.7.210 port 38858
Nov 28 07:18:03 np0005538515.novalocal sshd[19521]: Connection closed by invalid user debian 69.5.7.210 port 38858 [preauth]
Nov 28 07:18:03 np0005538515.novalocal sshd[19523]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:04 np0005538515.novalocal sshd[19523]: Invalid user debian from 69.5.7.210 port 38864
Nov 28 07:18:04 np0005538515.novalocal sshd[19523]: Connection closed by invalid user debian 69.5.7.210 port 38864 [preauth]
Nov 28 07:18:04 np0005538515.novalocal sshd[19525]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:05 np0005538515.novalocal sshd[19525]: Invalid user debian from 69.5.7.210 port 38870
Nov 28 07:18:06 np0005538515.novalocal sshd[19525]: Connection closed by invalid user debian 69.5.7.210 port 38870 [preauth]
Nov 28 07:18:06 np0005538515.novalocal sshd[19527]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:07 np0005538515.novalocal sshd[19527]: Invalid user debian from 69.5.7.210 port 38872
Nov 28 07:18:07 np0005538515.novalocal sshd[19527]: Connection closed by invalid user debian 69.5.7.210 port 38872 [preauth]
Nov 28 07:18:08 np0005538515.novalocal sshd[19529]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:09 np0005538515.novalocal sshd[19529]: Invalid user debian from 69.5.7.210 port 38888
Nov 28 07:18:09 np0005538515.novalocal sshd[19529]: Connection closed by invalid user debian 69.5.7.210 port 38888 [preauth]
Nov 28 07:18:09 np0005538515.novalocal sshd[19531]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:10 np0005538515.novalocal sshd[19531]: Invalid user debian from 69.5.7.210 port 56372
Nov 28 07:18:11 np0005538515.novalocal sshd[19531]: Connection closed by invalid user debian 69.5.7.210 port 56372 [preauth]
Nov 28 07:18:11 np0005538515.novalocal sshd[19533]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:12 np0005538515.novalocal sshd[19533]: Invalid user debian from 69.5.7.210 port 56380
Nov 28 07:18:12 np0005538515.novalocal sshd[19533]: Connection closed by invalid user debian 69.5.7.210 port 56380 [preauth]
Nov 28 07:18:12 np0005538515.novalocal sshd[19535]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:13 np0005538515.novalocal sshd[19535]: Invalid user debian from 69.5.7.210 port 56392
Nov 28 07:18:14 np0005538515.novalocal sshd[19535]: Connection closed by invalid user debian 69.5.7.210 port 56392 [preauth]
Nov 28 07:18:14 np0005538515.novalocal sshd[19537]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:15 np0005538515.novalocal sshd[19537]: Invalid user debian from 69.5.7.210 port 56408
Nov 28 07:18:15 np0005538515.novalocal sshd[19537]: Connection closed by invalid user debian 69.5.7.210 port 56408 [preauth]
Nov 28 07:18:16 np0005538515.novalocal sshd[19539]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:17 np0005538515.novalocal sshd[19539]: Invalid user debian from 69.5.7.210 port 56412
Nov 28 07:18:17 np0005538515.novalocal sshd[19539]: Connection closed by invalid user debian 69.5.7.210 port 56412 [preauth]
Nov 28 07:18:17 np0005538515.novalocal sshd[19541]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:18 np0005538515.novalocal sshd[19541]: Invalid user debian from 69.5.7.210 port 56426
Nov 28 07:18:18 np0005538515.novalocal sshd[19541]: Connection closed by invalid user debian 69.5.7.210 port 56426 [preauth]
Nov 28 07:18:19 np0005538515.novalocal sshd[19543]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:20 np0005538515.novalocal sshd[19543]: Invalid user debian from 69.5.7.210 port 54310
Nov 28 07:18:20 np0005538515.novalocal sshd[19543]: Connection closed by invalid user debian 69.5.7.210 port 54310 [preauth]
Nov 28 07:18:20 np0005538515.novalocal sshd[19545]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:21 np0005538515.novalocal sshd[19545]: Invalid user debian from 69.5.7.210 port 54314
Nov 28 07:18:22 np0005538515.novalocal sshd[19545]: Connection closed by invalid user debian 69.5.7.210 port 54314 [preauth]
Nov 28 07:18:22 np0005538515.novalocal sshd[19547]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:23 np0005538515.novalocal sshd[19547]: Invalid user debian from 69.5.7.210 port 54316
Nov 28 07:18:23 np0005538515.novalocal sshd[19547]: Connection closed by invalid user debian 69.5.7.210 port 54316 [preauth]
Nov 28 07:18:24 np0005538515.novalocal sshd[19549]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:25 np0005538515.novalocal sshd[19549]: Invalid user debian from 69.5.7.210 port 54318
Nov 28 07:18:25 np0005538515.novalocal sshd[19549]: Connection closed by invalid user debian 69.5.7.210 port 54318 [preauth]
Nov 28 07:18:25 np0005538515.novalocal sshd[19551]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:26 np0005538515.novalocal sshd[19551]: Invalid user debian from 69.5.7.210 port 54330
Nov 28 07:18:26 np0005538515.novalocal sshd[19551]: Connection closed by invalid user debian 69.5.7.210 port 54330 [preauth]
Nov 28 07:18:27 np0005538515.novalocal sshd[19553]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:28 np0005538515.novalocal sshd[19553]: Invalid user debian from 69.5.7.210 port 54332
Nov 28 07:18:28 np0005538515.novalocal sshd[19553]: Connection closed by invalid user debian 69.5.7.210 port 54332 [preauth]
Nov 28 07:18:28 np0005538515.novalocal sshd[19555]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:29 np0005538515.novalocal sshd[19555]: Invalid user debian from 69.5.7.210 port 54342
Nov 28 07:18:30 np0005538515.novalocal sshd[19555]: Connection closed by invalid user debian 69.5.7.210 port 54342 [preauth]
Nov 28 07:18:30 np0005538515.novalocal sshd[19557]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:31 np0005538515.novalocal sshd[19557]: Invalid user debian from 69.5.7.210 port 34860
Nov 28 07:18:31 np0005538515.novalocal sshd[19557]: Connection closed by invalid user debian 69.5.7.210 port 34860 [preauth]
Nov 28 07:18:31 np0005538515.novalocal sshd[19559]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:32 np0005538515.novalocal sshd[19559]: Invalid user debian from 69.5.7.210 port 34876
Nov 28 07:18:33 np0005538515.novalocal sshd[19559]: Connection closed by invalid user debian 69.5.7.210 port 34876 [preauth]
Nov 28 07:18:33 np0005538515.novalocal sshd[19561]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:34 np0005538515.novalocal sshd[19561]: Invalid user debian from 69.5.7.210 port 34886
Nov 28 07:18:34 np0005538515.novalocal sshd[19561]: Connection closed by invalid user debian 69.5.7.210 port 34886 [preauth]
Nov 28 07:18:35 np0005538515.novalocal sshd[19563]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:36 np0005538515.novalocal sshd[19563]: Invalid user debian from 69.5.7.210 port 34900
Nov 28 07:18:36 np0005538515.novalocal sshd[19563]: Connection closed by invalid user debian 69.5.7.210 port 34900 [preauth]
Nov 28 07:18:36 np0005538515.novalocal sshd[19565]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:37 np0005538515.novalocal sshd[19565]: Invalid user debian from 69.5.7.210 port 34908
Nov 28 07:18:37 np0005538515.novalocal sshd[19565]: Connection closed by invalid user debian 69.5.7.210 port 34908 [preauth]
Nov 28 07:18:38 np0005538515.novalocal sshd[19567]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:39 np0005538515.novalocal sshd[19567]: Invalid user debian from 69.5.7.210 port 34920
Nov 28 07:18:39 np0005538515.novalocal sshd[19567]: Connection closed by invalid user debian 69.5.7.210 port 34920 [preauth]
Nov 28 07:18:39 np0005538515.novalocal sshd[19569]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:40 np0005538515.novalocal sshd[19569]: Invalid user debian from 69.5.7.210 port 46592
Nov 28 07:18:41 np0005538515.novalocal sshd[19569]: Connection closed by invalid user debian 69.5.7.210 port 46592 [preauth]
Nov 28 07:18:41 np0005538515.novalocal sshd[19571]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:42 np0005538515.novalocal sshd[19571]: Invalid user debian from 69.5.7.210 port 46596
Nov 28 07:18:42 np0005538515.novalocal sshd[19571]: Connection closed by invalid user debian 69.5.7.210 port 46596 [preauth]
Nov 28 07:18:43 np0005538515.novalocal sshd[19573]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:44 np0005538515.novalocal sshd[19573]: Invalid user debian from 69.5.7.210 port 46598
Nov 28 07:18:44 np0005538515.novalocal sshd[19573]: Connection closed by invalid user debian 69.5.7.210 port 46598 [preauth]
Nov 28 07:18:44 np0005538515.novalocal sshd[19575]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:45 np0005538515.novalocal sshd[19575]: Invalid user debian from 69.5.7.210 port 46600
Nov 28 07:18:45 np0005538515.novalocal sshd[19575]: Connection closed by invalid user debian 69.5.7.210 port 46600 [preauth]
Nov 28 07:18:46 np0005538515.novalocal sshd[19577]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:47 np0005538515.novalocal sshd[19577]: Invalid user debian from 69.5.7.210 port 46610
Nov 28 07:18:47 np0005538515.novalocal sshd[19577]: Connection closed by invalid user debian 69.5.7.210 port 46610 [preauth]
Nov 28 07:18:47 np0005538515.novalocal sshd[19579]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:48 np0005538515.novalocal sshd[19579]: Invalid user debian from 69.5.7.210 port 46612
Nov 28 07:18:49 np0005538515.novalocal sshd[19579]: Connection closed by invalid user debian 69.5.7.210 port 46612 [preauth]
Nov 28 07:18:49 np0005538515.novalocal sshd[19581]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:50 np0005538515.novalocal sshd[19581]: Invalid user debian from 69.5.7.210 port 57614
Nov 28 07:18:50 np0005538515.novalocal sshd[19581]: Connection closed by invalid user debian 69.5.7.210 port 57614 [preauth]
Nov 28 07:18:51 np0005538515.novalocal sshd[19583]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:52 np0005538515.novalocal sshd[19583]: Invalid user debian from 69.5.7.210 port 57618
Nov 28 07:18:52 np0005538515.novalocal sshd[19583]: Connection closed by invalid user debian 69.5.7.210 port 57618 [preauth]
Nov 28 07:18:52 np0005538515.novalocal sshd[19585]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:53 np0005538515.novalocal sshd[19585]: Invalid user debian from 69.5.7.210 port 57632
Nov 28 07:18:54 np0005538515.novalocal sshd[19585]: Connection closed by invalid user debian 69.5.7.210 port 57632 [preauth]
Nov 28 07:18:54 np0005538515.novalocal sshd[19587]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:55 np0005538515.novalocal sshd[19587]: Invalid user debian from 69.5.7.210 port 57634
Nov 28 07:18:55 np0005538515.novalocal sshd[19587]: Connection closed by invalid user debian 69.5.7.210 port 57634 [preauth]
Nov 28 07:18:55 np0005538515.novalocal sshd[19589]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:56 np0005538515.novalocal sshd[19589]: Invalid user debian from 69.5.7.210 port 57644
Nov 28 07:18:57 np0005538515.novalocal sshd[19589]: Connection closed by invalid user debian 69.5.7.210 port 57644 [preauth]
Nov 28 07:18:57 np0005538515.novalocal sshd[19591]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:18:58 np0005538515.novalocal sshd[19591]: Invalid user debian from 69.5.7.210 port 57660
Nov 28 07:18:58 np0005538515.novalocal sshd[19591]: Connection closed by invalid user debian 69.5.7.210 port 57660 [preauth]
Nov 28 07:18:59 np0005538515.novalocal sshd[19593]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:00 np0005538515.novalocal sshd[19593]: Invalid user debian from 69.5.7.210 port 57672
Nov 28 07:19:00 np0005538515.novalocal sshd[19593]: Connection closed by invalid user debian 69.5.7.210 port 57672 [preauth]
Nov 28 07:19:00 np0005538515.novalocal sshd[19595]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:01 np0005538515.novalocal sshd[19595]: Invalid user debian from 69.5.7.210 port 43330
Nov 28 07:19:01 np0005538515.novalocal sshd[19595]: Connection closed by invalid user debian 69.5.7.210 port 43330 [preauth]
Nov 28 07:19:02 np0005538515.novalocal sshd[19597]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:03 np0005538515.novalocal sshd[19597]: Invalid user debian from 69.5.7.210 port 43340
Nov 28 07:19:03 np0005538515.novalocal sshd[19597]: Connection closed by invalid user debian 69.5.7.210 port 43340 [preauth]
Nov 28 07:19:03 np0005538515.novalocal sshd[19599]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:04 np0005538515.novalocal sshd[19599]: Invalid user debian from 69.5.7.210 port 43352
Nov 28 07:19:05 np0005538515.novalocal sshd[19599]: Connection closed by invalid user debian 69.5.7.210 port 43352 [preauth]
Nov 28 07:19:05 np0005538515.novalocal sshd[19601]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:06 np0005538515.novalocal sshd[19601]: Invalid user debian from 69.5.7.210 port 43354
Nov 28 07:19:06 np0005538515.novalocal sshd[19601]: Connection closed by invalid user debian 69.5.7.210 port 43354 [preauth]
Nov 28 07:19:06 np0005538515.novalocal sshd[19603]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:08 np0005538515.novalocal sshd[19603]: Invalid user debian from 69.5.7.210 port 43362
Nov 28 07:19:08 np0005538515.novalocal sshd[19603]: Connection closed by invalid user debian 69.5.7.210 port 43362 [preauth]
Nov 28 07:19:08 np0005538515.novalocal sshd[19605]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:09 np0005538515.novalocal sshd[19605]: Invalid user debian from 69.5.7.210 port 43366
Nov 28 07:19:09 np0005538515.novalocal sshd[19605]: Connection closed by invalid user debian 69.5.7.210 port 43366 [preauth]
Nov 28 07:19:10 np0005538515.novalocal sshd[19607]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:11 np0005538515.novalocal sshd[19607]: Invalid user debian from 69.5.7.210 port 55786
Nov 28 07:19:11 np0005538515.novalocal sshd[19607]: Connection closed by invalid user debian 69.5.7.210 port 55786 [preauth]
Nov 28 07:19:11 np0005538515.novalocal sshd[19609]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:12 np0005538515.novalocal sshd[19609]: Invalid user debian from 69.5.7.210 port 55788
Nov 28 07:19:13 np0005538515.novalocal sshd[19609]: Connection closed by invalid user debian 69.5.7.210 port 55788 [preauth]
Nov 28 07:19:13 np0005538515.novalocal sshd[19611]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:14 np0005538515.novalocal sshd[19611]: Invalid user debian from 69.5.7.210 port 55802
Nov 28 07:19:14 np0005538515.novalocal sshd[19611]: Connection closed by invalid user debian 69.5.7.210 port 55802 [preauth]
Nov 28 07:19:14 np0005538515.novalocal sshd[19613]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:15 np0005538515.novalocal sshd[19613]: Invalid user debian from 69.5.7.210 port 55812
Nov 28 07:19:16 np0005538515.novalocal sshd[19613]: Connection closed by invalid user debian 69.5.7.210 port 55812 [preauth]
Nov 28 07:19:16 np0005538515.novalocal sshd[19615]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:17 np0005538515.novalocal sshd[19615]: Invalid user debian from 69.5.7.210 port 55822
Nov 28 07:19:17 np0005538515.novalocal sshd[19615]: Connection closed by invalid user debian 69.5.7.210 port 55822 [preauth]
Nov 28 07:19:18 np0005538515.novalocal sshd[19617]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:19 np0005538515.novalocal sshd[19617]: Invalid user debian from 69.5.7.210 port 55836
Nov 28 07:19:19 np0005538515.novalocal sshd[19617]: Connection closed by invalid user debian 69.5.7.210 port 55836 [preauth]
Nov 28 07:19:19 np0005538515.novalocal sshd[19619]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:20 np0005538515.novalocal sshd[19619]: Invalid user debian from 69.5.7.210 port 52058
Nov 28 07:19:21 np0005538515.novalocal sshd[19619]: Connection closed by invalid user debian 69.5.7.210 port 52058 [preauth]
Nov 28 07:19:21 np0005538515.novalocal sshd[19621]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:22 np0005538515.novalocal sshd[19621]: Invalid user debian from 69.5.7.210 port 52074
Nov 28 07:19:22 np0005538515.novalocal sshd[19621]: Connection closed by invalid user debian 69.5.7.210 port 52074 [preauth]
Nov 28 07:19:22 np0005538515.novalocal sshd[19623]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:24 np0005538515.novalocal sshd[19623]: Invalid user debian from 69.5.7.210 port 52080
Nov 28 07:19:24 np0005538515.novalocal sshd[19623]: Connection closed by invalid user debian 69.5.7.210 port 52080 [preauth]
Nov 28 07:19:24 np0005538515.novalocal sshd[19625]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:25 np0005538515.novalocal sshd[19625]: Invalid user debian from 69.5.7.210 port 52096
Nov 28 07:19:25 np0005538515.novalocal sshd[19625]: Connection closed by invalid user debian 69.5.7.210 port 52096 [preauth]
Nov 28 07:19:26 np0005538515.novalocal sshd[19627]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:27 np0005538515.novalocal sshd[19627]: Invalid user debian from 69.5.7.210 port 52104
Nov 28 07:19:27 np0005538515.novalocal sshd[19627]: Connection closed by invalid user debian 69.5.7.210 port 52104 [preauth]
Nov 28 07:19:27 np0005538515.novalocal sshd[19629]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:28 np0005538515.novalocal sshd[19629]: Invalid user debian from 69.5.7.210 port 52116
Nov 28 07:19:28 np0005538515.novalocal sshd[19629]: Connection closed by invalid user debian 69.5.7.210 port 52116 [preauth]
Nov 28 07:19:29 np0005538515.novalocal sshd[19631]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:30 np0005538515.novalocal sshd[19631]: Invalid user debian from 69.5.7.210 port 52184
Nov 28 07:19:30 np0005538515.novalocal sshd[19631]: Connection closed by invalid user debian 69.5.7.210 port 52184 [preauth]
Nov 28 07:19:30 np0005538515.novalocal sshd[19633]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:31 np0005538515.novalocal sshd[19633]: Invalid user debian from 69.5.7.210 port 52200
Nov 28 07:19:32 np0005538515.novalocal sshd[19633]: Connection closed by invalid user debian 69.5.7.210 port 52200 [preauth]
Nov 28 07:19:32 np0005538515.novalocal sshd[19635]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:33 np0005538515.novalocal sshd[19635]: Invalid user debian from 69.5.7.210 port 52210
Nov 28 07:19:33 np0005538515.novalocal sshd[19635]: Connection closed by invalid user debian 69.5.7.210 port 52210 [preauth]
Nov 28 07:19:33 np0005538515.novalocal sshd[19637]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:34 np0005538515.novalocal sshd[19637]: Invalid user debian from 69.5.7.210 port 52224
Nov 28 07:19:35 np0005538515.novalocal sshd[19637]: Connection closed by invalid user debian 69.5.7.210 port 52224 [preauth]
Nov 28 07:19:35 np0005538515.novalocal sshd[19639]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:36 np0005538515.novalocal sshd[19639]: Invalid user debian from 69.5.7.210 port 52238
Nov 28 07:19:36 np0005538515.novalocal sshd[19639]: Connection closed by invalid user debian 69.5.7.210 port 52238 [preauth]
Nov 28 07:19:37 np0005538515.novalocal sshd[19641]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:38 np0005538515.novalocal sshd[19641]: Invalid user debian from 69.5.7.210 port 52258
Nov 28 07:19:38 np0005538515.novalocal sshd[19641]: Connection closed by invalid user debian 69.5.7.210 port 52258 [preauth]
Nov 28 07:19:38 np0005538515.novalocal sshd[19643]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:39 np0005538515.novalocal sshd[19646]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:39 np0005538515.novalocal sshd[19646]: Accepted publickey for zuul from 38.102.83.114 port 42984 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:19:39 np0005538515.novalocal systemd-logind[763]: New session 10 of user zuul.
Nov 28 07:19:39 np0005538515.novalocal systemd[1]: Started Session 10 of User zuul.
Nov 28 07:19:39 np0005538515.novalocal sshd[19646]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:19:39 np0005538515.novalocal sshd[19643]: Invalid user debian from 69.5.7.210 port 52266
Nov 28 07:19:40 np0005538515.novalocal python3[19663]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:19:40 np0005538515.novalocal sshd[19643]: Connection closed by invalid user debian 69.5.7.210 port 52266 [preauth]
Nov 28 07:19:40 np0005538515.novalocal sshd[19667]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:41 np0005538515.novalocal sshd[19667]: Invalid user debian from 69.5.7.210 port 36042
Nov 28 07:19:41 np0005538515.novalocal sshd[19667]: Connection closed by invalid user debian 69.5.7.210 port 36042 [preauth]
Nov 28 07:19:42 np0005538515.novalocal sshd[19670]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:43 np0005538515.novalocal sshd[19670]: Invalid user debian from 69.5.7.210 port 36046
Nov 28 07:19:43 np0005538515.novalocal sudo[19685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aphradaanwsstynnggnuzomdkbooevou ; /usr/bin/python3
Nov 28 07:19:43 np0005538515.novalocal sudo[19685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:19:43 np0005538515.novalocal python3[19687]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:19:43 np0005538515.novalocal sshd[19670]: Connection closed by invalid user debian 69.5.7.210 port 36046 [preauth]
Nov 28 07:19:43 np0005538515.novalocal sshd[19690]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:44 np0005538515.novalocal sshd[19690]: Invalid user debian from 69.5.7.210 port 36062
Nov 28 07:19:44 np0005538515.novalocal sshd[19690]: Connection closed by invalid user debian 69.5.7.210 port 36062 [preauth]
Nov 28 07:19:45 np0005538515.novalocal sshd[19692]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:45 np0005538515.novalocal sudo[19685]: pam_unix(sudo:session): session closed for user root
Nov 28 07:19:46 np0005538515.novalocal sshd[19692]: Invalid user debian from 69.5.7.210 port 36074
Nov 28 07:19:46 np0005538515.novalocal sshd[19692]: Connection closed by invalid user debian 69.5.7.210 port 36074 [preauth]
Nov 28 07:19:46 np0005538515.novalocal sshd[19695]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:47 np0005538515.novalocal sshd[19695]: Invalid user debian from 69.5.7.210 port 36080
Nov 28 07:19:48 np0005538515.novalocal sshd[19695]: Connection closed by invalid user debian 69.5.7.210 port 36080 [preauth]
Nov 28 07:19:48 np0005538515.novalocal sshd[19697]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:49 np0005538515.novalocal sshd[19697]: Invalid user debian from 69.5.7.210 port 36096
Nov 28 07:19:49 np0005538515.novalocal sshd[19697]: Connection closed by invalid user debian 69.5.7.210 port 36096 [preauth]
Nov 28 07:19:49 np0005538515.novalocal sshd[19699]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:51 np0005538515.novalocal sshd[19699]: Invalid user debian from 69.5.7.210 port 57768
Nov 28 07:19:51 np0005538515.novalocal sshd[19699]: Connection closed by invalid user debian 69.5.7.210 port 57768 [preauth]
Nov 28 07:19:51 np0005538515.novalocal sshd[19701]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:52 np0005538515.novalocal sshd[19701]: Invalid user debian from 69.5.7.210 port 57774
Nov 28 07:19:52 np0005538515.novalocal sshd[19701]: Connection closed by invalid user debian 69.5.7.210 port 57774 [preauth]
Nov 28 07:19:53 np0005538515.novalocal sshd[19703]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:54 np0005538515.novalocal sshd[19703]: Invalid user debian from 69.5.7.210 port 57788
Nov 28 07:19:54 np0005538515.novalocal sshd[19703]: Connection closed by invalid user debian 69.5.7.210 port 57788 [preauth]
Nov 28 07:19:54 np0005538515.novalocal sshd[19705]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:55 np0005538515.novalocal sshd[19705]: Invalid user debian from 69.5.7.210 port 57790
Nov 28 07:19:56 np0005538515.novalocal sshd[19705]: Connection closed by invalid user debian 69.5.7.210 port 57790 [preauth]
Nov 28 07:19:56 np0005538515.novalocal sshd[19707]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:57 np0005538515.novalocal sshd[19707]: Invalid user debian from 69.5.7.210 port 57806
Nov 28 07:19:57 np0005538515.novalocal sshd[19707]: Connection closed by invalid user debian 69.5.7.210 port 57806 [preauth]
Nov 28 07:19:58 np0005538515.novalocal sshd[19709]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:19:59 np0005538515.novalocal sshd[19709]: Invalid user debian from 69.5.7.210 port 57812
Nov 28 07:19:59 np0005538515.novalocal sshd[19709]: Connection closed by invalid user debian 69.5.7.210 port 57812 [preauth]
Nov 28 07:19:59 np0005538515.novalocal sshd[19711]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:00 np0005538515.novalocal sshd[19711]: Invalid user debian from 69.5.7.210 port 42056
Nov 28 07:20:01 np0005538515.novalocal sshd[19711]: Connection closed by invalid user debian 69.5.7.210 port 42056 [preauth]
Nov 28 07:20:01 np0005538515.novalocal sshd[19713]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:02 np0005538515.novalocal sshd[19713]: Invalid user debian from 69.5.7.210 port 42066
Nov 28 07:20:02 np0005538515.novalocal sshd[19713]: Connection closed by invalid user debian 69.5.7.210 port 42066 [preauth]
Nov 28 07:20:03 np0005538515.novalocal sshd[19715]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:04 np0005538515.novalocal sshd[19715]: Invalid user debian from 69.5.7.210 port 42082
Nov 28 07:20:04 np0005538515.novalocal sshd[19715]: Connection closed by invalid user debian 69.5.7.210 port 42082 [preauth]
Nov 28 07:20:04 np0005538515.novalocal sshd[19717]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:05 np0005538515.novalocal sshd[19717]: Invalid user debian from 69.5.7.210 port 42098
Nov 28 07:20:06 np0005538515.novalocal sshd[19717]: Connection closed by invalid user debian 69.5.7.210 port 42098 [preauth]
Nov 28 07:20:06 np0005538515.novalocal sshd[19719]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:07 np0005538515.novalocal sshd[19719]: Invalid user debian from 69.5.7.210 port 42114
Nov 28 07:20:07 np0005538515.novalocal sshd[19719]: Connection closed by invalid user debian 69.5.7.210 port 42114 [preauth]
Nov 28 07:20:07 np0005538515.novalocal sshd[19721]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:08 np0005538515.novalocal sshd[19721]: Invalid user admin from 69.5.7.210 port 42130
Nov 28 07:20:09 np0005538515.novalocal sshd[19721]: Connection closed by invalid user admin 69.5.7.210 port 42130 [preauth]
Nov 28 07:20:09 np0005538515.novalocal sshd[19723]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:10 np0005538515.novalocal sshd[19723]: Invalid user admin from 69.5.7.210 port 54262
Nov 28 07:20:10 np0005538515.novalocal sshd[19723]: Connection closed by invalid user admin 69.5.7.210 port 54262 [preauth]
Nov 28 07:20:11 np0005538515.novalocal sshd[19725]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:12 np0005538515.novalocal sshd[19725]: Invalid user admin from 69.5.7.210 port 54274
Nov 28 07:20:12 np0005538515.novalocal sshd[19725]: Connection closed by invalid user admin 69.5.7.210 port 54274 [preauth]
Nov 28 07:20:12 np0005538515.novalocal sshd[19727]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:13 np0005538515.novalocal sudo[19742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhvgdotjixuytufcbqxpiiqwblsgojhd ; /usr/bin/python3
Nov 28 07:20:13 np0005538515.novalocal sudo[19742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:20:13 np0005538515.novalocal python3[19744]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Nov 28 07:20:13 np0005538515.novalocal sshd[19727]: Invalid user admin from 69.5.7.210 port 54290
Nov 28 07:20:13 np0005538515.novalocal sshd[19727]: Connection closed by invalid user admin 69.5.7.210 port 54290 [preauth]
Nov 28 07:20:14 np0005538515.novalocal sshd[19746]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:15 np0005538515.novalocal sshd[19746]: Invalid user admin from 69.5.7.210 port 54306
Nov 28 07:20:15 np0005538515.novalocal sshd[19746]: Connection closed by invalid user admin 69.5.7.210 port 54306 [preauth]
Nov 28 07:20:15 np0005538515.novalocal sshd[19749]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:16 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:20:16 np0005538515.novalocal sshd[19749]: Invalid user admin from 69.5.7.210 port 54312
Nov 28 07:20:17 np0005538515.novalocal sshd[19749]: Connection closed by invalid user admin 69.5.7.210 port 54312 [preauth]
Nov 28 07:20:17 np0005538515.novalocal sshd[19815]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:18 np0005538515.novalocal sshd[19815]: Invalid user admin from 69.5.7.210 port 54316
Nov 28 07:20:18 np0005538515.novalocal sshd[19815]: Connection closed by invalid user admin 69.5.7.210 port 54316 [preauth]
Nov 28 07:20:18 np0005538515.novalocal sshd[19879]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:19 np0005538515.novalocal sshd[19879]: Invalid user admin from 69.5.7.210 port 54324
Nov 28 07:20:20 np0005538515.novalocal sshd[19879]: Connection closed by invalid user admin 69.5.7.210 port 54324 [preauth]
Nov 28 07:20:20 np0005538515.novalocal sshd[19881]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:21 np0005538515.novalocal sshd[19881]: Invalid user admin from 69.5.7.210 port 60954
Nov 28 07:20:21 np0005538515.novalocal sshd[19881]: Connection closed by invalid user admin 69.5.7.210 port 60954 [preauth]
Nov 28 07:20:22 np0005538515.novalocal sshd[19883]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:23 np0005538515.novalocal sshd[19883]: Invalid user admin from 69.5.7.210 port 60966
Nov 28 07:20:23 np0005538515.novalocal sshd[19883]: Connection closed by invalid user admin 69.5.7.210 port 60966 [preauth]
Nov 28 07:20:23 np0005538515.novalocal sshd[19889]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:24 np0005538515.novalocal sshd[19889]: Invalid user admin from 69.5.7.210 port 60974
Nov 28 07:20:25 np0005538515.novalocal sshd[19889]: Connection closed by invalid user admin 69.5.7.210 port 60974 [preauth]
Nov 28 07:20:25 np0005538515.novalocal sshd[19891]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:26 np0005538515.novalocal sshd[19891]: Invalid user admin from 69.5.7.210 port 60986
Nov 28 07:20:26 np0005538515.novalocal sshd[19891]: Connection closed by invalid user admin 69.5.7.210 port 60986 [preauth]
Nov 28 07:20:26 np0005538515.novalocal sshd[19893]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:28 np0005538515.novalocal sshd[19893]: Invalid user admin from 69.5.7.210 port 60998
Nov 28 07:20:28 np0005538515.novalocal sshd[19893]: Connection closed by invalid user admin 69.5.7.210 port 60998 [preauth]
Nov 28 07:20:28 np0005538515.novalocal sshd[19895]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:29 np0005538515.novalocal sshd[19895]: Invalid user admin from 69.5.7.210 port 32772
Nov 28 07:20:29 np0005538515.novalocal sshd[19895]: Connection closed by invalid user admin 69.5.7.210 port 32772 [preauth]
Nov 28 07:20:30 np0005538515.novalocal sshd[19897]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:31 np0005538515.novalocal sshd[19897]: Invalid user admin from 69.5.7.210 port 33230
Nov 28 07:20:31 np0005538515.novalocal sshd[19897]: Connection closed by invalid user admin 69.5.7.210 port 33230 [preauth]
Nov 28 07:20:31 np0005538515.novalocal sshd[19903]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:32 np0005538515.novalocal sshd[19903]: Invalid user admin from 69.5.7.210 port 33238
Nov 28 07:20:33 np0005538515.novalocal sshd[19903]: Connection closed by invalid user admin 69.5.7.210 port 33238 [preauth]
Nov 28 07:20:33 np0005538515.novalocal sshd[19905]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:34 np0005538515.novalocal sshd[19905]: Invalid user admin from 69.5.7.210 port 33252
Nov 28 07:20:34 np0005538515.novalocal sshd[19905]: Connection closed by invalid user admin 69.5.7.210 port 33252 [preauth]
Nov 28 07:20:34 np0005538515.novalocal sshd[19907]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:35 np0005538515.novalocal sshd[19907]: Invalid user admin from 69.5.7.210 port 33256
Nov 28 07:20:36 np0005538515.novalocal sshd[19907]: Connection closed by invalid user admin 69.5.7.210 port 33256 [preauth]
Nov 28 07:20:36 np0005538515.novalocal sshd[19909]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:37 np0005538515.novalocal sshd[19909]: Invalid user admin from 69.5.7.210 port 33260
Nov 28 07:20:37 np0005538515.novalocal sshd[19909]: Connection closed by invalid user admin 69.5.7.210 port 33260 [preauth]
Nov 28 07:20:38 np0005538515.novalocal sshd[19915]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:39 np0005538515.novalocal sshd[19915]: Invalid user admin from 69.5.7.210 port 33276
Nov 28 07:20:39 np0005538515.novalocal sshd[19915]: Connection closed by invalid user admin 69.5.7.210 port 33276 [preauth]
Nov 28 07:20:39 np0005538515.novalocal sshd[19917]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:40 np0005538515.novalocal sshd[19917]: Invalid user admin from 69.5.7.210 port 53232
Nov 28 07:20:41 np0005538515.novalocal sshd[19917]: Connection closed by invalid user admin 69.5.7.210 port 53232 [preauth]
Nov 28 07:20:41 np0005538515.novalocal sshd[19919]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:41 np0005538515.novalocal sudo[19742]: pam_unix(sudo:session): session closed for user root
Nov 28 07:20:42 np0005538515.novalocal sshd[19919]: Invalid user admin from 69.5.7.210 port 53238
Nov 28 07:20:42 np0005538515.novalocal sshd[19919]: Connection closed by invalid user admin 69.5.7.210 port 53238 [preauth]
Nov 28 07:20:42 np0005538515.novalocal sshd[19921]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:43 np0005538515.novalocal sshd[19921]: Invalid user admin from 69.5.7.210 port 53254
Nov 28 07:20:44 np0005538515.novalocal sshd[19921]: Connection closed by invalid user admin 69.5.7.210 port 53254 [preauth]
Nov 28 07:20:44 np0005538515.novalocal sshd[19923]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:45 np0005538515.novalocal sshd[19923]: Invalid user admin from 69.5.7.210 port 53270
Nov 28 07:20:45 np0005538515.novalocal sshd[19923]: Connection closed by invalid user admin 69.5.7.210 port 53270 [preauth]
Nov 28 07:20:46 np0005538515.novalocal sshd[19925]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:47 np0005538515.novalocal sshd[19925]: Invalid user admin from 69.5.7.210 port 53286
Nov 28 07:20:47 np0005538515.novalocal sshd[19925]: Connection closed by invalid user admin 69.5.7.210 port 53286 [preauth]
Nov 28 07:20:47 np0005538515.novalocal sshd[19927]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:48 np0005538515.novalocal sshd[19927]: Invalid user admin from 69.5.7.210 port 53302
Nov 28 07:20:48 np0005538515.novalocal sudo[19942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlmehomkgywcxnugocqhyrcnztftfvhv ; /usr/bin/python3
Nov 28 07:20:48 np0005538515.novalocal sudo[19942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:20:48 np0005538515.novalocal python3[19944]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Nov 28 07:20:48 np0005538515.novalocal sshd[19927]: Connection closed by invalid user admin 69.5.7.210 port 53302 [preauth]
Nov 28 07:20:49 np0005538515.novalocal sshd[19946]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:50 np0005538515.novalocal sshd[19946]: Invalid user admin from 69.5.7.210 port 54048
Nov 28 07:20:50 np0005538515.novalocal sshd[19946]: Connection closed by invalid user admin 69.5.7.210 port 54048 [preauth]
Nov 28 07:20:50 np0005538515.novalocal sshd[19949]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:51 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:20:51 np0005538515.novalocal sshd[19949]: Invalid user admin from 69.5.7.210 port 54060
Nov 28 07:20:52 np0005538515.novalocal sshd[19949]: Connection closed by invalid user admin 69.5.7.210 port 54060 [preauth]
Nov 28 07:20:52 np0005538515.novalocal sshd[20015]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:53 np0005538515.novalocal sshd[20015]: Invalid user admin from 69.5.7.210 port 54068
Nov 28 07:20:53 np0005538515.novalocal sshd[20015]: Connection closed by invalid user admin 69.5.7.210 port 54068 [preauth]
Nov 28 07:20:53 np0005538515.novalocal sshd[20075]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:53 np0005538515.novalocal sudo[19942]: pam_unix(sudo:session): session closed for user root
Nov 28 07:20:54 np0005538515.novalocal sshd[20075]: Invalid user admin from 69.5.7.210 port 54074
Nov 28 07:20:55 np0005538515.novalocal sshd[20075]: Connection closed by invalid user admin 69.5.7.210 port 54074 [preauth]
Nov 28 07:20:55 np0005538515.novalocal sshd[20077]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:56 np0005538515.novalocal sshd[20077]: Invalid user admin from 69.5.7.210 port 54090
Nov 28 07:20:56 np0005538515.novalocal sshd[20077]: Connection closed by invalid user admin 69.5.7.210 port 54090 [preauth]
Nov 28 07:20:57 np0005538515.novalocal sshd[20079]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:58 np0005538515.novalocal sshd[20079]: Invalid user admin from 69.5.7.210 port 54106
Nov 28 07:20:58 np0005538515.novalocal sshd[20079]: Connection closed by invalid user admin 69.5.7.210 port 54106 [preauth]
Nov 28 07:20:58 np0005538515.novalocal sshd[20081]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:20:59 np0005538515.novalocal sshd[20081]: Invalid user admin from 69.5.7.210 port 54110
Nov 28 07:21:00 np0005538515.novalocal sshd[20081]: Connection closed by invalid user admin 69.5.7.210 port 54110 [preauth]
Nov 28 07:21:00 np0005538515.novalocal sshd[20083]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:01 np0005538515.novalocal sshd[20083]: Invalid user admin from 69.5.7.210 port 53790
Nov 28 07:21:01 np0005538515.novalocal sshd[20083]: Connection closed by invalid user admin 69.5.7.210 port 53790 [preauth]
Nov 28 07:21:01 np0005538515.novalocal sshd[20085]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:02 np0005538515.novalocal sshd[20085]: Invalid user admin from 69.5.7.210 port 53794
Nov 28 07:21:03 np0005538515.novalocal sshd[20085]: Connection closed by invalid user admin 69.5.7.210 port 53794 [preauth]
Nov 28 07:21:03 np0005538515.novalocal sshd[20087]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:04 np0005538515.novalocal sshd[20087]: Invalid user admin from 69.5.7.210 port 53796
Nov 28 07:21:04 np0005538515.novalocal sshd[20087]: Connection closed by invalid user admin 69.5.7.210 port 53796 [preauth]
Nov 28 07:21:04 np0005538515.novalocal sshd[20090]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:06 np0005538515.novalocal sshd[20090]: Invalid user admin from 69.5.7.210 port 53802
Nov 28 07:21:06 np0005538515.novalocal sshd[20090]: Connection closed by invalid user admin 69.5.7.210 port 53802 [preauth]
Nov 28 07:21:06 np0005538515.novalocal sshd[20092]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:07 np0005538515.novalocal sshd[20092]: Invalid user admin from 69.5.7.210 port 53810
Nov 28 07:21:07 np0005538515.novalocal sshd[20092]: Connection closed by invalid user admin 69.5.7.210 port 53810 [preauth]
Nov 28 07:21:08 np0005538515.novalocal sshd[20094]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:09 np0005538515.novalocal sshd[20094]: Invalid user admin from 69.5.7.210 port 53816
Nov 28 07:21:09 np0005538515.novalocal sudo[20109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wanhrdzsbharclwxlgzacmonvrhsqtzd ; /usr/bin/python3
Nov 28 07:21:09 np0005538515.novalocal sudo[20109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:21:09 np0005538515.novalocal python3[20111]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Nov 28 07:21:09 np0005538515.novalocal sshd[20094]: Connection closed by invalid user admin 69.5.7.210 port 53816 [preauth]
Nov 28 07:21:09 np0005538515.novalocal sshd[20113]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:10 np0005538515.novalocal sshd[20113]: Invalid user admin from 69.5.7.210 port 57268
Nov 28 07:21:11 np0005538515.novalocal sshd[20113]: Connection closed by invalid user admin 69.5.7.210 port 57268 [preauth]
Nov 28 07:21:11 np0005538515.novalocal sshd[20116]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:12 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:12 np0005538515.novalocal sshd[20116]: Invalid user admin from 69.5.7.210 port 57280
Nov 28 07:21:12 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:12 np0005538515.novalocal sshd[20116]: Connection closed by invalid user admin 69.5.7.210 port 57280 [preauth]
Nov 28 07:21:13 np0005538515.novalocal sshd[20242]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:14 np0005538515.novalocal sshd[20242]: Invalid user admin from 69.5.7.210 port 57296
Nov 28 07:21:14 np0005538515.novalocal sshd[20242]: Connection closed by invalid user admin 69.5.7.210 port 57296 [preauth]
Nov 28 07:21:14 np0005538515.novalocal sshd[20302]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:15 np0005538515.novalocal sshd[20302]: Invalid user admin from 69.5.7.210 port 57298
Nov 28 07:21:15 np0005538515.novalocal sshd[20302]: Connection closed by invalid user admin 69.5.7.210 port 57298 [preauth]
Nov 28 07:21:16 np0005538515.novalocal sshd[20307]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:17 np0005538515.novalocal sshd[20307]: Invalid user admin from 69.5.7.210 port 57312
Nov 28 07:21:17 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:17 np0005538515.novalocal sshd[20307]: Connection closed by invalid user admin 69.5.7.210 port 57312 [preauth]
Nov 28 07:21:17 np0005538515.novalocal sshd[20372]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:18 np0005538515.novalocal sshd[20372]: Invalid user admin from 69.5.7.210 port 57328
Nov 28 07:21:19 np0005538515.novalocal sshd[20372]: Connection closed by invalid user admin 69.5.7.210 port 57328 [preauth]
Nov 28 07:21:19 np0005538515.novalocal sshd[20378]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:20 np0005538515.novalocal sshd[20378]: Invalid user admin from 69.5.7.210 port 38996
Nov 28 07:21:20 np0005538515.novalocal sshd[20378]: Connection closed by invalid user admin 69.5.7.210 port 38996 [preauth]
Nov 28 07:21:20 np0005538515.novalocal sshd[20437]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:22 np0005538515.novalocal sshd[20437]: Invalid user admin from 69.5.7.210 port 39006
Nov 28 07:21:22 np0005538515.novalocal sshd[20437]: Connection closed by invalid user admin 69.5.7.210 port 39006 [preauth]
Nov 28 07:21:22 np0005538515.novalocal sshd[20447]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:23 np0005538515.novalocal sshd[20447]: Invalid user admin from 69.5.7.210 port 39016
Nov 28 07:21:23 np0005538515.novalocal sshd[20447]: Connection closed by invalid user admin 69.5.7.210 port 39016 [preauth]
Nov 28 07:21:24 np0005538515.novalocal sshd[20449]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:24 np0005538515.novalocal sudo[20109]: pam_unix(sudo:session): session closed for user root
Nov 28 07:21:25 np0005538515.novalocal sshd[20449]: Invalid user admin from 69.5.7.210 port 39020
Nov 28 07:21:25 np0005538515.novalocal sshd[20449]: Connection closed by invalid user admin 69.5.7.210 port 39020 [preauth]
Nov 28 07:21:25 np0005538515.novalocal sshd[20451]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:26 np0005538515.novalocal sshd[20451]: Invalid user admin from 69.5.7.210 port 39028
Nov 28 07:21:26 np0005538515.novalocal sshd[20451]: Connection closed by invalid user admin 69.5.7.210 port 39028 [preauth]
Nov 28 07:21:27 np0005538515.novalocal sshd[20453]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:28 np0005538515.novalocal sshd[20453]: Invalid user admin from 69.5.7.210 port 39044
Nov 28 07:21:28 np0005538515.novalocal sshd[20453]: Connection closed by invalid user admin 69.5.7.210 port 39044 [preauth]
Nov 28 07:21:28 np0005538515.novalocal sshd[20455]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:30 np0005538515.novalocal sshd[20455]: Invalid user admin from 69.5.7.210 port 39048
Nov 28 07:21:30 np0005538515.novalocal sshd[20455]: Connection closed by invalid user admin 69.5.7.210 port 39048 [preauth]
Nov 28 07:21:30 np0005538515.novalocal sshd[20457]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:31 np0005538515.novalocal sshd[20457]: Invalid user admin from 69.5.7.210 port 42278
Nov 28 07:21:31 np0005538515.novalocal sshd[20457]: Connection closed by invalid user admin 69.5.7.210 port 42278 [preauth]
Nov 28 07:21:32 np0005538515.novalocal sshd[20459]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:33 np0005538515.novalocal sshd[20459]: Invalid user admin from 69.5.7.210 port 42286
Nov 28 07:21:33 np0005538515.novalocal sshd[20459]: Connection closed by invalid user admin 69.5.7.210 port 42286 [preauth]
Nov 28 07:21:33 np0005538515.novalocal sshd[20461]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:34 np0005538515.novalocal sshd[20461]: Invalid user admin from 69.5.7.210 port 42300
Nov 28 07:21:35 np0005538515.novalocal sshd[20461]: Connection closed by invalid user admin 69.5.7.210 port 42300 [preauth]
Nov 28 07:21:35 np0005538515.novalocal sshd[20463]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:36 np0005538515.novalocal sshd[20463]: Invalid user admin from 69.5.7.210 port 42302
Nov 28 07:21:36 np0005538515.novalocal sshd[20463]: Connection closed by invalid user admin 69.5.7.210 port 42302 [preauth]
Nov 28 07:21:36 np0005538515.novalocal sshd[20465]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:37 np0005538515.novalocal sshd[20465]: Invalid user admin from 69.5.7.210 port 42318
Nov 28 07:21:38 np0005538515.novalocal sshd[20465]: Connection closed by invalid user admin 69.5.7.210 port 42318 [preauth]
Nov 28 07:21:38 np0005538515.novalocal sshd[20467]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:39 np0005538515.novalocal sshd[20467]: Invalid user admin from 69.5.7.210 port 42334
Nov 28 07:21:39 np0005538515.novalocal sshd[20467]: Connection closed by invalid user admin 69.5.7.210 port 42334 [preauth]
Nov 28 07:21:40 np0005538515.novalocal sudo[20482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuahjiiqbbnlvzbgggfyrqktcbpsdglk ; /usr/bin/python3
Nov 28 07:21:40 np0005538515.novalocal sudo[20482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:21:40 np0005538515.novalocal sshd[20485]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:40 np0005538515.novalocal python3[20484]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 28 07:21:41 np0005538515.novalocal sshd[20485]: Invalid user admin from 69.5.7.210 port 41196
Nov 28 07:21:41 np0005538515.novalocal sshd[20485]: Connection closed by invalid user admin 69.5.7.210 port 41196 [preauth]
Nov 28 07:21:41 np0005538515.novalocal sshd[20489]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:42 np0005538515.novalocal sshd[20489]: Invalid user admin from 69.5.7.210 port 41204
Nov 28 07:21:42 np0005538515.novalocal sshd[20489]: Connection closed by invalid user admin 69.5.7.210 port 41204 [preauth]
Nov 28 07:21:43 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:43 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:43 np0005538515.novalocal sshd[20611]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:44 np0005538515.novalocal sshd[20611]: Invalid user admin from 69.5.7.210 port 41218
Nov 28 07:21:44 np0005538515.novalocal sshd[20611]: Connection closed by invalid user admin 69.5.7.210 port 41218 [preauth]
Nov 28 07:21:44 np0005538515.novalocal sshd[20655]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:45 np0005538515.novalocal sshd[20655]: Invalid user admin from 69.5.7.210 port 41232
Nov 28 07:21:46 np0005538515.novalocal sshd[20655]: Connection closed by invalid user admin 69.5.7.210 port 41232 [preauth]
Nov 28 07:21:46 np0005538515.novalocal sshd[20678]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:47 np0005538515.novalocal sshd[20678]: Invalid user admin from 69.5.7.210 port 41240
Nov 28 07:21:47 np0005538515.novalocal sshd[20678]: Connection closed by invalid user admin 69.5.7.210 port 41240 [preauth]
Nov 28 07:21:48 np0005538515.novalocal sshd[20689]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:48 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:48 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:21:49 np0005538515.novalocal sshd[20689]: Invalid user admin from 69.5.7.210 port 41248
Nov 28 07:21:49 np0005538515.novalocal sshd[20689]: Connection closed by invalid user admin 69.5.7.210 port 41248 [preauth]
Nov 28 07:21:49 np0005538515.novalocal sshd[20749]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:50 np0005538515.novalocal sshd[20749]: Invalid user admin from 69.5.7.210 port 48582
Nov 28 07:21:50 np0005538515.novalocal sshd[20749]: Connection closed by invalid user admin 69.5.7.210 port 48582 [preauth]
Nov 28 07:21:51 np0005538515.novalocal sshd[20753]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:52 np0005538515.novalocal sshd[20753]: Invalid user admin from 69.5.7.210 port 48596
Nov 28 07:21:52 np0005538515.novalocal sshd[20753]: Connection closed by invalid user admin 69.5.7.210 port 48596 [preauth]
Nov 28 07:21:52 np0005538515.novalocal sshd[20817]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:53 np0005538515.novalocal sshd[20817]: Invalid user admin from 69.5.7.210 port 48598
Nov 28 07:21:54 np0005538515.novalocal sshd[20817]: Connection closed by invalid user admin 69.5.7.210 port 48598 [preauth]
Nov 28 07:21:54 np0005538515.novalocal sshd[20823]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:55 np0005538515.novalocal sshd[20823]: Invalid user admin from 69.5.7.210 port 48612
Nov 28 07:21:55 np0005538515.novalocal sshd[20823]: Connection closed by invalid user admin 69.5.7.210 port 48612 [preauth]
Nov 28 07:21:55 np0005538515.novalocal sudo[20482]: pam_unix(sudo:session): session closed for user root
Nov 28 07:21:55 np0005538515.novalocal sshd[20825]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:56 np0005538515.novalocal sshd[20825]: Invalid user admin from 69.5.7.210 port 48620
Nov 28 07:21:57 np0005538515.novalocal sshd[20825]: Connection closed by invalid user admin 69.5.7.210 port 48620 [preauth]
Nov 28 07:21:57 np0005538515.novalocal sshd[20827]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:21:58 np0005538515.novalocal sshd[20827]: Invalid user admin from 69.5.7.210 port 48628
Nov 28 07:21:58 np0005538515.novalocal sshd[20827]: Connection closed by invalid user admin 69.5.7.210 port 48628 [preauth]
Nov 28 07:21:59 np0005538515.novalocal sshd[20829]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:00 np0005538515.novalocal sshd[20829]: Invalid user admin from 69.5.7.210 port 48644
Nov 28 07:22:00 np0005538515.novalocal sshd[20829]: Connection closed by invalid user admin 69.5.7.210 port 48644 [preauth]
Nov 28 07:22:00 np0005538515.novalocal sshd[20831]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:01 np0005538515.novalocal sshd[20831]: Invalid user admin from 69.5.7.210 port 41230
Nov 28 07:22:02 np0005538515.novalocal sshd[20831]: Connection closed by invalid user admin 69.5.7.210 port 41230 [preauth]
Nov 28 07:22:02 np0005538515.novalocal sshd[20833]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:03 np0005538515.novalocal sshd[20833]: Invalid user admin from 69.5.7.210 port 41238
Nov 28 07:22:03 np0005538515.novalocal sshd[20833]: Connection closed by invalid user admin 69.5.7.210 port 41238 [preauth]
Nov 28 07:22:03 np0005538515.novalocal sshd[20835]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:05 np0005538515.novalocal sshd[20835]: Invalid user admin from 69.5.7.210 port 41252
Nov 28 07:22:05 np0005538515.novalocal sshd[20835]: Connection closed by invalid user admin 69.5.7.210 port 41252 [preauth]
Nov 28 07:22:05 np0005538515.novalocal sshd[20837]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:06 np0005538515.novalocal sshd[20837]: Invalid user admin from 69.5.7.210 port 41258
Nov 28 07:22:06 np0005538515.novalocal sshd[20837]: Connection closed by invalid user admin 69.5.7.210 port 41258 [preauth]
Nov 28 07:22:07 np0005538515.novalocal sshd[20839]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:08 np0005538515.novalocal sshd[20839]: Invalid user admin from 69.5.7.210 port 41266
Nov 28 07:22:08 np0005538515.novalocal sshd[20839]: Connection closed by invalid user admin 69.5.7.210 port 41266 [preauth]
Nov 28 07:22:08 np0005538515.novalocal sshd[20841]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:09 np0005538515.novalocal sshd[20841]: Invalid user admin from 69.5.7.210 port 41274
Nov 28 07:22:09 np0005538515.novalocal sshd[20841]: Connection closed by invalid user admin 69.5.7.210 port 41274 [preauth]
Nov 28 07:22:10 np0005538515.novalocal sshd[20843]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:11 np0005538515.novalocal sshd[20843]: Invalid user admin from 69.5.7.210 port 54920
Nov 28 07:22:11 np0005538515.novalocal sshd[20843]: Connection closed by invalid user admin 69.5.7.210 port 54920 [preauth]
Nov 28 07:22:11 np0005538515.novalocal sshd[20845]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:12 np0005538515.novalocal sudo[20860]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imbceqaqedpgjjbzqfqxoqpjjigfyfcx ; /usr/bin/python3
Nov 28 07:22:12 np0005538515.novalocal sudo[20860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:22:12 np0005538515.novalocal python3[20862]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 28 07:22:12 np0005538515.novalocal sshd[20845]: Invalid user admin from 69.5.7.210 port 54936
Nov 28 07:22:13 np0005538515.novalocal sshd[20845]: Connection closed by invalid user admin 69.5.7.210 port 54936 [preauth]
Nov 28 07:22:13 np0005538515.novalocal sshd[20864]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:14 np0005538515.novalocal sshd[20864]: Invalid user admin from 69.5.7.210 port 54946
Nov 28 07:22:14 np0005538515.novalocal sshd[20864]: Connection closed by invalid user admin 69.5.7.210 port 54946 [preauth]
Nov 28 07:22:14 np0005538515.novalocal sshd[20867]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:15 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:15 np0005538515.novalocal sshd[20867]: Invalid user admin from 69.5.7.210 port 54948
Nov 28 07:22:15 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:16 np0005538515.novalocal sshd[20867]: Connection closed by invalid user admin 69.5.7.210 port 54948 [preauth]
Nov 28 07:22:16 np0005538515.novalocal sshd[20993]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:17 np0005538515.novalocal sshd[20993]: Invalid user admin from 69.5.7.210 port 54956
Nov 28 07:22:17 np0005538515.novalocal sshd[20993]: Connection closed by invalid user admin 69.5.7.210 port 54956 [preauth]
Nov 28 07:22:18 np0005538515.novalocal sshd[21054]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:19 np0005538515.novalocal sshd[21054]: Invalid user pi from 69.5.7.210 port 54972
Nov 28 07:22:19 np0005538515.novalocal sshd[21054]: Connection closed by invalid user pi 69.5.7.210 port 54972 [preauth]
Nov 28 07:22:19 np0005538515.novalocal sshd[21058]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:22:20 np0005538515.novalocal sshd[21058]: Connection closed by authenticating user ftp 69.5.7.210 port 60214 [preauth]
Nov 28 07:22:21 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:21 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:22:28 np0005538515.novalocal sudo[20860]: pam_unix(sudo:session): session closed for user root
Nov 28 07:22:32 np0005538515.novalocal sudo[21268]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvkirglxqxolwwxivxlqjccnfxbdkpde ; /usr/bin/python3
Nov 28 07:22:32 np0005538515.novalocal sudo[21268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:22:32 np0005538515.novalocal python3[21270]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000013-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:22:34 np0005538515.novalocal sudo[21268]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:00 np0005538515.novalocal sudo[21287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahrqsycyyceocfukywrfxecuadunrhff ; /usr/bin/python3
Nov 28 07:23:00 np0005538515.novalocal sudo[21287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:01 np0005538515.novalocal python3[21289]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:23:12 np0005538515.novalocal groupadd[21385]: group added to /etc/group: name=unbound, GID=987
Nov 28 07:23:12 np0005538515.novalocal groupadd[21385]: group added to /etc/gshadow: name=unbound
Nov 28 07:23:12 np0005538515.novalocal groupadd[21385]: new group: name=unbound, GID=987
Nov 28 07:23:12 np0005538515.novalocal useradd[21392]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Nov 28 07:23:12 np0005538515.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  Converting 503 SID table entries...
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:23:22 np0005538515.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:23:22 np0005538515.novalocal groupadd[21418]: group added to /etc/group: name=openvswitch, GID=986
Nov 28 07:23:22 np0005538515.novalocal groupadd[21418]: group added to /etc/gshadow: name=openvswitch
Nov 28 07:23:22 np0005538515.novalocal groupadd[21418]: new group: name=openvswitch, GID=986
Nov 28 07:23:22 np0005538515.novalocal useradd[21425]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Nov 28 07:23:22 np0005538515.novalocal groupadd[21433]: group added to /etc/group: name=hugetlbfs, GID=985
Nov 28 07:23:22 np0005538515.novalocal groupadd[21433]: group added to /etc/gshadow: name=hugetlbfs
Nov 28 07:23:22 np0005538515.novalocal groupadd[21433]: new group: name=hugetlbfs, GID=985
Nov 28 07:23:22 np0005538515.novalocal usermod[21441]: add 'openvswitch' to group 'hugetlbfs'
Nov 28 07:23:22 np0005538515.novalocal usermod[21441]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 28 07:23:24 np0005538515.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Nov 28 07:23:24 np0005538515.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:23:24 np0005538515.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:23:24 np0005538515.novalocal systemd[1]: Reloading.
Nov 28 07:23:24 np0005538515.novalocal systemd-rc-local-generator[21947]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:23:24 np0005538515.novalocal systemd-sysv-generator[21953]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:23:24 np0005538515.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:23:24 np0005538515.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:23:25 np0005538515.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:23:25 np0005538515.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:23:25 np0005538515.novalocal systemd[1]: run-rfe612edd75704cda8f860ad050967a17.service: Deactivated successfully.
Nov 28 07:23:25 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:23:25 np0005538515.novalocal sudo[21287]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:25 np0005538515.novalocal rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 07:23:40 np0005538515.novalocal sudo[22495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wweviporeazhuznpeywvkkfrudsvetgg ; /usr/bin/python3
Nov 28 07:23:40 np0005538515.novalocal sudo[22495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:40 np0005538515.novalocal python3[22497]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000015-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:23:53 np0005538515.novalocal sudo[22495]: pam_unix(sudo:session): session closed for user root
Nov 28 07:23:58 np0005538515.novalocal sudo[22515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egwtaklpsouomsxolykeuzylgpgglgsd ; /usr/bin/python3
Nov 28 07:23:58 np0005538515.novalocal sudo[22515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:23:58 np0005538515.novalocal python3[22517]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:23:58 np0005538515.novalocal sudo[22515]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:00 np0005538515.novalocal sudo[22563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uopwlsjinbxdyosxnitztctnvvboddai ; /usr/bin/python3
Nov 28 07:24:00 np0005538515.novalocal sudo[22563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:00 np0005538515.novalocal python3[22565]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:24:00 np0005538515.novalocal sudo[22563]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:00 np0005538515.novalocal sudo[22606]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bppjkppdlbtflpelokqlbrkqyxypmflc ; /usr/bin/python3
Nov 28 07:24:00 np0005538515.novalocal sudo[22606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:00 np0005538515.novalocal python3[22608]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764314639.8936636-334-43831190965907/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=91bc45728dd9738fc644e3ada9d8642294da29ff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:24:00 np0005538515.novalocal sudo[22606]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:02 np0005538515.novalocal sudo[22636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wourxcojpiiegotkedszqzosgxeamelp ; /usr/bin/python3
Nov 28 07:24:02 np0005538515.novalocal sudo[22636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:02 np0005538515.novalocal python3[22638]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:03 np0005538515.novalocal sudo[22636]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:03 np0005538515.novalocal systemd-journald[618]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Nov 28 07:24:03 np0005538515.novalocal systemd-journald[618]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 07:24:03 np0005538515.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:24:03 np0005538515.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:24:03 np0005538515.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:24:03 np0005538515.novalocal sudo[22657]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrlryblmmiuhnpbercvewbbthydrgwxc ; /usr/bin/python3
Nov 28 07:24:03 np0005538515.novalocal sudo[22657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:03 np0005538515.novalocal python3[22659]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:03 np0005538515.novalocal sudo[22657]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:04 np0005538515.novalocal sudo[22677]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oceaullojyhdxltttmbngcuxwoldcvlt ; /usr/bin/python3
Nov 28 07:24:04 np0005538515.novalocal sudo[22677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:04 np0005538515.novalocal python3[22679]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:04 np0005538515.novalocal sudo[22677]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:04 np0005538515.novalocal sudo[22697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afjagocpktopyinbbnrkzqvcbeeuzjcm ; /usr/bin/python3
Nov 28 07:24:04 np0005538515.novalocal sudo[22697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:04 np0005538515.novalocal python3[22699]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:04 np0005538515.novalocal sudo[22697]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:04 np0005538515.novalocal sudo[22717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enbjrkotctndiryvuvlmbcxekqpzehfa ; /usr/bin/python3
Nov 28 07:24:04 np0005538515.novalocal sudo[22717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:04 np0005538515.novalocal python3[22719]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 07:24:04 np0005538515.novalocal sudo[22717]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:07 np0005538515.novalocal sudo[22737]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovxgirgshnyujobzbjgtetigpqmzuboo ; /usr/bin/python3
Nov 28 07:24:07 np0005538515.novalocal sudo[22737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:07 np0005538515.novalocal python3[22739]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:24:07 np0005538515.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Nov 28 07:24:07 np0005538515.novalocal network[22742]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:07 np0005538515.novalocal network[22753]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:07 np0005538515.novalocal network[22742]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:07 np0005538515.novalocal network[22754]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:07 np0005538515.novalocal network[22742]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 07:24:07 np0005538515.novalocal network[22755]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 07:24:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314647.6575] audit: op="connections-reload" pid=22783 uid=0 result="success"
Nov 28 07:24:07 np0005538515.novalocal network[22742]: Bringing up loopback interface:  [  OK  ]
Nov 28 07:24:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314647.8433] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22871 uid=0 result="success"
Nov 28 07:24:07 np0005538515.novalocal network[22742]: Bringing up interface eth0:  [  OK  ]
Nov 28 07:24:07 np0005538515.novalocal systemd[1]: Started LSB: Bring up/down networking.
Nov 28 07:24:07 np0005538515.novalocal sudo[22737]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:08 np0005538515.novalocal sudo[22910]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsorivqfeepjdaafakgejfikfufxvmmg ; /usr/bin/python3
Nov 28 07:24:08 np0005538515.novalocal sudo[22910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:08 np0005538515.novalocal python3[22912]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Nov 28 07:24:08 np0005538515.novalocal chown[22916]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[22921]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[22921]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[22921]: Starting ovsdb-server [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal ovs-vsctl[22970]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 28 07:24:08 np0005538515.novalocal ovs-vsctl[22990]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"62c03cad-89c1-4fd7-973b-8f2a608c71f1\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[22921]: Configuring Open vSwitch system IDs [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[22921]: Enabling remote OVSDB managers [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal ovs-vsctl[22996]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005538515.novalocal
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Started Open vSwitch Database Unit.
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 28 07:24:08 np0005538515.novalocal kernel: openvswitch: Open vSwitch switching datapath
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[23040]: Inserting openvswitch module [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[23009]: Starting ovs-vswitchd [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal ovs-vsctl[23058]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005538515.novalocal
Nov 28 07:24:08 np0005538515.novalocal ovs-ctl[23009]: Enabling remote OVSDB managers [  OK  ]
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Starting Open vSwitch...
Nov 28 07:24:08 np0005538515.novalocal systemd[1]: Finished Open vSwitch.
Nov 28 07:24:09 np0005538515.novalocal sudo[22910]: pam_unix(sudo:session): session closed for user root
Nov 28 07:24:39 np0005538515.novalocal sudo[23074]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqmtbijdxllhkmepiadhmnuzvhbhfvph ; /usr/bin/python3
Nov 28 07:24:39 np0005538515.novalocal sudo[23074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:24:39 np0005538515.novalocal python3[23076]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000001a-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:24:40 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314680.1561] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23235 uid=0 result="success"
Nov 28 07:24:40 np0005538515.novalocal ifup[23236]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:40 np0005538515.novalocal ifup[23237]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:40 np0005538515.novalocal ifup[23238]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:40 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314680.1870] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23244 uid=0 result="success"
Nov 28 07:24:40 np0005538515.novalocal ovs-vsctl[23246]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:f7:e2:83 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Nov 28 07:24:40 np0005538515.novalocal kernel: device ovs-system entered promiscuous mode
Nov 28 07:24:40 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314680.2159] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Nov 28 07:24:40 np0005538515.novalocal systemd-udevd[23247]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:40 np0005538515.novalocal kernel: Timeout policy base is empty
Nov 28 07:24:40 np0005538515.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Nov 28 07:24:40 np0005538515.novalocal kernel: device br-ex entered promiscuous mode
Nov 28 07:24:40 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314680.2605] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Nov 28 07:24:40 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314680.2860] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23273 uid=0 result="success"
Nov 28 07:24:40 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314680.3064] device (br-ex): carrier: link connected
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.3587] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23302 uid=0 result="success"
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.4002] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23317 uid=0 result="success"
Nov 28 07:24:43 np0005538515.novalocal NET[23342]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.4875] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.4983] dhcp4 (eth1): canceled DHCP transaction
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5004] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5004] dhcp4 (eth1): state changed no lease
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5034] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=23351 uid=0 result="success"
Nov 28 07:24:43 np0005538515.novalocal ifup[23352]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:43 np0005538515.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 07:24:43 np0005538515.novalocal ifup[23353]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:43 np0005538515.novalocal ifup[23355]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:43 np0005538515.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5320] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=23369 uid=0 result="success"
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5631] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=23379 uid=0 result="success"
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5672] device (eth1): carrier: link connected
Nov 28 07:24:43 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314683.5779] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=23388 uid=0 result="success"
Nov 28 07:24:43 np0005538515.novalocal ipv6_wait_tentative[23400]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 28 07:24:44 np0005538515.novalocal ipv6_wait_tentative[23405]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.6341] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=23414 uid=0 result="success"
Nov 28 07:24:45 np0005538515.novalocal ovs-vsctl[23429]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Nov 28 07:24:45 np0005538515.novalocal kernel: device eth1 entered promiscuous mode
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.7092] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23437 uid=0 result="success"
Nov 28 07:24:45 np0005538515.novalocal ifup[23438]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:45 np0005538515.novalocal ifup[23439]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:45 np0005538515.novalocal ifup[23440]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.7431] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=23446 uid=0 result="success"
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.7806] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23456 uid=0 result="success"
Nov 28 07:24:45 np0005538515.novalocal ifup[23457]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:45 np0005538515.novalocal ifup[23458]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:45 np0005538515.novalocal ifup[23459]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.8040] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23465 uid=0 result="success"
Nov 28 07:24:45 np0005538515.novalocal ovs-vsctl[23468]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 28 07:24:45 np0005538515.novalocal kernel: device vlan21 entered promiscuous mode
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.8403] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Nov 28 07:24:45 np0005538515.novalocal systemd-udevd[23470]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.8596] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23479 uid=0 result="success"
Nov 28 07:24:45 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314685.8745] device (vlan21): carrier: link connected
Nov 28 07:24:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314688.9223] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23508 uid=0 result="success"
Nov 28 07:24:48 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314688.9752] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23523 uid=0 result="success"
Nov 28 07:24:49 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314689.0409] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23544 uid=0 result="success"
Nov 28 07:24:49 np0005538515.novalocal ifup[23545]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:49 np0005538515.novalocal ifup[23546]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:49 np0005538515.novalocal ifup[23547]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:49 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314689.1074] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23553 uid=0 result="success"
Nov 28 07:24:49 np0005538515.novalocal ovs-vsctl[23556]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 28 07:24:49 np0005538515.novalocal kernel: device vlan23 entered promiscuous mode
Nov 28 07:24:49 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314689.1521] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Nov 28 07:24:49 np0005538515.novalocal systemd-udevd[23558]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:49 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314689.1806] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23568 uid=0 result="success"
Nov 28 07:24:49 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314689.2009] device (vlan23): carrier: link connected
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.2546] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23598 uid=0 result="success"
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.3031] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23613 uid=0 result="success"
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.3650] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23634 uid=0 result="success"
Nov 28 07:24:52 np0005538515.novalocal ifup[23635]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:52 np0005538515.novalocal ifup[23636]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:52 np0005538515.novalocal ifup[23637]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.3960] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23643 uid=0 result="success"
Nov 28 07:24:52 np0005538515.novalocal ovs-vsctl[23646]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 28 07:24:52 np0005538515.novalocal kernel: device vlan20 entered promiscuous mode
Nov 28 07:24:52 np0005538515.novalocal systemd-udevd[23648]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.4366] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.4635] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23658 uid=0 result="success"
Nov 28 07:24:52 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314692.4839] device (vlan20): carrier: link connected
Nov 28 07:24:53 np0005538515.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.5335] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23688 uid=0 result="success"
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.5785] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23703 uid=0 result="success"
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.6403] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23724 uid=0 result="success"
Nov 28 07:24:55 np0005538515.novalocal ifup[23725]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:55 np0005538515.novalocal ifup[23726]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:55 np0005538515.novalocal ifup[23727]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.6734] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23733 uid=0 result="success"
Nov 28 07:24:55 np0005538515.novalocal ovs-vsctl[23736]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 28 07:24:55 np0005538515.novalocal kernel: device vlan22 entered promiscuous mode
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.7113] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Nov 28 07:24:55 np0005538515.novalocal systemd-udevd[23738]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.7379] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23748 uid=0 result="success"
Nov 28 07:24:55 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314695.7595] device (vlan22): carrier: link connected
Nov 28 07:24:58 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314698.8129] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23778 uid=0 result="success"
Nov 28 07:24:58 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314698.8589] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23793 uid=0 result="success"
Nov 28 07:24:58 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314698.9208] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23814 uid=0 result="success"
Nov 28 07:24:58 np0005538515.novalocal ifup[23815]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:24:58 np0005538515.novalocal ifup[23816]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:24:58 np0005538515.novalocal ifup[23817]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:24:58 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314698.9532] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23823 uid=0 result="success"
Nov 28 07:24:58 np0005538515.novalocal ovs-vsctl[23826]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 28 07:24:58 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314698.9939] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Nov 28 07:24:58 np0005538515.novalocal kernel: device vlan44 entered promiscuous mode
Nov 28 07:24:58 np0005538515.novalocal systemd-udevd[23828]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 07:24:59 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314699.0188] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23838 uid=0 result="success"
Nov 28 07:24:59 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314699.0405] device (vlan44): carrier: link connected
Nov 28 07:25:02 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314702.1039] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23868 uid=0 result="success"
Nov 28 07:25:02 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314702.1529] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23883 uid=0 result="success"
Nov 28 07:25:02 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314702.2130] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23904 uid=0 result="success"
Nov 28 07:25:02 np0005538515.novalocal ifup[23905]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:25:02 np0005538515.novalocal ifup[23906]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:25:02 np0005538515.novalocal ifup[23907]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:25:02 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314702.2438] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23913 uid=0 result="success"
Nov 28 07:25:02 np0005538515.novalocal ovs-vsctl[23916]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 28 07:25:02 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314702.3081] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23923 uid=0 result="success"
Nov 28 07:25:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314703.3786] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23950 uid=0 result="success"
Nov 28 07:25:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314703.4318] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23965 uid=0 result="success"
Nov 28 07:25:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314703.4989] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23986 uid=0 result="success"
Nov 28 07:25:03 np0005538515.novalocal ifup[23987]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:25:03 np0005538515.novalocal ifup[23988]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:25:03 np0005538515.novalocal ifup[23989]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:25:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314703.5333] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23995 uid=0 result="success"
Nov 28 07:25:03 np0005538515.novalocal ovs-vsctl[23998]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 28 07:25:03 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314703.5977] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=24005 uid=0 result="success"
Nov 28 07:25:04 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314704.6626] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=24033 uid=0 result="success"
Nov 28 07:25:04 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314704.7117] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=24048 uid=0 result="success"
Nov 28 07:25:04 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314704.7676] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=24069 uid=0 result="success"
Nov 28 07:25:04 np0005538515.novalocal ifup[24070]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:25:04 np0005538515.novalocal ifup[24071]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:25:04 np0005538515.novalocal ifup[24072]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:25:04 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314704.7949] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=24078 uid=0 result="success"
Nov 28 07:25:04 np0005538515.novalocal ovs-vsctl[24081]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 28 07:25:04 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314704.8525] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=24088 uid=0 result="success"
Nov 28 07:25:05 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314705.9132] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=24116 uid=0 result="success"
Nov 28 07:25:05 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314705.9580] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=24131 uid=0 result="success"
Nov 28 07:25:06 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314706.0185] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=24152 uid=0 result="success"
Nov 28 07:25:06 np0005538515.novalocal ifup[24153]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:25:06 np0005538515.novalocal ifup[24154]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:25:06 np0005538515.novalocal ifup[24155]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:25:06 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314706.0495] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=24161 uid=0 result="success"
Nov 28 07:25:06 np0005538515.novalocal ovs-vsctl[24164]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 28 07:25:06 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314706.1083] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=24171 uid=0 result="success"
Nov 28 07:25:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314707.1695] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=24199 uid=0 result="success"
Nov 28 07:25:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314707.2210] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=24214 uid=0 result="success"
Nov 28 07:25:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314707.2859] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=24235 uid=0 result="success"
Nov 28 07:25:07 np0005538515.novalocal ifup[24236]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 07:25:07 np0005538515.novalocal ifup[24237]: 'network-scripts' will be removed from distribution in near future.
Nov 28 07:25:07 np0005538515.novalocal ifup[24238]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 07:25:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314707.3194] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=24244 uid=0 result="success"
Nov 28 07:25:07 np0005538515.novalocal ovs-vsctl[24247]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 28 07:25:07 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314707.3789] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=24254 uid=0 result="success"
Nov 28 07:25:08 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314708.4441] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=24282 uid=0 result="success"
Nov 28 07:25:08 np0005538515.novalocal NetworkManager[5965]: <info>  [1764314708.4917] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=24297 uid=0 result="success"
Nov 28 07:25:08 np0005538515.novalocal sudo[23074]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:34 np0005538515.novalocal python3[24329]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000001b-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:25:38 np0005538515.novalocal python3[24348]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:38 np0005538515.novalocal sudo[24362]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxjmcudwruchokfzajyirjfwpwcfmnux ; /usr/bin/python3
Nov 28 07:25:38 np0005538515.novalocal sudo[24362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:25:38 np0005538515.novalocal python3[24364]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:38 np0005538515.novalocal sudo[24362]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:40 np0005538515.novalocal python3[24378]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:40 np0005538515.novalocal sudo[24392]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjbmwnbbvtidwpvzwkgulvmrhfhdrrem ; /usr/bin/python3
Nov 28 07:25:40 np0005538515.novalocal sudo[24392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:25:40 np0005538515.novalocal python3[24394]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 07:25:40 np0005538515.novalocal sudo[24392]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:41 np0005538515.novalocal python3[24408]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Nov 28 07:25:42 np0005538515.novalocal python3[24423]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005538515.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000022-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:25:42 np0005538515.novalocal sudo[24441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxcgshhiktwjjusfhtivgonhaenhtyiz ; /usr/bin/python3
Nov 28 07:25:42 np0005538515.novalocal sudo[24441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:25:43 np0005538515.novalocal python3[24443]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:25:43 np0005538515.novalocal systemd[1]: Starting Hostname Service...
Nov 28 07:25:43 np0005538515.novalocal systemd[1]: Started Hostname Service.
Nov 28 07:25:43 np0005538515.localdomain systemd-hostnamed[24447]: Hostname set to <np0005538515.localdomain> (static)
Nov 28 07:25:43 np0005538515.localdomain NetworkManager[5965]: <info>  [1764314743.1007] hostname: static hostname changed from "np0005538515.novalocal" to "np0005538515.localdomain"
Nov 28 07:25:43 np0005538515.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 07:25:43 np0005538515.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 07:25:43 np0005538515.localdomain sudo[24441]: pam_unix(sudo:session): session closed for user root
Nov 28 07:25:44 np0005538515.localdomain sshd[19646]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:25:44 np0005538515.localdomain systemd-logind[763]: Session 10 logged out. Waiting for processes to exit.
Nov 28 07:25:44 np0005538515.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Nov 28 07:25:44 np0005538515.localdomain systemd[1]: session-10.scope: Consumed 1min 43.779s CPU time.
Nov 28 07:25:44 np0005538515.localdomain systemd-logind[763]: Removed session 10.
Nov 28 07:25:47 np0005538515.localdomain sshd[24458]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:25:47 np0005538515.localdomain sshd[24458]: Accepted publickey for zuul from 38.102.83.114 port 33134 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:25:47 np0005538515.localdomain systemd-logind[763]: New session 11 of user zuul.
Nov 28 07:25:47 np0005538515.localdomain systemd[1]: Started Session 11 of User zuul.
Nov 28 07:25:47 np0005538515.localdomain sshd[24458]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:25:47 np0005538515.localdomain python3[24475]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 28 07:25:49 np0005538515.localdomain sshd[24458]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:25:49 np0005538515.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Nov 28 07:25:49 np0005538515.localdomain systemd-logind[763]: Session 11 logged out. Waiting for processes to exit.
Nov 28 07:25:49 np0005538515.localdomain systemd-logind[763]: Removed session 11.
Nov 28 07:25:53 np0005538515.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 07:26:01 np0005538515.localdomain anacron[6683]: Job `cron.weekly' started
Nov 28 07:26:01 np0005538515.localdomain anacron[6683]: Job `cron.weekly' terminated
Nov 28 07:26:13 np0005538515.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 07:26:38 np0005538515.localdomain sshd[24480]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:26:39 np0005538515.localdomain sshd[24480]: Accepted publickey for zuul from 38.102.83.114 port 56372 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:26:39 np0005538515.localdomain systemd-logind[763]: New session 12 of user zuul.
Nov 28 07:26:39 np0005538515.localdomain systemd[1]: Started Session 12 of User zuul.
Nov 28 07:26:39 np0005538515.localdomain sshd[24480]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:26:39 np0005538515.localdomain sudo[24497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hllcfodmirhbxnqpbknzcrkiiyybatpk ; /usr/bin/python3
Nov 28 07:26:39 np0005538515.localdomain sudo[24497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:26:39 np0005538515.localdomain python3[24499]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:26:42 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:26:42 np0005538515.localdomain systemd-sysv-generator[24541]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:42 np0005538515.localdomain systemd-rc-local-generator[24537]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:26:43 np0005538515.localdomain systemd-rc-local-generator[24579]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:43 np0005538515.localdomain systemd-sysv-generator[24585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:26:43 np0005538515.localdomain systemd-sysv-generator[24624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:43 np0005538515.localdomain systemd-rc-local-generator[24620]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:26:43 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:26:44 np0005538515.localdomain systemd-sysv-generator[24685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:26:44 np0005538515.localdomain systemd-rc-local-generator[24679]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: run-r6b732527809749d4b4619d874cb61790.service: Deactivated successfully.
Nov 28 07:26:44 np0005538515.localdomain systemd[1]: run-r25d053d67fa2446ea87d51c0ee8e2bf0.service: Deactivated successfully.
Nov 28 07:26:45 np0005538515.localdomain sudo[24497]: pam_unix(sudo:session): session closed for user root
Nov 28 07:27:45 np0005538515.localdomain sshd[24483]: Received disconnect from 38.102.83.114 port 56372:11: disconnected by user
Nov 28 07:27:45 np0005538515.localdomain sshd[24483]: Disconnected from user zuul 38.102.83.114 port 56372
Nov 28 07:27:45 np0005538515.localdomain sshd[24480]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:27:45 np0005538515.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Nov 28 07:27:45 np0005538515.localdomain systemd[1]: session-12.scope: Consumed 4.561s CPU time.
Nov 28 07:27:45 np0005538515.localdomain systemd-logind[763]: Session 12 logged out. Waiting for processes to exit.
Nov 28 07:27:45 np0005538515.localdomain systemd-logind[763]: Removed session 12.
Nov 28 07:39:24 np0005538515.localdomain sshd[25274]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:24 np0005538515.localdomain sshd[25274]: error: kex_exchange_identification: banner line contains invalid characters
Nov 28 07:39:24 np0005538515.localdomain sshd[25274]: banner exchange: Connection from 152.32.131.245 port 46534: invalid format
Nov 28 07:39:25 np0005538515.localdomain sshd[25275]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:43 np0005538515.localdomain sshd[25275]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 07:39:43 np0005538515.localdomain sshd[25275]: Connection closed by 152.32.131.245 port 46540
Nov 28 07:39:43 np0005538515.localdomain sshd[25276]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:44 np0005538515.localdomain sshd[25276]: fatal: mm_answer_sign: sign: error in libcrypto
Nov 28 07:39:44 np0005538515.localdomain sshd[25278]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:39:44 np0005538515.localdomain sshd[25278]: error: Protocol major versions differ: 2 vs. 1
Nov 28 07:39:44 np0005538515.localdomain sshd[25278]: banner exchange: Connection from 152.32.131.245 port 55400: could not read protocol version
Nov 28 07:43:42 np0005538515.localdomain sshd[25281]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:43:42 np0005538515.localdomain sshd[25281]: Accepted publickey for zuul from 192.168.122.100 port 54654 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:43:42 np0005538515.localdomain systemd-logind[763]: New session 13 of user zuul.
Nov 28 07:43:42 np0005538515.localdomain systemd[1]: Started Session 13 of User zuul.
Nov 28 07:43:42 np0005538515.localdomain sshd[25281]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:43:43 np0005538515.localdomain sudo[25327]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzkaedqnouenjtlzqbibubguwccgnqew ; /usr/bin/python3
Nov 28 07:43:43 np0005538515.localdomain sudo[25327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:43 np0005538515.localdomain python3[25329]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:43:44 np0005538515.localdomain sudo[25327]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:45 np0005538515.localdomain sudo[25414]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eziqxmjnajagsiqtqjpwrrjovwxfrpar ; /usr/bin/python3
Nov 28 07:43:45 np0005538515.localdomain sudo[25414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:45 np0005538515.localdomain python3[25416]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:43:47 np0005538515.localdomain sudo[25414]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:48 np0005538515.localdomain sudo[25431]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzrjgazksyneehyncshsdryabujfqhde ; /usr/bin/python3
Nov 28 07:43:48 np0005538515.localdomain sudo[25431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:48 np0005538515.localdomain python3[25433]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:43:48 np0005538515.localdomain sudo[25431]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:48 np0005538515.localdomain sudo[25447]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cahdammlptqrynngjwpqmbmbxlkthpvi ; /usr/bin/python3
Nov 28 07:43:48 np0005538515.localdomain sudo[25447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:49 np0005538515.localdomain python3[25449]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:49 np0005538515.localdomain kernel: loop: module loaded
Nov 28 07:43:49 np0005538515.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Nov 28 07:43:49 np0005538515.localdomain sudo[25447]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:49 np0005538515.localdomain sudo[25472]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyqppjuzolqgqtpwczjnuvxudjawahvi ; /usr/bin/python3
Nov 28 07:43:49 np0005538515.localdomain sudo[25472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:49 np0005538515.localdomain python3[25474]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:49 np0005538515.localdomain lvm[25477]: PV /dev/loop3 not used.
Nov 28 07:43:49 np0005538515.localdomain lvm[25479]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:43:49 np0005538515.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 28 07:43:49 np0005538515.localdomain lvm[25482]:   1 logical volume(s) in volume group "ceph_vg0" now active
Nov 28 07:43:49 np0005538515.localdomain lvm[25490]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:43:49 np0005538515.localdomain lvm[25490]: VG ceph_vg0 finished
Nov 28 07:43:49 np0005538515.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 28 07:43:49 np0005538515.localdomain sudo[25472]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:50 np0005538515.localdomain sudo[25537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qewdqqfmeslqxhljtghngzsvmqpnrutl ; /usr/bin/python3
Nov 28 07:43:50 np0005538515.localdomain sudo[25537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:50 np0005538515.localdomain python3[25539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:43:50 np0005538515.localdomain sudo[25537]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:50 np0005538515.localdomain sudo[25580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlnxvcapazuhmponpqdwgjurkcmcaqli ; /usr/bin/python3
Nov 28 07:43:50 np0005538515.localdomain sudo[25580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:50 np0005538515.localdomain python3[25582]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315830.059363-54710-33954886908959/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:43:50 np0005538515.localdomain sudo[25580]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:51 np0005538515.localdomain sudo[25610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gibbverhsmbgayoxjfptnjbzlgjlcvcc ; /usr/bin/python3
Nov 28 07:43:51 np0005538515.localdomain sudo[25610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:51 np0005538515.localdomain python3[25612]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:43:51 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:43:51 np0005538515.localdomain systemd-rc-local-generator[25637]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:43:51 np0005538515.localdomain systemd-sysv-generator[25640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:43:51 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:43:51 np0005538515.localdomain systemd[1]: Starting Ceph OSD losetup...
Nov 28 07:43:51 np0005538515.localdomain bash[25652]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Nov 28 07:43:51 np0005538515.localdomain systemd[1]: Finished Ceph OSD losetup.
Nov 28 07:43:52 np0005538515.localdomain lvm[25653]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:43:52 np0005538515.localdomain lvm[25653]: VG ceph_vg0 finished
Nov 28 07:43:52 np0005538515.localdomain sudo[25610]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:52 np0005538515.localdomain sudo[25667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrzwcnvkizpuecretofgslkyufvibrac ; /usr/bin/python3
Nov 28 07:43:52 np0005538515.localdomain sudo[25667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:52 np0005538515.localdomain python3[25669]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:43:55 np0005538515.localdomain sudo[25667]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:55 np0005538515.localdomain sudo[25684]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wciwzqkmvswqqdhihfruyrzoqjeyahbb ; /usr/bin/python3
Nov 28 07:43:55 np0005538515.localdomain sudo[25684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:55 np0005538515.localdomain python3[25686]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:43:55 np0005538515.localdomain sudo[25684]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:55 np0005538515.localdomain sudo[25700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjjtddyivfzotptavavnnfjtwqkpujps ; /usr/bin/python3
Nov 28 07:43:55 np0005538515.localdomain sudo[25700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:56 np0005538515.localdomain python3[25702]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:56 np0005538515.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Nov 28 07:43:56 np0005538515.localdomain sudo[25700]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:56 np0005538515.localdomain sudo[25722]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtocjsszsqrsonbzeykkatngylfiawar ; /usr/bin/python3
Nov 28 07:43:56 np0005538515.localdomain sudo[25722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:56 np0005538515.localdomain python3[25724]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:43:56 np0005538515.localdomain lvm[25727]: PV /dev/loop4 not used.
Nov 28 07:43:56 np0005538515.localdomain lvm[25729]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:43:56 np0005538515.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Nov 28 07:43:56 np0005538515.localdomain lvm[25736]:   1 logical volume(s) in volume group "ceph_vg1" now active
Nov 28 07:43:56 np0005538515.localdomain lvm[25740]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:43:56 np0005538515.localdomain lvm[25740]: VG ceph_vg1 finished
Nov 28 07:43:56 np0005538515.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Nov 28 07:43:57 np0005538515.localdomain sudo[25722]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:57 np0005538515.localdomain sudo[25786]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkivgfzqkzxxvhfbhexkmgsgcwszwyhn ; /usr/bin/python3
Nov 28 07:43:57 np0005538515.localdomain sudo[25786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:57 np0005538515.localdomain python3[25788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:43:57 np0005538515.localdomain sudo[25786]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:57 np0005538515.localdomain sudo[25829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnjntncyjhzueimrrgednbmarhyyavac ; /usr/bin/python3
Nov 28 07:43:57 np0005538515.localdomain sudo[25829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:57 np0005538515.localdomain python3[25831]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315837.2653916-54794-28444591396235/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:43:57 np0005538515.localdomain sudo[25829]: pam_unix(sudo:session): session closed for user root
Nov 28 07:43:58 np0005538515.localdomain sudo[25859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkfbxnzksvrlnbgtiyniwkbbnjkacgcx ; /usr/bin/python3
Nov 28 07:43:58 np0005538515.localdomain sudo[25859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:43:58 np0005538515.localdomain python3[25861]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:43:58 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:43:58 np0005538515.localdomain systemd-sysv-generator[25894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:43:58 np0005538515.localdomain systemd-rc-local-generator[25887]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:43:58 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:43:58 np0005538515.localdomain systemd[1]: Starting Ceph OSD losetup...
Nov 28 07:43:58 np0005538515.localdomain bash[25902]: /dev/loop4: [64516]:9169890 (/var/lib/ceph-osd-1.img)
Nov 28 07:43:58 np0005538515.localdomain systemd[1]: Finished Ceph OSD losetup.
Nov 28 07:43:59 np0005538515.localdomain lvm[25903]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:43:59 np0005538515.localdomain lvm[25903]: VG ceph_vg1 finished
Nov 28 07:43:59 np0005538515.localdomain sudo[25859]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:07 np0005538515.localdomain sudo[25946]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqubfknajxrhfquazomebafgpmsrtnnm ; /usr/bin/python3
Nov 28 07:44:07 np0005538515.localdomain sudo[25946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:07 np0005538515.localdomain python3[25948]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:44:07 np0005538515.localdomain sudo[25946]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:08 np0005538515.localdomain sudo[25966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgtswuzdtwfhtainkbbzgbcaatodkvwf ; /usr/bin/python3
Nov 28 07:44:08 np0005538515.localdomain sudo[25966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:08 np0005538515.localdomain python3[25968]: ansible-hostname Invoked with name=np0005538515.localdomain use=None
Nov 28 07:44:08 np0005538515.localdomain systemd[1]: Starting Hostname Service...
Nov 28 07:44:08 np0005538515.localdomain systemd[1]: Started Hostname Service.
Nov 28 07:44:10 np0005538515.localdomain sudo[25966]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:11 np0005538515.localdomain sudo[25989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhumnjdrcfiulnnvyoakfxkfqclrkceb ; /usr/bin/python3
Nov 28 07:44:11 np0005538515.localdomain sudo[25989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:11 np0005538515.localdomain python3[25991]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 28 07:44:11 np0005538515.localdomain sudo[25989]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:12 np0005538515.localdomain sudo[26037]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llpasxolbwvtwwhtdknqcgnhpspehihi ; /usr/bin/python3
Nov 28 07:44:12 np0005538515.localdomain sudo[26037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:12 np0005538515.localdomain python3[26039]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.vmihhaiutmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:12 np0005538515.localdomain sudo[26037]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:12 np0005538515.localdomain sudo[26067]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gerufwmenbhwnfwodquztmvyvyxkyqjg ; /usr/bin/python3
Nov 28 07:44:12 np0005538515.localdomain sudo[26067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:13 np0005538515.localdomain python3[26069]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.vmihhaiutmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:13 np0005538515.localdomain sudo[26067]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:13 np0005538515.localdomain sudo[26083]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydclwknzjpfuixscqjevyymjhbzkyvth ; /usr/bin/python3
Nov 28 07:44:13 np0005538515.localdomain sudo[26083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:13 np0005538515.localdomain python3[26085]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.vmihhaiutmphosts insertbefore=BOF block=192.168.122.106 np0005538513.localdomain np0005538513
                                                         192.168.122.106 np0005538513.ctlplane.localdomain np0005538513.ctlplane
                                                         192.168.122.107 np0005538514.localdomain np0005538514
                                                         192.168.122.107 np0005538514.ctlplane.localdomain np0005538514.ctlplane
                                                         192.168.122.108 np0005538515.localdomain np0005538515
                                                         192.168.122.108 np0005538515.ctlplane.localdomain np0005538515.ctlplane
                                                         192.168.122.103 np0005538510.localdomain np0005538510
                                                         192.168.122.103 np0005538510.ctlplane.localdomain np0005538510.ctlplane
                                                         192.168.122.104 np0005538511.localdomain np0005538511
                                                         192.168.122.104 np0005538511.ctlplane.localdomain np0005538511.ctlplane
                                                         192.168.122.105 np0005538512.localdomain np0005538512
                                                         192.168.122.105 np0005538512.ctlplane.localdomain np0005538512.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:13 np0005538515.localdomain sudo[26083]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:13 np0005538515.localdomain sudo[26099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pskgyltrhwqhfmwujvibxuzmsmzjlpmj ; /usr/bin/python3
Nov 28 07:44:13 np0005538515.localdomain sudo[26099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:14 np0005538515.localdomain python3[26101]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.vmihhaiutmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:14 np0005538515.localdomain sudo[26099]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:14 np0005538515.localdomain sudo[26116]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drfqjdvwuunxbbxrhsvqdvhbjabngqob ; /usr/bin/python3
Nov 28 07:44:14 np0005538515.localdomain sudo[26116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:14 np0005538515.localdomain python3[26118]: ansible-file Invoked with path=/tmp/ansible.vmihhaiutmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:14 np0005538515.localdomain sudo[26116]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:16 np0005538515.localdomain sudo[26132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqgcraclwjmwbzhoqaaxrjyckghgciyo ; /usr/bin/python3
Nov 28 07:44:16 np0005538515.localdomain sudo[26132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:16 np0005538515.localdomain python3[26134]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:16 np0005538515.localdomain sudo[26132]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:18 np0005538515.localdomain sudo[26150]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvdcabbwlqdfeqnnycdruydywsabqkhc ; /usr/bin/python3
Nov 28 07:44:18 np0005538515.localdomain sudo[26150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:18 np0005538515.localdomain python3[26152]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:44:20 np0005538515.localdomain sudo[26150]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:22 np0005538515.localdomain sudo[26199]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbfnlbwgvdhfpbspfvqazjspjlnofvbw ; /usr/bin/python3
Nov 28 07:44:22 np0005538515.localdomain sudo[26199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:22 np0005538515.localdomain python3[26201]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:44:22 np0005538515.localdomain sudo[26199]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:22 np0005538515.localdomain sudo[26244]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjmuuzvanvavjmmrljzpcxbtzftaqolw ; /usr/bin/python3
Nov 28 07:44:22 np0005538515.localdomain sudo[26244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:22 np0005538515.localdomain python3[26246]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315861.9310894-55738-203382693845929/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:22 np0005538515.localdomain sudo[26244]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:23 np0005538515.localdomain sudo[26274]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvoyfejnorppqzzitbkawfplewzhrhfk ; /usr/bin/python3
Nov 28 07:44:23 np0005538515.localdomain sudo[26274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:24 np0005538515.localdomain python3[26276]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:44:24 np0005538515.localdomain sudo[26274]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:24 np0005538515.localdomain sudo[26292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omedfzzzaiadwzujclkhkkorknsbztub ; /usr/bin/python3
Nov 28 07:44:24 np0005538515.localdomain sudo[26292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:24 np0005538515.localdomain python3[26294]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:44:24 np0005538515.localdomain systemd[1]: Stopping NTP client/server...
Nov 28 07:44:24 np0005538515.localdomain chronyd[765]: chronyd exiting
Nov 28 07:44:24 np0005538515.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 07:44:24 np0005538515.localdomain systemd[1]: Stopped NTP client/server.
Nov 28 07:44:24 np0005538515.localdomain systemd[1]: chronyd.service: Consumed 101ms CPU time, read 1.9M from disk, written 4.0K to disk.
Nov 28 07:44:24 np0005538515.localdomain systemd[1]: Starting NTP client/server...
Nov 28 07:44:24 np0005538515.localdomain chronyd[26301]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 07:44:24 np0005538515.localdomain chronyd[26301]: Frequency -30.402 +/- 0.257 ppm read from /var/lib/chrony/drift
Nov 28 07:44:24 np0005538515.localdomain chronyd[26301]: Loaded seccomp filter (level 2)
Nov 28 07:44:24 np0005538515.localdomain systemd[1]: Started NTP client/server.
Nov 28 07:44:24 np0005538515.localdomain sudo[26292]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:26 np0005538515.localdomain sudo[26348]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdkcminttpuzaxmnfvpelzqfavbpstoi ; /usr/bin/python3
Nov 28 07:44:26 np0005538515.localdomain sudo[26348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:26 np0005538515.localdomain python3[26350]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:44:26 np0005538515.localdomain sudo[26348]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:26 np0005538515.localdomain sudo[26391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaxvhsfoybnotcvvbaptysbqykrnxitb ; /usr/bin/python3
Nov 28 07:44:26 np0005538515.localdomain sudo[26391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:27 np0005538515.localdomain python3[26393]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315866.3296187-55886-87452685903154/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:44:27 np0005538515.localdomain sudo[26391]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:27 np0005538515.localdomain sudo[26421]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjyeziwinlmdemvgzyrkkabcriibuhma ; /usr/bin/python3
Nov 28 07:44:27 np0005538515.localdomain sudo[26421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:27 np0005538515.localdomain python3[26423]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:44:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:44:27 np0005538515.localdomain systemd-rc-local-generator[26445]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:44:27 np0005538515.localdomain systemd-sysv-generator[26451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:44:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:44:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:44:27 np0005538515.localdomain systemd-rc-local-generator[26488]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:44:27 np0005538515.localdomain systemd-sysv-generator[26492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:44:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:44:28 np0005538515.localdomain systemd[1]: Starting chronyd online sources service...
Nov 28 07:44:28 np0005538515.localdomain chronyc[26499]: 200 OK
Nov 28 07:44:28 np0005538515.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Nov 28 07:44:28 np0005538515.localdomain systemd[1]: Finished chronyd online sources service.
Nov 28 07:44:28 np0005538515.localdomain sudo[26421]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:28 np0005538515.localdomain sudo[26514]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqimfeljnevlwhrmbvczflsnrqivshsc ; /usr/bin/python3
Nov 28 07:44:28 np0005538515.localdomain sudo[26514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:28 np0005538515.localdomain python3[26516]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:28 np0005538515.localdomain chronyd[26301]: System clock was stepped by 0.000000 seconds
Nov 28 07:44:28 np0005538515.localdomain sudo[26514]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:29 np0005538515.localdomain sudo[26531]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbalupfsdklqdpiomxphjqftuyzmauhw ; /usr/bin/python3
Nov 28 07:44:29 np0005538515.localdomain sudo[26531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:29 np0005538515.localdomain chronyd[26301]: Selected source 23.133.168.247 (pool.ntp.org)
Nov 28 07:44:29 np0005538515.localdomain python3[26533]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:44:29 np0005538515.localdomain sudo[26531]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:39 np0005538515.localdomain sudo[26548]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znjmnbvwtzlgwymxuuynlpjssftpwdmw ; /usr/bin/python3
Nov 28 07:44:39 np0005538515.localdomain sudo[26548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:39 np0005538515.localdomain python3[26550]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 28 07:44:39 np0005538515.localdomain systemd[1]: Starting Time & Date Service...
Nov 28 07:44:39 np0005538515.localdomain systemd[1]: Started Time & Date Service.
Nov 28 07:44:39 np0005538515.localdomain sudo[26548]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:39 np0005538515.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 07:44:41 np0005538515.localdomain sudo[26570]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obuvdykwjxzpguoinsdxrnttbqawsxjb ; /usr/bin/python3
Nov 28 07:44:41 np0005538515.localdomain sudo[26570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:41 np0005538515.localdomain python3[26572]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:44:41 np0005538515.localdomain chronyd[26301]: chronyd exiting
Nov 28 07:44:41 np0005538515.localdomain systemd[1]: Stopping NTP client/server...
Nov 28 07:44:41 np0005538515.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 07:44:41 np0005538515.localdomain systemd[1]: Stopped NTP client/server.
Nov 28 07:44:41 np0005538515.localdomain systemd[1]: Starting NTP client/server...
Nov 28 07:44:41 np0005538515.localdomain chronyd[26579]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 07:44:41 np0005538515.localdomain chronyd[26579]: Frequency -30.402 +/- 0.257 ppm read from /var/lib/chrony/drift
Nov 28 07:44:41 np0005538515.localdomain chronyd[26579]: Loaded seccomp filter (level 2)
Nov 28 07:44:41 np0005538515.localdomain systemd[1]: Started NTP client/server.
Nov 28 07:44:41 np0005538515.localdomain sudo[26570]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:46 np0005538515.localdomain chronyd[26579]: Selected source 174.138.193.90 (pool.ntp.org)
Nov 28 07:44:57 np0005538515.localdomain sudo[26594]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfwtjlsvgdqkbqjvmhpzmpralegivvql ; /usr/bin/python3
Nov 28 07:44:57 np0005538515.localdomain sudo[26594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:58 np0005538515.localdomain useradd[26598]: new group: name=ceph-admin, GID=1002
Nov 28 07:44:58 np0005538515.localdomain useradd[26598]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Nov 28 07:44:58 np0005538515.localdomain sudo[26594]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:58 np0005538515.localdomain sudo[26650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leeojkgfwzevguzsywqncxsztrxniuvy ; /usr/bin/python3
Nov 28 07:44:58 np0005538515.localdomain sudo[26650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:58 np0005538515.localdomain sudo[26650]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:58 np0005538515.localdomain sudo[26693]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-merexayvlltvcjgdranpopijfhrqtvgb ; /usr/bin/python3
Nov 28 07:44:58 np0005538515.localdomain sudo[26693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:59 np0005538515.localdomain sudo[26693]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:59 np0005538515.localdomain sudo[26723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epdczdytvfewmddcwkmzevicwznlsogo ; /usr/bin/python3
Nov 28 07:44:59 np0005538515.localdomain sudo[26723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:44:59 np0005538515.localdomain sudo[26723]: pam_unix(sudo:session): session closed for user root
Nov 28 07:44:59 np0005538515.localdomain sudo[26739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lylwtgojjvtsbliurvovgtxycrzimadv ; /usr/bin/python3
Nov 28 07:44:59 np0005538515.localdomain sudo[26739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:45:00 np0005538515.localdomain sudo[26739]: pam_unix(sudo:session): session closed for user root
Nov 28 07:45:00 np0005538515.localdomain sudo[26755]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mysgipwxugmnuhnphgmtijpqjfkxmhjz ; /usr/bin/python3
Nov 28 07:45:00 np0005538515.localdomain sudo[26755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:45:00 np0005538515.localdomain sudo[26755]: pam_unix(sudo:session): session closed for user root
Nov 28 07:45:00 np0005538515.localdomain sudo[26771]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbasgoojlsnfwpmovtefadwgkqmrblef ; /usr/bin/python3
Nov 28 07:45:00 np0005538515.localdomain sudo[26771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:45:01 np0005538515.localdomain sudo[26771]: pam_unix(sudo:session): session closed for user root
Nov 28 07:45:09 np0005538515.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 07:45:56 np0005538515.localdomain sshd[26776]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:45:56 np0005538515.localdomain sshd[26776]: error: kex_exchange_identification: client sent invalid protocol identifier "GET / HTTP/1.1"
Nov 28 07:45:56 np0005538515.localdomain sshd[26776]: banner exchange: Connection from 64.62.156.192 port 8608: invalid format
Nov 28 07:46:01 np0005538515.localdomain anacron[6683]: Job `cron.monthly' started
Nov 28 07:46:01 np0005538515.localdomain anacron[6683]: Job `cron.monthly' terminated
Nov 28 07:46:01 np0005538515.localdomain anacron[6683]: Normal exit (3 jobs run)
Nov 28 07:46:50 np0005538515.localdomain sshd[26779]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:50 np0005538515.localdomain sshd[26779]: Accepted publickey for ceph-admin from 192.168.122.103 port 35104 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 1002.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 28 07:46:50 np0005538515.localdomain systemd-logind[763]: New session 14 of user ceph-admin.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Starting User Manager for UID 1002...
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Queued start job for default target Main User Target.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Created slice User Application Slice.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Reached target Paths.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Reached target Timers.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Starting D-Bus User Message Bus Socket...
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Starting Create User's Volatile Files and Directories...
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Finished Create User's Volatile Files and Directories.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Listening on D-Bus User Message Bus Socket.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Reached target Sockets.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Reached target Basic System.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Reached target Main User Target.
Nov 28 07:46:50 np0005538515.localdomain systemd[26783]: Startup finished in 93ms.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Started User Manager for UID 1002.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Nov 28 07:46:50 np0005538515.localdomain sshd[26798]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:50 np0005538515.localdomain sshd[26779]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:50 np0005538515.localdomain sshd[26798]: Accepted publickey for ceph-admin from 192.168.122.103 port 35118 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:50 np0005538515.localdomain systemd-logind[763]: New session 16 of user ceph-admin.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Nov 28 07:46:50 np0005538515.localdomain sshd[26798]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:50 np0005538515.localdomain sudo[26803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:46:50 np0005538515.localdomain sudo[26803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:50 np0005538515.localdomain sudo[26803]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:50 np0005538515.localdomain sshd[26818]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:50 np0005538515.localdomain sshd[26818]: Accepted publickey for ceph-admin from 192.168.122.103 port 35126 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:50 np0005538515.localdomain systemd-logind[763]: New session 17 of user ceph-admin.
Nov 28 07:46:50 np0005538515.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Nov 28 07:46:50 np0005538515.localdomain sshd[26818]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:50 np0005538515.localdomain sudo[26822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005538515.localdomain
Nov 28 07:46:50 np0005538515.localdomain sudo[26822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:50 np0005538515.localdomain sudo[26822]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:51 np0005538515.localdomain sshd[26837]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:51 np0005538515.localdomain sshd[26837]: Accepted publickey for ceph-admin from 192.168.122.103 port 35128 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:51 np0005538515.localdomain systemd-logind[763]: New session 18 of user ceph-admin.
Nov 28 07:46:51 np0005538515.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Nov 28 07:46:51 np0005538515.localdomain sshd[26837]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:51 np0005538515.localdomain sudo[26841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Nov 28 07:46:51 np0005538515.localdomain sudo[26841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:51 np0005538515.localdomain sudo[26841]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:51 np0005538515.localdomain sshd[26856]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:51 np0005538515.localdomain sshd[26856]: Accepted publickey for ceph-admin from 192.168.122.103 port 35134 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:51 np0005538515.localdomain systemd-logind[763]: New session 19 of user ceph-admin.
Nov 28 07:46:51 np0005538515.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Nov 28 07:46:51 np0005538515.localdomain sshd[26856]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:51 np0005538515.localdomain sudo[26860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:46:51 np0005538515.localdomain sudo[26860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:51 np0005538515.localdomain sudo[26860]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:51 np0005538515.localdomain sshd[26875]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:51 np0005538515.localdomain sshd[26875]: Accepted publickey for ceph-admin from 192.168.122.103 port 35142 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:52 np0005538515.localdomain systemd-logind[763]: New session 20 of user ceph-admin.
Nov 28 07:46:52 np0005538515.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Nov 28 07:46:52 np0005538515.localdomain sshd[26875]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:52 np0005538515.localdomain sudo[26879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:46:52 np0005538515.localdomain sudo[26879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:52 np0005538515.localdomain sudo[26879]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:52 np0005538515.localdomain sshd[26894]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:52 np0005538515.localdomain sshd[26894]: Accepted publickey for ceph-admin from 192.168.122.103 port 35152 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:52 np0005538515.localdomain systemd-logind[763]: New session 21 of user ceph-admin.
Nov 28 07:46:52 np0005538515.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Nov 28 07:46:52 np0005538515.localdomain sshd[26894]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:52 np0005538515.localdomain sudo[26898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Nov 28 07:46:52 np0005538515.localdomain sudo[26898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:52 np0005538515.localdomain sudo[26898]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:52 np0005538515.localdomain sshd[26913]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:52 np0005538515.localdomain sshd[26913]: Accepted publickey for ceph-admin from 192.168.122.103 port 35154 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:52 np0005538515.localdomain systemd-logind[763]: New session 22 of user ceph-admin.
Nov 28 07:46:52 np0005538515.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Nov 28 07:46:52 np0005538515.localdomain sshd[26913]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:52 np0005538515.localdomain sudo[26917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:46:52 np0005538515.localdomain sudo[26917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:52 np0005538515.localdomain sudo[26917]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:53 np0005538515.localdomain sshd[26932]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:53 np0005538515.localdomain sshd[26932]: Accepted publickey for ceph-admin from 192.168.122.103 port 35170 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:53 np0005538515.localdomain systemd-logind[763]: New session 23 of user ceph-admin.
Nov 28 07:46:53 np0005538515.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Nov 28 07:46:53 np0005538515.localdomain sshd[26932]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:53 np0005538515.localdomain sudo[26936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Nov 28 07:46:53 np0005538515.localdomain sudo[26936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:53 np0005538515.localdomain sudo[26936]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:53 np0005538515.localdomain sshd[26951]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:53 np0005538515.localdomain sshd[26951]: Accepted publickey for ceph-admin from 192.168.122.103 port 35178 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:53 np0005538515.localdomain systemd-logind[763]: New session 24 of user ceph-admin.
Nov 28 07:46:53 np0005538515.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Nov 28 07:46:53 np0005538515.localdomain sshd[26951]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:54 np0005538515.localdomain sshd[26968]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:54 np0005538515.localdomain sshd[26968]: Accepted publickey for ceph-admin from 192.168.122.103 port 52188 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:54 np0005538515.localdomain systemd-logind[763]: New session 25 of user ceph-admin.
Nov 28 07:46:54 np0005538515.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Nov 28 07:46:54 np0005538515.localdomain sshd[26968]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:54 np0005538515.localdomain sudo[26972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Nov 28 07:46:54 np0005538515.localdomain sudo[26972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:54 np0005538515.localdomain sudo[26972]: pam_unix(sudo:session): session closed for user root
Nov 28 07:46:54 np0005538515.localdomain sshd[26987]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:46:54 np0005538515.localdomain sshd[26987]: Accepted publickey for ceph-admin from 192.168.122.103 port 52194 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 07:46:54 np0005538515.localdomain systemd-logind[763]: New session 26 of user ceph-admin.
Nov 28 07:46:54 np0005538515.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Nov 28 07:46:54 np0005538515.localdomain sshd[26987]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 07:46:54 np0005538515.localdomain sudo[26991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005538515.localdomain
Nov 28 07:46:54 np0005538515.localdomain sudo[26991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:46:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:46:55 np0005538515.localdomain sudo[26991]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:09 np0005538515.localdomain sudo[27028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:47:09 np0005538515.localdomain sudo[27028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:09 np0005538515.localdomain sudo[27028]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:09 np0005538515.localdomain sudo[27043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:09 np0005538515.localdomain sudo[27043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:09 np0005538515.localdomain sudo[27043]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:10 np0005538515.localdomain sudo[27058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 07:47:10 np0005538515.localdomain sudo[27058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:10 np0005538515.localdomain sudo[27058]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:10 np0005538515.localdomain sudo[27093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:10 np0005538515.localdomain sudo[27093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:10 np0005538515.localdomain sudo[27093]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:10 np0005538515.localdomain sudo[27108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:47:10 np0005538515.localdomain sudo[27108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:11 np0005538515.localdomain sudo[27108]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:11 np0005538515.localdomain sudo[27161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:11 np0005538515.localdomain sudo[27161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:11 np0005538515.localdomain sudo[27161]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:11 np0005538515.localdomain sudo[27176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:47:11 np0005538515.localdomain sudo[27176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:11 np0005538515.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 27204 (sysctl)
Nov 28 07:47:11 np0005538515.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 28 07:47:11 np0005538515.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 28 07:47:11 np0005538515.localdomain sudo[27176]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538515.localdomain sudo[27226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:12 np0005538515.localdomain sudo[27226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538515.localdomain sudo[27226]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538515.localdomain sudo[27241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 07:47:12 np0005538515.localdomain sudo[27241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:12 np0005538515.localdomain sudo[27241]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538515.localdomain sudo[27274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:12 np0005538515.localdomain sudo[27274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538515.localdomain sudo[27274]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:12 np0005538515.localdomain sudo[27289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 07:47:12 np0005538515.localdomain sudo[27289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:16 np0005538515.localdomain kernel: VFS: idmapped mount is not enabled.
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 2025-11-28 07:47:39.163887306 +0000 UTC m=+26.099083256 container create 7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_clarke, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 2025-11-28 07:47:13.105405246 +0000 UTC m=+0.040601256 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: Created slice Slice /machine.
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: Started libpod-conmon-7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668.scope.
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 2025-11-28 07:47:39.293334426 +0000 UTC m=+26.228530406 container init 7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=553, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.33.12, GIT_BRANCH=main)
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 2025-11-28 07:47:39.309001477 +0000 UTC m=+26.244197457 container start 7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_clarke, io.buildah.version=1.33.12, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 2025-11-28 07:47:39.309253794 +0000 UTC m=+26.244449774 container attach 7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_clarke, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main)
Nov 28 07:47:39 np0005538515.localdomain silly_clarke[27737]: 167 167
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: libpod-7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668.scope: Deactivated successfully.
Nov 28 07:47:39 np0005538515.localdomain podman[27341]: 2025-11-28 07:47:39.313845826 +0000 UTC m=+26.249041816 container died 7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_clarke, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-type=git, release=553, version=7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph)
Nov 28 07:47:39 np0005538515.localdomain podman[27742]: 2025-11-28 07:47:39.412748609 +0000 UTC m=+0.084262976 container remove 7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_clarke, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, release=553, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: libpod-conmon-7b64133b18b07e5ae02e0e6d104e5f51e0ceb1064c9e6f16237231234b65b668.scope: Deactivated successfully.
Nov 28 07:47:39 np0005538515.localdomain podman[27764]: 
Nov 28 07:47:39 np0005538515.localdomain podman[27764]: 2025-11-28 07:47:39.652199033 +0000 UTC m=+0.077880379 container create dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_edison, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, release=553, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: Started libpod-conmon-dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5.scope.
Nov 28 07:47:39 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:47:39 np0005538515.localdomain podman[27764]: 2025-11-28 07:47:39.621327536 +0000 UTC m=+0.047008902 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:47:39 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51fef301fe477176060efa587fed16744e0ab56033ce957d65a5da993a654c51/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:47:39 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51fef301fe477176060efa587fed16744e0ab56033ce957d65a5da993a654c51/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:47:39 np0005538515.localdomain podman[27764]: 2025-11-28 07:47:39.754370527 +0000 UTC m=+0.180051843 container init dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_edison, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, name=rhceph, ceph=True, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Nov 28 07:47:39 np0005538515.localdomain podman[27764]: 2025-11-28 07:47:39.767916443 +0000 UTC m=+0.193597789 container start dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_edison, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:47:39 np0005538515.localdomain podman[27764]: 2025-11-28 07:47:39.768321135 +0000 UTC m=+0.194002491 container attach dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_edison, RELEASE=main, architecture=x86_64, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Nov 28 07:47:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f530b521bad0f3c0079403ca7d8e5533435e50290bf26755d51c2e2db81ad6a3-merged.mount: Deactivated successfully.
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]: [
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:     {
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "available": false,
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "ceph_device": false,
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "lsm_data": {},
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "lvs": [],
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "path": "/dev/sr0",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "rejected_reasons": [
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "Has a FileSystem",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "Insufficient space (<5GB)"
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         ],
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         "sys_api": {
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "actuators": null,
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "device_nodes": "sr0",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "human_readable_size": "482.00 KB",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "id_bus": "ata",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "model": "QEMU DVD-ROM",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "nr_requests": "2",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "partitions": {},
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "path": "/dev/sr0",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "removable": "1",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "rev": "2.5+",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "ro": "0",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "rotational": "1",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "sas_address": "",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "sas_device_handle": "",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "scheduler_mode": "mq-deadline",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "sectors": 0,
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "sectorsize": "2048",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "size": 493568.0,
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "support_discard": "0",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "type": "disk",
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:             "vendor": "QEMU"
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:         }
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]:     }
Nov 28 07:47:40 np0005538515.localdomain nervous_edison[27779]: ]
Nov 28 07:47:40 np0005538515.localdomain systemd[1]: libpod-dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5.scope: Deactivated successfully.
Nov 28 07:47:40 np0005538515.localdomain podman[27764]: 2025-11-28 07:47:40.641713434 +0000 UTC m=+1.067394750 container died dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_edison, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, distribution-scope=public, build-date=2025-09-24T08:57:55, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:47:40 np0005538515.localdomain systemd[1]: tmp-crun.t6f1SJ.mount: Deactivated successfully.
Nov 28 07:47:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-51fef301fe477176060efa587fed16744e0ab56033ce957d65a5da993a654c51-merged.mount: Deactivated successfully.
Nov 28 07:47:40 np0005538515.localdomain podman[29165]: 2025-11-28 07:47:40.740433021 +0000 UTC m=+0.087492294 container remove dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_edison, vcs-type=git, ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, release=553, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:47:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:40 np0005538515.localdomain systemd[1]: libpod-conmon-dd21b17ad1a756543f87576b7de2d56398d9a3db17b107af621117ad0d0875e5.scope: Deactivated successfully.
Nov 28 07:47:40 np0005538515.localdomain sudo[27289]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:40 np0005538515.localdomain sudo[29177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:47:40 np0005538515.localdomain sudo[29177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:40 np0005538515.localdomain sudo[29177]: pam_unix(sudo:session): session closed for user root
Nov 28 07:47:40 np0005538515.localdomain sudo[29192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 --coredump-max-size=32G
Nov 28 07:47:40 np0005538515.localdomain sudo[29192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: Closed Process Core Dump Socket.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: Stopping Process Core Dump Socket...
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: Listening on Process Core Dump Socket.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:47:41 np0005538515.localdomain systemd-sysv-generator[29251]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:47:41 np0005538515.localdomain systemd-rc-local-generator[29247]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:47:41 np0005538515.localdomain systemd-sysv-generator[29291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:47:41 np0005538515.localdomain systemd-rc-local-generator[29287]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:47:41 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:47:41 np0005538515.localdomain sudo[29192]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:09 np0005538515.localdomain sudo[29315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:09 np0005538515.localdomain sudo[29315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:09 np0005538515.localdomain sudo[29315]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:09 np0005538515.localdomain sudo[29330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:48:09 np0005538515.localdomain sudo[29330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 2025-11-28 07:48:10.473086117 +0000 UTC m=+0.079644212 container create 9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_elbakyan, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, name=rhceph, architecture=x86_64, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main)
Nov 28 07:48:10 np0005538515.localdomain systemd[1]: Started libpod-conmon-9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372.scope.
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 2025-11-28 07:48:10.441960235 +0000 UTC m=+0.048518360 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:10 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 2025-11-28 07:48:10.561759596 +0000 UTC m=+0.168317691 container init 9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_elbakyan, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_CLEAN=True, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 2025-11-28 07:48:10.57205674 +0000 UTC m=+0.178614835 container start 9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_elbakyan, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 2025-11-28 07:48:10.572445437 +0000 UTC m=+0.179003582 container attach 9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_elbakyan, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:10 np0005538515.localdomain keen_elbakyan[29402]: 167 167
Nov 28 07:48:10 np0005538515.localdomain systemd[1]: libpod-9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372.scope: Deactivated successfully.
Nov 28 07:48:10 np0005538515.localdomain podman[29389]: 2025-11-28 07:48:10.578571447 +0000 UTC m=+0.185129542 container died 9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_elbakyan, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, release=553, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:48:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5d53415e905b49cffeb8993230a20691ddd8960a0df1ff6da5cf9753165f609c-merged.mount: Deactivated successfully.
Nov 28 07:48:11 np0005538515.localdomain podman[29407]: 2025-11-28 07:48:11.833994498 +0000 UTC m=+1.243548659 container remove 9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_elbakyan, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, ceph=True, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=)
Nov 28 07:48:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:11 np0005538515.localdomain systemd[1]: libpod-conmon-9d56a61ce61c3bc4880e2ce0743d47f131b229cb331de2af35d8700265884372.scope: Deactivated successfully.
Nov 28 07:48:11 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:12 np0005538515.localdomain systemd-rc-local-generator[29449]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:12 np0005538515.localdomain systemd-sysv-generator[29452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:12 np0005538515.localdomain systemd-sysv-generator[29487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:12 np0005538515.localdomain systemd-rc-local-generator[29481]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: Reached target All Ceph clusters and services.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:12 np0005538515.localdomain systemd-sysv-generator[29522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:12 np0005538515.localdomain systemd-rc-local-generator[29518]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: Reached target Ceph cluster 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:12 np0005538515.localdomain systemd-sysv-generator[29565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:12 np0005538515.localdomain systemd-rc-local-generator[29559]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:12 np0005538515.localdomain systemd-rc-local-generator[29604]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:13 np0005538515.localdomain systemd-sysv-generator[29608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: Created slice Slice /system/ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: Reached target System Time Set.
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: Reached target System Time Synchronized.
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: Starting Ceph crash.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 07:48:13 np0005538515.localdomain podman[29665]: 
Nov 28 07:48:13 np0005538515.localdomain podman[29665]: 2025-11-28 07:48:13.495529981 +0000 UTC m=+0.076723123 container create 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 07:48:13 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3ce2bad17e13efb10b638a9b81cc8ffa6a43ae6257dd0423ab5beb4a935d7f3/merged/etc/ceph/ceph.client.crash.np0005538515.keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:13 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3ce2bad17e13efb10b638a9b81cc8ffa6a43ae6257dd0423ab5beb4a935d7f3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:13 np0005538515.localdomain podman[29665]: 2025-11-28 07:48:13.465623773 +0000 UTC m=+0.046816945 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:13 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3ce2bad17e13efb10b638a9b81cc8ffa6a43ae6257dd0423ab5beb4a935d7f3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:13 np0005538515.localdomain podman[29665]: 2025-11-28 07:48:13.579310314 +0000 UTC m=+0.160503486 container init 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, version=7)
Nov 28 07:48:13 np0005538515.localdomain podman[29665]: 2025-11-28 07:48:13.589920162 +0000 UTC m=+0.171113324 container start 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.buildah.version=1.33.12, GIT_BRANCH=main, release=553, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:13 np0005538515.localdomain bash[29665]: 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d
Nov 28 07:48:13 np0005538515.localdomain systemd[1]: Started Ceph crash.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:13 np0005538515.localdomain sudo[29330]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.762+0000 7f11df21e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.762+0000 7f11df21e640 -1 AuthRegistry(0x7f11d80680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.763+0000 7f11df21e640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.763+0000 7f11df21e640 -1 AuthRegistry(0x7f11df21d000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.774+0000 7f11dcf93640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.776+0000 7f11dd794640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.777+0000 7f11d7fff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: 2025-11-28T07:48:13.777+0000 7f11df21e640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 28 07:48:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515[29679]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 28 07:48:16 np0005538515.localdomain sudo[29696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:16 np0005538515.localdomain sudo[29696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:16 np0005538515.localdomain sudo[29696]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:16 np0005538515.localdomain sudo[29711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Nov 28 07:48:16 np0005538515.localdomain sudo[29711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 2025-11-28 07:48:17.408389404 +0000 UTC m=+0.081165419 container create c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_allen, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph)
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: Started libpod-conmon-c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657.scope.
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 2025-11-28 07:48:17.371793061 +0000 UTC m=+0.044569066 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 2025-11-28 07:48:17.480059394 +0000 UTC m=+0.152835339 container init c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_allen, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 2025-11-28 07:48:17.49177677 +0000 UTC m=+0.164552735 container start c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_allen, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., distribution-scope=public, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-09-24T08:57:55)
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 2025-11-28 07:48:17.492042842 +0000 UTC m=+0.164818807 container attach c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_allen, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 28 07:48:17 np0005538515.localdomain keen_allen[29780]: 167 167
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: libpod-c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657.scope: Deactivated successfully.
Nov 28 07:48:17 np0005538515.localdomain podman[29765]: 2025-11-28 07:48:17.497403508 +0000 UTC m=+0.170179513 container died c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_allen, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2780e6983e60978dd49c412a9a7135caf1c4d430e106a13fd88da61daa6257f0-merged.mount: Deactivated successfully.
Nov 28 07:48:17 np0005538515.localdomain podman[29785]: 2025-11-28 07:48:17.584222995 +0000 UTC m=+0.073705581 container remove c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_allen, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: libpod-conmon-c6b228b50906280576b074d98e9317e4d670f297b58a2386edf39d3386190657.scope: Deactivated successfully.
Nov 28 07:48:17 np0005538515.localdomain podman[29804]: 
Nov 28 07:48:17 np0005538515.localdomain podman[29804]: 2025-11-28 07:48:17.790720137 +0000 UTC m=+0.072787850 container create 7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_turing, architecture=x86_64, ceph=True, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=)
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: Started libpod-conmon-7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4.scope.
Nov 28 07:48:17 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:17 np0005538515.localdomain podman[29804]: 2025-11-28 07:48:17.762705642 +0000 UTC m=+0.044773365 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:17 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455d9d65b846604f148f74071d448755b10b434220828e4798aec310d34864f7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:17 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455d9d65b846604f148f74071d448755b10b434220828e4798aec310d34864f7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:17 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455d9d65b846604f148f74071d448755b10b434220828e4798aec310d34864f7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:17 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455d9d65b846604f148f74071d448755b10b434220828e4798aec310d34864f7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:17 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/455d9d65b846604f148f74071d448755b10b434220828e4798aec310d34864f7/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:17 np0005538515.localdomain podman[29804]: 2025-11-28 07:48:17.921783395 +0000 UTC m=+0.203851108 container init 7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_turing, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=)
Nov 28 07:48:17 np0005538515.localdomain podman[29804]: 2025-11-28 07:48:17.93233036 +0000 UTC m=+0.214398083 container start 7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_turing, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True)
Nov 28 07:48:17 np0005538515.localdomain podman[29804]: 2025-11-28 07:48:17.93256391 +0000 UTC m=+0.214631623 container attach 7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_turing, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:18 np0005538515.localdomain cranky_turing[29819]: --> passed data devices: 0 physical, 2 LVM
Nov 28 07:48:18 np0005538515.localdomain cranky_turing[29819]: --> relative data size: 1.0
Nov 28 07:48:18 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:18 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new b9cdd064-f06d-4e2b-b6e3-0368d5f01fb9
Nov 28 07:48:19 np0005538515.localdomain lvm[29873]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 07:48:19 np0005538515.localdomain lvm[29873]: VG ceph_vg0 finished
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]:  stderr: got monmap epoch 3
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: --> Creating keyring file for osd.1
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Nov 28 07:48:19 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid b9cdd064-f06d-4e2b-b6e3-0368d5f01fb9 --setuser ceph --setgroup ceph
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]:  stderr: 2025-11-28T07:48:19.678+0000 7fc2cdc8ba80 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]:  stderr: 2025-11-28T07:48:19.679+0000 7fc2cdc8ba80 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: --> ceph-volume lvm activate successful for osd ID: 1
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f4f9cdb9-a7e9-468b-968c-003e9ca341ca
Nov 28 07:48:22 np0005538515.localdomain lvm[30810]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 07:48:22 np0005538515.localdomain lvm[30810]: VG ceph_vg1 finished
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-4
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:22 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-4/activate.monmap
Nov 28 07:48:23 np0005538515.localdomain cranky_turing[29819]:  stderr: got monmap epoch 3
Nov 28 07:48:23 np0005538515.localdomain cranky_turing[29819]: --> Creating keyring file for osd.4
Nov 28 07:48:23 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/keyring
Nov 28 07:48:23 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/
Nov 28 07:48:23 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 4 --monmap /var/lib/ceph/osd/ceph-4/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-4/ --osd-uuid f4f9cdb9-a7e9-468b-968c-003e9ca341ca --setuser ceph --setgroup ceph
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]:  stderr: 2025-11-28T07:48:23.539+0000 7f86f3786a80 -1 bluestore(/var/lib/ceph/osd/ceph-4//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]:  stderr: 2025-11-28T07:48:23.539+0000 7f86f3786a80 -1 bluestore(/var/lib/ceph/osd/ceph-4/) _read_fsid unparsable uuid
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-4 --no-mon-config
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: --> ceph-volume lvm activate successful for osd ID: 4
Nov 28 07:48:26 np0005538515.localdomain cranky_turing[29819]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Nov 28 07:48:26 np0005538515.localdomain systemd[1]: libpod-7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4.scope: Deactivated successfully.
Nov 28 07:48:26 np0005538515.localdomain systemd[1]: libpod-7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4.scope: Consumed 3.855s CPU time.
Nov 28 07:48:26 np0005538515.localdomain podman[31714]: 2025-11-28 07:48:26.274300235 +0000 UTC m=+0.062849072 container died 7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_turing, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7)
Nov 28 07:48:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-455d9d65b846604f148f74071d448755b10b434220828e4798aec310d34864f7-merged.mount: Deactivated successfully.
Nov 28 07:48:26 np0005538515.localdomain podman[31714]: 2025-11-28 07:48:26.316255725 +0000 UTC m=+0.104804532 container remove 7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_turing, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Nov 28 07:48:26 np0005538515.localdomain systemd[1]: libpod-conmon-7bdc0e75f65c183b634305c5c46ce63fea2326490d73604fba351170fe28fcd4.scope: Deactivated successfully.
Nov 28 07:48:26 np0005538515.localdomain sudo[29711]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:26 np0005538515.localdomain sudo[31730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:26 np0005538515.localdomain sudo[31730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:26 np0005538515.localdomain sudo[31730]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:26 np0005538515.localdomain sudo[31745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- lvm list --format json
Nov 28 07:48:26 np0005538515.localdomain sudo[31745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 2025-11-28 07:48:27.112327896 +0000 UTC m=+0.094865293 container create 50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_ride, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 2025-11-28 07:48:27.046837639 +0000 UTC m=+0.029375066 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: Started libpod-conmon-50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe.scope.
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 2025-11-28 07:48:27.190076344 +0000 UTC m=+0.172613751 container init 50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_ride, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 2025-11-28 07:48:27.199843824 +0000 UTC m=+0.182381231 container start 50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_ride, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, io.openshift.expose-services=, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 2025-11-28 07:48:27.200069854 +0000 UTC m=+0.182607251 container attach 50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_ride, CEPH_POINT_RELEASE=, release=553, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Nov 28 07:48:27 np0005538515.localdomain pedantic_ride[31814]: 167 167
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: libpod-50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe.scope: Deactivated successfully.
Nov 28 07:48:27 np0005538515.localdomain podman[31799]: 2025-11-28 07:48:27.205252462 +0000 UTC m=+0.187789869 container died 50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_ride, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, version=7, io.openshift.expose-services=)
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f9bba7074b839a20091fc69eda4f3c450e3b267591c00dcc10cb073b32735acd-merged.mount: Deactivated successfully.
Nov 28 07:48:27 np0005538515.localdomain podman[31819]: 2025-11-28 07:48:27.305351225 +0000 UTC m=+0.087780151 container remove 50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_ride, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, release=553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph)
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: libpod-conmon-50e39805a9f4a6cf794a341dcc923da550d5b3684e3ca526749397e693cf14fe.scope: Deactivated successfully.
Nov 28 07:48:27 np0005538515.localdomain podman[31841]: 
Nov 28 07:48:27 np0005538515.localdomain podman[31841]: 2025-11-28 07:48:27.53710167 +0000 UTC m=+0.078988032 container create 2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_jepsen, vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: Started libpod-conmon-2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b.scope.
Nov 28 07:48:27 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:27 np0005538515.localdomain podman[31841]: 2025-11-28 07:48:27.505807791 +0000 UTC m=+0.047694123 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:27 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71513df3195fd258b0cc3f7b6fb67ebd5c345c99d01ff0b2c312ca316cbe5712/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:27 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71513df3195fd258b0cc3f7b6fb67ebd5c345c99d01ff0b2c312ca316cbe5712/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:27 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71513df3195fd258b0cc3f7b6fb67ebd5c345c99d01ff0b2c312ca316cbe5712/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:27 np0005538515.localdomain podman[31841]: 2025-11-28 07:48:27.644737336 +0000 UTC m=+0.186623628 container init 2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_jepsen, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True)
Nov 28 07:48:27 np0005538515.localdomain podman[31841]: 2025-11-28 07:48:27.65595209 +0000 UTC m=+0.197838432 container start 2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_jepsen, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, version=7, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container)
Nov 28 07:48:27 np0005538515.localdomain podman[31841]: 2025-11-28 07:48:27.656254403 +0000 UTC m=+0.198140705 container attach 2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_jepsen, io.buildah.version=1.33.12, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main)
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]: {
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:     "1": [
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:         {
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "devices": [
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "/dev/loop3"
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             ],
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_name": "ceph_lv0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_size": "7511998464",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=3749M0-boz5-Yk5R-nRdT-Yezs-CSPm-tBY4Hg,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2c5417c9-00eb-57d5-a565-ddecbc7995c1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=b9cdd064-f06d-4e2b-b6e3-0368d5f01fb9,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_uuid": "3749M0-boz5-Yk5R-nRdT-Yezs-CSPm-tBY4Hg",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "name": "ceph_lv0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "path": "/dev/ceph_vg0/ceph_lv0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "tags": {
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.block_uuid": "3749M0-boz5-Yk5R-nRdT-Yezs-CSPm-tBY4Hg",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.cephx_lockbox_secret": "",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.cluster_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.cluster_name": "ceph",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.crush_device_class": "",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.encrypted": "0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.osd_fsid": "b9cdd064-f06d-4e2b-b6e3-0368d5f01fb9",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.osd_id": "1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.type": "block",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.vdo": "0"
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             },
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "type": "block",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "vg_name": "ceph_vg0"
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:         }
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:     ],
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:     "4": [
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:         {
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "devices": [
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "/dev/loop4"
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             ],
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_name": "ceph_lv1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_size": "7511998464",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=bsBjq9-gtPd-H3kW-i3Gz-qY9o-NJ4N-eJqMew,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2c5417c9-00eb-57d5-a565-ddecbc7995c1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f4f9cdb9-a7e9-468b-968c-003e9ca341ca,ceph.osd_id=4,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "lv_uuid": "bsBjq9-gtPd-H3kW-i3Gz-qY9o-NJ4N-eJqMew",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "name": "ceph_lv1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "path": "/dev/ceph_vg1/ceph_lv1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "tags": {
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.block_uuid": "bsBjq9-gtPd-H3kW-i3Gz-qY9o-NJ4N-eJqMew",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.cephx_lockbox_secret": "",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.cluster_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.cluster_name": "ceph",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.crush_device_class": "",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.encrypted": "0",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.osd_fsid": "f4f9cdb9-a7e9-468b-968c-003e9ca341ca",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.osd_id": "4",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.osdspec_affinity": "default_drive_group",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.type": "block",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:                 "ceph.vdo": "0"
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             },
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "type": "block",
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:             "vg_name": "ceph_vg1"
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:         }
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]:     ]
Nov 28 07:48:27 np0005538515.localdomain recursing_jepsen[31856]: }
Nov 28 07:48:28 np0005538515.localdomain podman[31841]: 2025-11-28 07:48:28.032674846 +0000 UTC m=+0.574561178 container died 2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_jepsen, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:48:28 np0005538515.localdomain systemd[1]: libpod-2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b.scope: Deactivated successfully.
Nov 28 07:48:28 np0005538515.localdomain podman[31865]: 2025-11-28 07:48:28.129037094 +0000 UTC m=+0.083867548 container remove 2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_jepsen, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.openshift.expose-services=, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:28 np0005538515.localdomain systemd[1]: libpod-conmon-2d4fc26160d6c9818d2904060878812ca68e37be5114c21d4032ec109785cc3b.scope: Deactivated successfully.
Nov 28 07:48:28 np0005538515.localdomain sudo[31745]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:28 np0005538515.localdomain sudo[31880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:28 np0005538515.localdomain sudo[31880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:28 np0005538515.localdomain sudo[31880]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-71513df3195fd258b0cc3f7b6fb67ebd5c345c99d01ff0b2c312ca316cbe5712-merged.mount: Deactivated successfully.
Nov 28 07:48:28 np0005538515.localdomain sudo[31895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:48:28 np0005538515.localdomain sudo[31895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:28 np0005538515.localdomain podman[31951]: 
Nov 28 07:48:28 np0005538515.localdomain podman[31951]: 2025-11-28 07:48:28.95264703 +0000 UTC m=+0.070178314 container create cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_burnell, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:28 np0005538515.localdomain systemd[1]: Started libpod-conmon-cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2.scope.
Nov 28 07:48:28 np0005538515.localdomain systemd[1]: tmp-crun.L2DiVa.mount: Deactivated successfully.
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:29 np0005538515.localdomain podman[31951]: 2025-11-28 07:48:28.921728427 +0000 UTC m=+0.039259711 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:29 np0005538515.localdomain podman[31951]: 2025-11-28 07:48:29.033780457 +0000 UTC m=+0.151311751 container init cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_burnell, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, release=553, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Nov 28 07:48:29 np0005538515.localdomain intelligent_burnell[31966]: 167 167
Nov 28 07:48:29 np0005538515.localdomain podman[31951]: 2025-11-28 07:48:29.046376961 +0000 UTC m=+0.163908245 container start cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_burnell, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Nov 28 07:48:29 np0005538515.localdomain podman[31951]: 2025-11-28 07:48:29.049140313 +0000 UTC m=+0.166671637 container attach cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_burnell, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7)
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: libpod-cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2.scope: Deactivated successfully.
Nov 28 07:48:29 np0005538515.localdomain podman[31951]: 2025-11-28 07:48:29.05450831 +0000 UTC m=+0.172039674 container died cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_burnell, distribution-scope=public, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.33.12, name=rhceph)
Nov 28 07:48:29 np0005538515.localdomain podman[31971]: 2025-11-28 07:48:29.137192555 +0000 UTC m=+0.073293812 container remove cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_burnell, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: libpod-conmon-cd84d20f0f178406a752f72ee48881f9871fba58ac6896853be57396ee7944b2.scope: Deactivated successfully.
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3fbcda35e0867fad2a4d13583fa5a7f0f3690a680d1b01bacceab37549aa5cf0-merged.mount: Deactivated successfully.
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 2025-11-28 07:48:29.474878981 +0000 UTC m=+0.070462977 container create 65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=553, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: Started libpod-conmon-65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3.scope.
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 2025-11-28 07:48:29.448263028 +0000 UTC m=+0.043847034 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:29 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c5a49cef492c3a27374e6497a362c48df357f95ca21b4005bed80a217740eb9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:29 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c5a49cef492c3a27374e6497a362c48df357f95ca21b4005bed80a217740eb9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:29 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c5a49cef492c3a27374e6497a362c48df357f95ca21b4005bed80a217740eb9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:29 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c5a49cef492c3a27374e6497a362c48df357f95ca21b4005bed80a217740eb9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:29 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c5a49cef492c3a27374e6497a362c48df357f95ca21b4005bed80a217740eb9/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 2025-11-28 07:48:29.606973583 +0000 UTC m=+0.202557559 container init 65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, release=553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55)
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 2025-11-28 07:48:29.618087433 +0000 UTC m=+0.213671439 container start 65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=7, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True)
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 2025-11-28 07:48:29.618425368 +0000 UTC m=+0.214009334 container attach 65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test[32014]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 28 07:48:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test[32014]:                             [--no-systemd] [--no-tmpfs]
Nov 28 07:48:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test[32014]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: libpod-65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3.scope: Deactivated successfully.
Nov 28 07:48:29 np0005538515.localdomain podman[31998]: 2025-11-28 07:48:29.868514463 +0000 UTC m=+0.464098479 container died 65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True)
Nov 28 07:48:29 np0005538515.localdomain podman[32019]: 2025-11-28 07:48:29.966335605 +0000 UTC m=+0.084886643 container remove 65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate-test, ceph=True, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public)
Nov 28 07:48:29 np0005538515.localdomain systemd-journald[618]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 28 07:48:29 np0005538515.localdomain systemd-journald[618]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 07:48:29 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:48:29 np0005538515.localdomain systemd[1]: libpod-conmon-65c73b9721994000d168180c60470cdca21dbbcae5fa11640fd254f1724a2fc3.scope: Deactivated successfully.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:30 np0005538515.localdomain systemd-rc-local-generator[32076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:30 np0005538515.localdomain systemd-sysv-generator[32081]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9c5a49cef492c3a27374e6497a362c48df357f95ca21b4005bed80a217740eb9-merged.mount: Deactivated successfully.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: tmp-crun.GUfD5n.mount: Deactivated successfully.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:30 np0005538515.localdomain systemd-sysv-generator[32123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:30 np0005538515.localdomain systemd-rc-local-generator[32120]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:30 np0005538515.localdomain systemd[1]: Starting Ceph osd.1 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 2025-11-28 07:48:31.121004585 +0000 UTC m=+0.065763421 container create a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate, release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, ceph=True, architecture=x86_64)
Nov 28 07:48:31 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0680a283c51576e3d8382da9b68b17d4695f7c95f0855e7c1eaaee75775f0b2b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 2025-11-28 07:48:31.096804887 +0000 UTC m=+0.041563683 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0680a283c51576e3d8382da9b68b17d4695f7c95f0855e7c1eaaee75775f0b2b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0680a283c51576e3d8382da9b68b17d4695f7c95f0855e7c1eaaee75775f0b2b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0680a283c51576e3d8382da9b68b17d4695f7c95f0855e7c1eaaee75775f0b2b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0680a283c51576e3d8382da9b68b17d4695f7c95f0855e7c1eaaee75775f0b2b/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 2025-11-28 07:48:31.25224722 +0000 UTC m=+0.197006046 container init a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 2025-11-28 07:48:31.261717617 +0000 UTC m=+0.206476433 container start a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 2025-11-28 07:48:31.262098794 +0000 UTC m=+0.206857790 container attach a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, name=rhceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Nov 28 07:48:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate[32198]: --> ceph-volume raw activate successful for osd ID: 1
Nov 28 07:48:31 np0005538515.localdomain bash[32184]: --> ceph-volume raw activate successful for osd ID: 1
Nov 28 07:48:31 np0005538515.localdomain systemd[1]: libpod-a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7.scope: Deactivated successfully.
Nov 28 07:48:31 np0005538515.localdomain podman[32184]: 2025-11-28 07:48:31.992766383 +0000 UTC m=+0.937525269 container died a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True)
Nov 28 07:48:32 np0005538515.localdomain systemd[1]: tmp-crun.7iDOwr.mount: Deactivated successfully.
Nov 28 07:48:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0680a283c51576e3d8382da9b68b17d4695f7c95f0855e7c1eaaee75775f0b2b-merged.mount: Deactivated successfully.
Nov 28 07:48:32 np0005538515.localdomain podman[32312]: 2025-11-28 07:48:32.084239055 +0000 UTC m=+0.080689158 container remove a85a25c20a9bb47c4374b03b9bde22be8316131b9ebacee9600413c6598ff7b7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1-activate, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Nov 28 07:48:32 np0005538515.localdomain podman[32374]: 
Nov 28 07:48:32 np0005538515.localdomain podman[32374]: 2025-11-28 07:48:32.402222352 +0000 UTC m=+0.072113660 container create d6143f5c4d0edf8e88527410f707a68007ec2434660a8ed1dd820cec36e66cc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=)
Nov 28 07:48:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/259010bf69d409df7f88bdd32b0016120bbdb0cbdd301173dd40dd864993b89a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/259010bf69d409df7f88bdd32b0016120bbdb0cbdd301173dd40dd864993b89a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:32 np0005538515.localdomain podman[32374]: 2025-11-28 07:48:32.373550818 +0000 UTC m=+0.043442136 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/259010bf69d409df7f88bdd32b0016120bbdb0cbdd301173dd40dd864993b89a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/259010bf69d409df7f88bdd32b0016120bbdb0cbdd301173dd40dd864993b89a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/259010bf69d409df7f88bdd32b0016120bbdb0cbdd301173dd40dd864993b89a/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:32 np0005538515.localdomain podman[32374]: 2025-11-28 07:48:32.508356301 +0000 UTC m=+0.178247619 container init d6143f5c4d0edf8e88527410f707a68007ec2434660a8ed1dd820cec36e66cc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7)
Nov 28 07:48:32 np0005538515.localdomain podman[32374]: 2025-11-28 07:48:32.514907009 +0000 UTC m=+0.184798317 container start d6143f5c4d0edf8e88527410f707a68007ec2434660a8ed1dd820cec36e66cc3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph)
Nov 28 07:48:32 np0005538515.localdomain bash[32374]: d6143f5c4d0edf8e88527410f707a68007ec2434660a8ed1dd820cec36e66cc3
Nov 28 07:48:32 np0005538515.localdomain systemd[1]: Started Ceph osd.1 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: pidfile_write: ignore empty --pid-file
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a452e00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a452e00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a452e00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) close
Nov 28 07:48:32 np0005538515.localdomain sudo[31895]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:32 np0005538515.localdomain sudo[32406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:32 np0005538515.localdomain sudo[32406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:32 np0005538515.localdomain sudo[32406]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:32 np0005538515.localdomain sudo[32421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 07:48:32 np0005538515.localdomain sudo[32421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:32 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a452e00 /var/lib/ceph/osd/ceph-1/block) close
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: load: jerasure load: lrc 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) close
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) close
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 2025-11-28 07:48:33.397755987 +0000 UTC m=+0.075601383 container create fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_sammet, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, vcs-type=git)
Nov 28 07:48:33 np0005538515.localdomain systemd[1]: Started libpod-conmon-fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d.scope.
Nov 28 07:48:33 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 2025-11-28 07:48:33.36583898 +0000 UTC m=+0.043684396 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 2025-11-28 07:48:33.47177076 +0000 UTC m=+0.149616156 container init fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_sammet, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 2025-11-28 07:48:33.481147963 +0000 UTC m=+0.158993359 container start fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_sammet, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 2025-11-28 07:48:33.481420284 +0000 UTC m=+0.159265680 container attach fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_sammet, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, release=553, RELEASE=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:48:33 np0005538515.localdomain systemd[1]: libpod-fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d.scope: Deactivated successfully.
Nov 28 07:48:33 np0005538515.localdomain magical_sammet[32505]: 167 167
Nov 28 07:48:33 np0005538515.localdomain podman[32486]: 2025-11-28 07:48:33.485345988 +0000 UTC m=+0.163191384 container died fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_sammet, name=rhceph, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 07:48:33 np0005538515.localdomain podman[32510]: 2025-11-28 07:48:33.583426742 +0000 UTC m=+0.084921456 container remove fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_sammet, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, ceph=True)
Nov 28 07:48:33 np0005538515.localdomain systemd[1]: libpod-conmon-fce79ddbfb4ca7305887e3b30a10f282f004f484f956cd29730228c506a2678d.scope: Deactivated successfully.
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs mount
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs mount shared_bdev_used = 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Git sha 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: DB SUMMARY
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: DB Session ID:  QREFVIKU9CR5RBF8LMCG
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                     Options.env: 0x55ab8a6e7180
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                Options.info_log: 0x55ab8b3d6500
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.write_buffer_manager: 0x55ab8a43d4a0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Compression algorithms supported:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d66c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d68e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d68e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d68e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4ac62bd1-f8fe-41aa-94df-6dbfb775e8d7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113682013, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113682381, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: freelist init
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: freelist _read_cfg
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs umount
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) close
Nov 28 07:48:33 np0005538515.localdomain podman[32733]: 
Nov 28 07:48:33 np0005538515.localdomain podman[32733]: 2025-11-28 07:48:33.933846839 +0000 UTC m=+0.073386576 container create 6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bdev(0x55ab8a453500 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs mount
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluefs mount shared_bdev_used = 4718592
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Git sha 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: DB SUMMARY
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: DB Session ID:  QREFVIKU9CR5RBF8LMCH
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                     Options.env: 0x55ab8b524850
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                Options.info_log: 0x55ab8b44d040
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.write_buffer_manager: 0x55ab8a43d5e0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Compression algorithms supported:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b44c460)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b350
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d6c40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain systemd[1]: Started libpod-conmon-6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460.scope.
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d6c40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ab8b3d6c40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55ab8a42b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4ac62bd1-f8fe-41aa-94df-6dbfb775e8d7
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113970174, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113975468, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316113, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4ac62bd1-f8fe-41aa-94df-6dbfb775e8d7", "db_session_id": "QREFVIKU9CR5RBF8LMCH", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113980301, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316113, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4ac62bd1-f8fe-41aa-94df-6dbfb775e8d7", "db_session_id": "QREFVIKU9CR5RBF8LMCH", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113986618, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316113, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4ac62bd1-f8fe-41aa-94df-6dbfb775e8d7", "db_session_id": "QREFVIKU9CR5RBF8LMCH", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316113990625, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:33 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 28 07:48:33 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:34 np0005538515.localdomain podman[32733]: 2025-11-28 07:48:33.906861989 +0000 UTC m=+0.046401736 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de9401ce940ded6b98d412edb363f866bebab3d8561836e2537140b8a1e691e9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55ab8a452e00
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: DB pointer 0x55ab8a485a00
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 28 07:48:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de9401ce940ded6b98d412edb363f866bebab3d8561836e2537140b8a1e691e9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: _get_class not permitted to load lua
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: _get_class not permitted to load sdk
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: _get_class not permitted to load test_remote_reads
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 load_pgs
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 load_pgs opened 0 pgs
Nov 28 07:48:34 np0005538515.localdomain ceph-osd[32393]: osd.1 0 log_to_monitors true
Nov 28 07:48:34 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1[32389]: 2025-11-28T07:48:34.026+0000 7fc674863a80 -1 osd.1 0 log_to_monitors true
Nov 28 07:48:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de9401ce940ded6b98d412edb363f866bebab3d8561836e2537140b8a1e691e9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de9401ce940ded6b98d412edb363f866bebab3d8561836e2537140b8a1e691e9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de9401ce940ded6b98d412edb363f866bebab3d8561836e2537140b8a1e691e9/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7570757b21089630af044dff18f5069d9af04d63f9ed03a218b84dd47e0eefd5-merged.mount: Deactivated successfully.
Nov 28 07:48:34 np0005538515.localdomain podman[32733]: 2025-11-28 07:48:34.072573193 +0000 UTC m=+0.212112900 container init 6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhceph, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph)
Nov 28 07:48:34 np0005538515.localdomain podman[32733]: 2025-11-28 07:48:34.08019753 +0000 UTC m=+0.219737267 container start 6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph)
Nov 28 07:48:34 np0005538515.localdomain podman[32733]: 2025-11-28 07:48:34.080677601 +0000 UTC m=+0.220217398 container attach 6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=553, GIT_BRANCH=main)
Nov 28 07:48:34 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test[32930]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 28 07:48:34 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test[32930]:                             [--no-systemd] [--no-tmpfs]
Nov 28 07:48:34 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test[32930]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: libpod-6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460.scope: Deactivated successfully.
Nov 28 07:48:34 np0005538515.localdomain podman[32733]: 2025-11-28 07:48:34.297085931 +0000 UTC m=+0.436625708 container died 6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-de9401ce940ded6b98d412edb363f866bebab3d8561836e2537140b8a1e691e9-merged.mount: Deactivated successfully.
Nov 28 07:48:34 np0005538515.localdomain podman[32968]: 2025-11-28 07:48:34.395412675 +0000 UTC m=+0.085222298 container remove 6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate-test, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: libpod-conmon-6d4d7da991c4f1b78c12e591d1ce5dfbe35a960096156af77a75d9f2508bb460.scope: Deactivated successfully.
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:34 np0005538515.localdomain systemd-sysv-generator[33025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:34 np0005538515.localdomain systemd-rc-local-generator[33022]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:34 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 28 07:48:35 np0005538515.localdomain systemd-rc-local-generator[33060]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:48:35 np0005538515.localdomain systemd-sysv-generator[33066]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:48:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:48:35 np0005538515.localdomain systemd[1]: Starting Ceph osd.4 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0 done with init, starting boot process
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0 start_boot
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 28 07:48:35 np0005538515.localdomain ceph-osd[32393]: osd.1 0  bench count 12288000 bsize 4 KiB
Nov 28 07:48:35 np0005538515.localdomain podman[33126]: 
Nov 28 07:48:35 np0005538515.localdomain podman[33126]: 2025-11-28 07:48:35.629972046 +0000 UTC m=+0.107730550 container create 04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 07:48:35 np0005538515.localdomain podman[33126]: 2025-11-28 07:48:35.573740467 +0000 UTC m=+0.051498991 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:35 np0005538515.localdomain systemd[1]: tmp-crun.Zm04Qz.mount: Deactivated successfully.
Nov 28 07:48:35 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:35 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/868ed51b53f87dacb1240adb098301a374c5a6dad4454e524b8f4efa738f7998/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:35 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/868ed51b53f87dacb1240adb098301a374c5a6dad4454e524b8f4efa738f7998/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:35 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/868ed51b53f87dacb1240adb098301a374c5a6dad4454e524b8f4efa738f7998/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:35 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/868ed51b53f87dacb1240adb098301a374c5a6dad4454e524b8f4efa738f7998/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:35 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/868ed51b53f87dacb1240adb098301a374c5a6dad4454e524b8f4efa738f7998/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:35 np0005538515.localdomain podman[33126]: 2025-11-28 07:48:35.78749451 +0000 UTC m=+0.265253004 container init 04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12)
Nov 28 07:48:35 np0005538515.localdomain podman[33126]: 2025-11-28 07:48:35.797050211 +0000 UTC m=+0.274808715 container start 04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Nov 28 07:48:35 np0005538515.localdomain podman[33126]: 2025-11-28 07:48:35.797368465 +0000 UTC m=+0.275126999 container attach 04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 28 07:48:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate[33140]: --> ceph-volume raw activate successful for osd ID: 4
Nov 28 07:48:36 np0005538515.localdomain bash[33126]: --> ceph-volume raw activate successful for osd ID: 4
Nov 28 07:48:36 np0005538515.localdomain systemd[1]: libpod-04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de.scope: Deactivated successfully.
Nov 28 07:48:36 np0005538515.localdomain podman[33126]: 2025-11-28 07:48:36.525870978 +0000 UTC m=+1.003629502 container died 04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, distribution-scope=public)
Nov 28 07:48:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-868ed51b53f87dacb1240adb098301a374c5a6dad4454e524b8f4efa738f7998-merged.mount: Deactivated successfully.
Nov 28 07:48:36 np0005538515.localdomain podman[33257]: 2025-11-28 07:48:36.674662497 +0000 UTC m=+0.136925527 container remove 04f84c1e81dbc9dbf850783ae50fa0ecf301892495e75819575c778b15aa64de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4-activate, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, release=553, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, RELEASE=main)
Nov 28 07:48:36 np0005538515.localdomain podman[33315]: 
Nov 28 07:48:36 np0005538515.localdomain podman[33315]: 2025-11-28 07:48:36.994390621 +0000 UTC m=+0.071332245 container create 606597793ddb0558c53e18a6d88c01aa5ec597833ba8a2079a2c6fce1e6d2c82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55)
Nov 28 07:48:37 np0005538515.localdomain systemd[1]: tmp-crun.GIElyk.mount: Deactivated successfully.
Nov 28 07:48:37 np0005538515.localdomain podman[33315]: 2025-11-28 07:48:36.96851082 +0000 UTC m=+0.045452324 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02176f38795b040d5d85ab09fc76c0bbcbbfc9c3677feaa100178a06e93de861/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02176f38795b040d5d85ab09fc76c0bbcbbfc9c3677feaa100178a06e93de861/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02176f38795b040d5d85ab09fc76c0bbcbbfc9c3677feaa100178a06e93de861/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02176f38795b040d5d85ab09fc76c0bbcbbfc9c3677feaa100178a06e93de861/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02176f38795b040d5d85ab09fc76c0bbcbbfc9c3677feaa100178a06e93de861/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:37 np0005538515.localdomain podman[33315]: 2025-11-28 07:48:37.154940539 +0000 UTC m=+0.231882033 container init 606597793ddb0558c53e18a6d88c01aa5ec597833ba8a2079a2c6fce1e6d2c82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, distribution-scope=public)
Nov 28 07:48:37 np0005538515.localdomain podman[33315]: 2025-11-28 07:48:37.186806543 +0000 UTC m=+0.263748027 container start 606597793ddb0558c53e18a6d88c01aa5ec597833ba8a2079a2c6fce1e6d2c82 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, architecture=x86_64, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=553, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, build-date=2025-09-24T08:57:55)
Nov 28 07:48:37 np0005538515.localdomain bash[33315]: 606597793ddb0558c53e18a6d88c01aa5ec597833ba8a2079a2c6fce1e6d2c82
Nov 28 07:48:37 np0005538515.localdomain systemd[1]: Started Ceph osd.4 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: pidfile_write: ignore empty --pid-file
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) close
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) close
Nov 28 07:48:37 np0005538515.localdomain sudo[32421]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:37 np0005538515.localdomain sudo[33350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:37 np0005538515.localdomain sudo[33350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:37 np0005538515.localdomain sudo[33350]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:37 np0005538515.localdomain sudo[33365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- raw list --format json
Nov 28 07:48:37 np0005538515.localdomain sudo[33365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: starting osd.4 osd_data /var/lib/ceph/osd/ceph-4 /var/lib/ceph/osd/ceph-4/journal
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: load: jerasure load: lrc 
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) close
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:37 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) close
Nov 28 07:48:37 np0005538515.localdomain podman[33425]: 
Nov 28 07:48:37 np0005538515.localdomain podman[33425]: 2025-11-28 07:48:37.957068697 +0000 UTC m=+0.066193759 container create 49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_hoover, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: Started libpod-conmon-49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d.scope.
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:38 np0005538515.localdomain podman[33425]: 2025-11-28 07:48:37.92537963 +0000 UTC m=+0.034504722 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 28 07:48:38 np0005538515.localdomain podman[33425]: 2025-11-28 07:48:38.048485297 +0000 UTC m=+0.157610369 container init 49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_hoover, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, version=7, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a32e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs mount
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs mount shared_bdev_used = 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:38 np0005538515.localdomain romantic_hoover[33440]: 167 167
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Git sha 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DB SUMMARY
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DB Session ID:  X4M0XD0YLGEXMBVZCYFT
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                     Options.env: 0x562bf8cc6cb0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                Options.info_log: 0x562bf99ba380
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.write_buffer_manager: 0x562bf8a1c140
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Compression algorithms supported:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: libpod-49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d.scope: Deactivated successfully.
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba540)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain podman[33425]: 2025-11-28 07:48:38.085973919 +0000 UTC m=+0.195098981 container start 49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_hoover, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, GIT_CLEAN=True, name=rhceph, version=7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, ceph=True)
Nov 28 07:48:38 np0005538515.localdomain podman[33425]: 2025-11-28 07:48:38.087019295 +0000 UTC m=+0.196144407 container attach 49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_hoover, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, name=rhceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:38 np0005538515.localdomain podman[33425]: 2025-11-28 07:48:38.091552405 +0000 UTC m=+0.200677517 container died 49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_hoover, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, build-date=2025-09-24T08:57:55, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, RELEASE=main, vcs-type=git)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf99ba760)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0e9f3910-9ea8-45ae-a4b3-9f14ef476182
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118079629, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118079874, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old nid_max 1025
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old blobid_max 10240
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta min_alloc_size 0x1000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: freelist init
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: freelist _read_cfg
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs umount
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) close
Nov 28 07:48:38 np0005538515.localdomain podman[33504]: 2025-11-28 07:48:38.197391361 +0000 UTC m=+0.115764275 container remove 49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_hoover, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, ceph=True, name=rhceph, release=553, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: libpod-conmon-49d3db070436a7bd695838e4be1de6c44cec26f166951c485b33da130554834d.scope: Deactivated successfully.
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bdev(0x562bf8a33180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs mount
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluefs mount shared_bdev_used = 4718592
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: RocksDB version: 7.9.2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Git sha 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DB SUMMARY
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DB Session ID:  X4M0XD0YLGEXMBVZCYFS
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: CURRENT file:  CURRENT
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.error_if_exists: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.create_if_missing: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                     Options.env: 0x562bf8cc7c00
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                Options.info_log: 0x562bf8acb520
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                              Options.statistics: (nil)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.use_fsync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                              Options.db_log_dir: 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                                 Options.wal_dir: db.wal
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.write_buffer_manager: 0x562bf8a1d540
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.unordered_write: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.row_cache: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                              Options.wal_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.two_write_queues: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.wal_compression: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.atomic_flush: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_background_jobs: 4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_background_compactions: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_subcompactions: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.max_open_files: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Compression algorithms supported:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kZSTD supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kXpressCompression supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kBZip2Compression supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kLZ4Compression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kZlibCompression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         kSnappyCompression supported: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acb3e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acabc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acabc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:           Options.merge_operator: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562bf8acabc0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x562bf8a0a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.write_buffer_size: 16777216
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.max_write_buffer_number: 64
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.compression: LZ4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.num_levels: 7
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.bloom_locality: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                               Options.ttl: 2592000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                       Options.enable_blob_files: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                           Options.min_blob_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 0e9f3910-9ea8-45ae-a4b3-9f14ef476182
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118363303, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118388906, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316118, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0e9f3910-9ea8-45ae-a4b3-9f14ef476182", "db_session_id": "X4M0XD0YLGEXMBVZCYFS", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118417048, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316118, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0e9f3910-9ea8-45ae-a4b3-9f14ef476182", "db_session_id": "X4M0XD0YLGEXMBVZCYFS", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118426342, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316118, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "0e9f3910-9ea8-45ae-a4b3-9f14ef476182", "db_session_id": "X4M0XD0YLGEXMBVZCYFS", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 28 07:48:38 np0005538515.localdomain podman[33660]: 
Nov 28 07:48:38 np0005538515.localdomain podman[33660]: 2025-11-28 07:48:38.464706324 +0000 UTC m=+0.114714088 container create 607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_lovelace, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True)
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316118466362, "job": 1, "event": "recovery_finished"}
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 28 07:48:38 np0005538515.localdomain podman[33660]: 2025-11-28 07:48:38.405078086 +0000 UTC m=+0.055085880 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562bf8ad0380
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: DB pointer 0x562bf9911a00
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super from 4, latest 4
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super done
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: Started libpod-conmon-607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470.scope.
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: _get_class not permitted to load lua
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: _get_class not permitted to load sdk
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: _get_class not permitted to load test_remote_reads
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 load_pgs
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 load_pgs opened 0 pgs
Nov 28 07:48:38 np0005538515.localdomain ceph-osd[33334]: osd.4 0 log_to_monitors true
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:38 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4[33330]: 2025-11-28T07:48:38.555+0000 7f2b3a71ea80 -1 osd.4 0 log_to_monitors true
Nov 28 07:48:38 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f87df368dbed6e848a6f32f85972509b1e344bec02ca59072eaf174c54e6e606/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:38 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f87df368dbed6e848a6f32f85972509b1e344bec02ca59072eaf174c54e6e606/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:38 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f87df368dbed6e848a6f32f85972509b1e344bec02ca59072eaf174c54e6e606/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:38 np0005538515.localdomain podman[33660]: 2025-11-28 07:48:38.627343463 +0000 UTC m=+0.277351227 container init 607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_lovelace, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=553, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 07:48:38 np0005538515.localdomain podman[33660]: 2025-11-28 07:48:38.63816182 +0000 UTC m=+0.288169554 container start 607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_lovelace, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55)
Nov 28 07:48:38 np0005538515.localdomain podman[33660]: 2025-11-28 07:48:38.638358249 +0000 UTC m=+0.288366063 container attach 607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_lovelace, vcs-type=git, com.redhat.component=rhceph-container, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 28 07:48:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-769e70b48a1c2c822b927d650341de595d4acfa241ab5da7498791095dcf7b4d-merged.mount: Deactivated successfully.
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]: {
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:     "b9cdd064-f06d-4e2b-b6e3-0368d5f01fb9": {
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "ceph_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "osd_id": 1,
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "osd_uuid": "b9cdd064-f06d-4e2b-b6e3-0368d5f01fb9",
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "type": "bluestore"
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:     },
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:     "f4f9cdb9-a7e9-468b-968c-003e9ca341ca": {
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "ceph_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "osd_id": 4,
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "osd_uuid": "f4f9cdb9-a7e9-468b-968c-003e9ca341ca",
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:         "type": "bluestore"
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]:     }
Nov 28 07:48:39 np0005538515.localdomain elated_lovelace[33863]: }
Nov 28 07:48:39 np0005538515.localdomain systemd[1]: libpod-607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470.scope: Deactivated successfully.
Nov 28 07:48:39 np0005538515.localdomain podman[33660]: 2025-11-28 07:48:39.293089971 +0000 UTC m=+0.943097745 container died 607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_lovelace, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 07:48:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f87df368dbed6e848a6f32f85972509b1e344bec02ca59072eaf174c54e6e606-merged.mount: Deactivated successfully.
Nov 28 07:48:39 np0005538515.localdomain podman[33927]: 2025-11-28 07:48:39.404715261 +0000 UTC m=+0.101273025 container remove 607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_lovelace, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, name=rhceph, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, release=553, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Nov 28 07:48:39 np0005538515.localdomain systemd[1]: libpod-conmon-607c4c1d86fe68184ea33fb17f285556f6d07da725a4c8e2b2da08f7dce07470.scope: Deactivated successfully.
Nov 28 07:48:39 np0005538515.localdomain sudo[33365]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.779 iops: 5063.440 elapsed_sec: 0.592
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [WRN] : OSD bench result of 5063.439684 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 0 waiting for initial osdmap
Nov 28 07:48:39 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1[32389]: 2025-11-28T07:48:39.970+0000 7fc6707e2640 -1 osd.1 0 waiting for initial osdmap
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 crush map has features 288514050185494528, adjusting msgr requires for clients
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 crush map has features 3314932999778484224, adjusting msgr requires for osds
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 check_osdmap_features require_osd_release unknown -> reef
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 set_numa_affinity not setting numa affinity
Nov 28 07:48:39 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-1[32389]: 2025-11-28T07:48:39.989+0000 7fc66be0c640 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:39 np0005538515.localdomain ceph-osd[32393]: osd.1 12 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0 done with init, starting boot process
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0 start_boot
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[33334]: osd.4 0  bench count 12288000 bsize 4 KiB
Nov 28 07:48:40 np0005538515.localdomain ceph-osd[32393]: osd.1 13 state: booting -> active
Nov 28 07:48:40 np0005538515.localdomain sudo[33942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:48:40 np0005538515.localdomain sudo[33942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:40 np0005538515.localdomain sudo[33942]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:40 np0005538515.localdomain sudo[33957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:40 np0005538515.localdomain sudo[33957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:40 np0005538515.localdomain sudo[33957]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:40 np0005538515.localdomain sudo[33972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:48:40 np0005538515.localdomain sudo[33972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:41 np0005538515.localdomain podman[34052]: 2025-11-28 07:48:41.690900559 +0000 UTC m=+0.108232412 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, version=7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Nov 28 07:48:41 np0005538515.localdomain podman[34052]: 2025-11-28 07:48:41.820508513 +0000 UTC m=+0.237840406 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=553, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 07:48:42 np0005538515.localdomain ceph-osd[32393]: osd.1 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 28 07:48:42 np0005538515.localdomain ceph-osd[32393]: osd.1 15 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Nov 28 07:48:42 np0005538515.localdomain ceph-osd[32393]: osd.1 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 28 07:48:42 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 15 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [1] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 07:48:42 np0005538515.localdomain sudo[33972]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:42 np0005538515.localdomain sudo[34115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:42 np0005538515.localdomain sudo[34115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:42 np0005538515.localdomain sudo[34115]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:42 np0005538515.localdomain sudo[34130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:48:42 np0005538515.localdomain sudo[34130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:42 np0005538515.localdomain sudo[34130]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:43 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [1] r=0 lpr=15 crt=0'0 mlcod 0'0 undersized+peered mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 07:48:43 np0005538515.localdomain sudo[34175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:43 np0005538515.localdomain sudo[34175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:43 np0005538515.localdomain sudo[34175]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:43 np0005538515.localdomain sudo[34190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 07:48:43 np0005538515.localdomain sudo[34190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 2025-11-28 07:48:43.857351459 +0000 UTC m=+0.094432664 container create 3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55)
Nov 28 07:48:43 np0005538515.localdomain systemd[1]: Started libpod-conmon-3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4.scope.
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 2025-11-28 07:48:43.807328334 +0000 UTC m=+0.044409599 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:43 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 2025-11-28 07:48:43.95149392 +0000 UTC m=+0.188575145 container init 3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, distribution-scope=public, release=553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55)
Nov 28 07:48:43 np0005538515.localdomain funny_chaum[34259]: 167 167
Nov 28 07:48:43 np0005538515.localdomain systemd[1]: libpod-3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4.scope: Deactivated successfully.
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 2025-11-28 07:48:43.970322949 +0000 UTC m=+0.207404184 container start 3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553)
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 2025-11-28 07:48:43.970644064 +0000 UTC m=+0.207725309 container attach 3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, version=7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main)
Nov 28 07:48:43 np0005538515.localdomain podman[34244]: 2025-11-28 07:48:43.973368894 +0000 UTC m=+0.210450159 container died 3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, distribution-scope=public)
Nov 28 07:48:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b95aefb47c92c64a11fce9bc4483f344fa2851454bba834b9598b864e3e1f9db-merged.mount: Deactivated successfully.
Nov 28 07:48:44 np0005538515.localdomain podman[34264]: 2025-11-28 07:48:44.07605092 +0000 UTC m=+0.100686310 container remove 3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chaum, io.buildah.version=1.33.12, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, ceph=True)
Nov 28 07:48:44 np0005538515.localdomain systemd[1]: libpod-conmon-3f8be99dd0c6d201be45dd3a9c2722e314812e74164aeb070f20b0a0247122d4.scope: Deactivated successfully.
Nov 28 07:48:44 np0005538515.localdomain podman[34284]: 
Nov 28 07:48:44 np0005538515.localdomain podman[34284]: 2025-11-28 07:48:44.27295069 +0000 UTC m=+0.070705728 container create f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_williamson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, release=553, version=7, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=)
Nov 28 07:48:44 np0005538515.localdomain systemd[1]: Started libpod-conmon-f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2.scope.
Nov 28 07:48:44 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 07:48:44 np0005538515.localdomain podman[34284]: 2025-11-28 07:48:44.240760121 +0000 UTC m=+0.038515169 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 07:48:44 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afa3ec5d5d1eafd6ab6d4478cd62a168d8e40c6257e4979f5963c5b04408f26f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:44 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afa3ec5d5d1eafd6ab6d4478cd62a168d8e40c6257e4979f5963c5b04408f26f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:44 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afa3ec5d5d1eafd6ab6d4478cd62a168d8e40c6257e4979f5963c5b04408f26f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 07:48:44 np0005538515.localdomain podman[34284]: 2025-11-28 07:48:44.37595034 +0000 UTC m=+0.173705398 container init f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_williamson, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 28 07:48:44 np0005538515.localdomain podman[34284]: 2025-11-28 07:48:44.395877148 +0000 UTC m=+0.193632206 container start f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_williamson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, version=7, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64)
Nov 28 07:48:44 np0005538515.localdomain podman[34284]: 2025-11-28 07:48:44.396308558 +0000 UTC m=+0.194063796 container attach f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_williamson, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64)
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.197 iops: 4914.424 elapsed_sec: 0.610
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [WRN] : OSD bench result of 4914.423996 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.4. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 0 waiting for initial osdmap
Nov 28 07:48:44 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4[33330]: 2025-11-28T07:48:44.489+0000 7f2b3669d640 -1 osd.4 0 waiting for initial osdmap
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 check_osdmap_features require_osd_release unknown -> reef
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 set_numa_affinity not setting numa affinity
Nov 28 07:48:44 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-4[33330]: 2025-11-28T07:48:44.510+0000 7f2b31cc7640 -1 osd.4 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 28 07:48:44 np0005538515.localdomain ceph-osd[33334]: osd.4 17 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Nov 28 07:48:45 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 18 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=18 pruub=14.005291939s) [1,5,3] r=0 lpr=18 pi=[15,18)/0 crt=0'0 mlcod 0'0 peered pruub 25.125785828s@ mbc={}] start_peering_interval up [1] -> [1,5,3], acting [1] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 07:48:45 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 18 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=18 pruub=14.005291939s) [1,5,3] r=0 lpr=18 pi=[15,18)/0 crt=0'0 mlcod 0'0 unknown pruub 25.125785828s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 07:48:45 np0005538515.localdomain ceph-osd[33334]: osd.4 18 state: booting -> active
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]: [
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:     {
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "available": false,
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "ceph_device": false,
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "lsm_data": {},
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "lvs": [],
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "path": "/dev/sr0",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "rejected_reasons": [
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "Has a FileSystem",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "Insufficient space (<5GB)"
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         ],
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         "sys_api": {
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "actuators": null,
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "device_nodes": "sr0",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "human_readable_size": "482.00 KB",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "id_bus": "ata",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "model": "QEMU DVD-ROM",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "nr_requests": "2",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "partitions": {},
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "path": "/dev/sr0",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "removable": "1",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "rev": "2.5+",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "ro": "0",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "rotational": "1",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "sas_address": "",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "sas_device_handle": "",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "scheduler_mode": "mq-deadline",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "sectors": 0,
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "sectorsize": "2048",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "size": 493568.0,
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "support_discard": "0",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "type": "disk",
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:             "vendor": "QEMU"
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:         }
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]:     }
Nov 28 07:48:45 np0005538515.localdomain elastic_williamson[34299]: ]
Nov 28 07:48:45 np0005538515.localdomain systemd[1]: libpod-f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2.scope: Deactivated successfully.
Nov 28 07:48:45 np0005538515.localdomain systemd[1]: libpod-f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2.scope: Consumed 1.060s CPU time.
Nov 28 07:48:45 np0005538515.localdomain podman[34284]: 2025-11-28 07:48:45.417731503 +0000 UTC m=+1.215486601 container died f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_williamson, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph)
Nov 28 07:48:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-afa3ec5d5d1eafd6ab6d4478cd62a168d8e40c6257e4979f5963c5b04408f26f-merged.mount: Deactivated successfully.
Nov 28 07:48:45 np0005538515.localdomain podman[35774]: 2025-11-28 07:48:45.525800547 +0000 UTC m=+0.095558563 container remove f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_williamson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, RELEASE=main, distribution-scope=public, release=553, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 07:48:45 np0005538515.localdomain systemd[1]: libpod-conmon-f5648193c0da0f8955416bbb58ab546b6b527415067bb6af48164d8787d3e4e2.scope: Deactivated successfully.
Nov 28 07:48:45 np0005538515.localdomain sudo[34190]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:46 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 19 pg[1.0( empty local-lis/les=18/19 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=18) [1,5,3] r=0 lpr=18 pi=[15,18)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 07:48:48 np0005538515.localdomain sudo[35789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:48:48 np0005538515.localdomain sudo[35789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:48 np0005538515.localdomain sudo[35789]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:53 np0005538515.localdomain systemd[26783]: Starting Mark boot as successful...
Nov 28 07:48:53 np0005538515.localdomain systemd[26783]: Finished Mark boot as successful.
Nov 28 07:48:54 np0005538515.localdomain sudo[35805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:48:54 np0005538515.localdomain sudo[35805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:54 np0005538515.localdomain sudo[35805]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:54 np0005538515.localdomain sudo[35820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:48:54 np0005538515.localdomain sudo[35820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:55 np0005538515.localdomain systemd[1]: tmp-crun.eVeOQd.mount: Deactivated successfully.
Nov 28 07:48:55 np0005538515.localdomain podman[35903]: 2025-11-28 07:48:55.257211929 +0000 UTC m=+0.086794197 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.openshift.tags=rhceph ceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc.)
Nov 28 07:48:55 np0005538515.localdomain podman[35903]: 2025-11-28 07:48:55.389754213 +0000 UTC m=+0.219336531 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container)
Nov 28 07:48:55 np0005538515.localdomain sudo[35820]: pam_unix(sudo:session): session closed for user root
Nov 28 07:48:56 np0005538515.localdomain sudo[35967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:48:56 np0005538515.localdomain sudo[35967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:48:56 np0005538515.localdomain sudo[35967]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:56 np0005538515.localdomain sudo[35982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:49:56 np0005538515.localdomain sudo[35982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:56 np0005538515.localdomain sudo[35982]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:56 np0005538515.localdomain sudo[35997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 07:49:56 np0005538515.localdomain sudo[35997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:57 np0005538515.localdomain podman[36080]: 2025-11-28 07:49:57.191924451 +0000 UTC m=+0.082572441 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 07:49:57 np0005538515.localdomain podman[36080]: 2025-11-28 07:49:57.294069173 +0000 UTC m=+0.184717163 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, CEPH_POINT_RELEASE=, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container)
Nov 28 07:49:57 np0005538515.localdomain sudo[35997]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:57 np0005538515.localdomain sudo[36146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:49:57 np0005538515.localdomain sudo[36146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:57 np0005538515.localdomain sudo[36146]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:57 np0005538515.localdomain sudo[36161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:49:57 np0005538515.localdomain sudo[36161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:58 np0005538515.localdomain sudo[36161]: pam_unix(sudo:session): session closed for user root
Nov 28 07:49:58 np0005538515.localdomain sudo[36208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:49:58 np0005538515.localdomain sudo[36208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:49:58 np0005538515.localdomain sudo[36208]: pam_unix(sudo:session): session closed for user root
Nov 28 07:50:01 np0005538515.localdomain sshd[25284]: Received disconnect from 192.168.122.100 port 54654:11: disconnected by user
Nov 28 07:50:01 np0005538515.localdomain sshd[25284]: Disconnected from user zuul 192.168.122.100 port 54654
Nov 28 07:50:01 np0005538515.localdomain sshd[25281]: pam_unix(sshd:session): session closed for user zuul
Nov 28 07:50:01 np0005538515.localdomain systemd-logind[763]: Session 13 logged out. Waiting for processes to exit.
Nov 28 07:50:01 np0005538515.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Nov 28 07:50:01 np0005538515.localdomain systemd[1]: session-13.scope: Consumed 21.355s CPU time.
Nov 28 07:50:01 np0005538515.localdomain systemd-logind[763]: Removed session 13.
Nov 28 07:50:59 np0005538515.localdomain sudo[36223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:50:59 np0005538515.localdomain sudo[36223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:50:59 np0005538515.localdomain sudo[36223]: pam_unix(sudo:session): session closed for user root
Nov 28 07:50:59 np0005538515.localdomain sudo[36238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:50:59 np0005538515.localdomain sudo[36238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:50:59 np0005538515.localdomain sudo[36238]: pam_unix(sudo:session): session closed for user root
Nov 28 07:51:00 np0005538515.localdomain sudo[36285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:51:00 np0005538515.localdomain sudo[36285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:51:00 np0005538515.localdomain sudo[36285]: pam_unix(sudo:session): session closed for user root
Nov 28 07:51:53 np0005538515.localdomain systemd[26783]: Created slice User Background Tasks Slice.
Nov 28 07:51:53 np0005538515.localdomain systemd[26783]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 07:51:53 np0005538515.localdomain systemd[26783]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 07:52:00 np0005538515.localdomain sudo[36301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:52:00 np0005538515.localdomain sudo[36301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:52:00 np0005538515.localdomain sudo[36301]: pam_unix(sudo:session): session closed for user root
Nov 28 07:52:00 np0005538515.localdomain sudo[36316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:52:00 np0005538515.localdomain sudo[36316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:52:01 np0005538515.localdomain sudo[36316]: pam_unix(sudo:session): session closed for user root
Nov 28 07:52:01 np0005538515.localdomain sudo[36362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:52:01 np0005538515.localdomain sudo[36362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:52:01 np0005538515.localdomain sudo[36362]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:02 np0005538515.localdomain sudo[36378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:53:02 np0005538515.localdomain sudo[36378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:53:02 np0005538515.localdomain sudo[36378]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:02 np0005538515.localdomain sudo[36393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:53:02 np0005538515.localdomain sudo[36393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:53:02 np0005538515.localdomain sudo[36393]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:03 np0005538515.localdomain sudo[36440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:53:03 np0005538515.localdomain sudo[36440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:53:03 np0005538515.localdomain sudo[36440]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:40 np0005538515.localdomain sshd[36455]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:53:40 np0005538515.localdomain sshd[36455]: Accepted publickey for zuul from 192.168.122.100 port 42070 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:53:40 np0005538515.localdomain systemd-logind[763]: New session 27 of user zuul.
Nov 28 07:53:40 np0005538515.localdomain systemd[1]: Started Session 27 of User zuul.
Nov 28 07:53:40 np0005538515.localdomain sshd[36455]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 07:53:41 np0005538515.localdomain sudo[36501]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyefkfwuzkuncsozouxyqboxrdbepiyj ; /usr/bin/python3
Nov 28 07:53:41 np0005538515.localdomain sudo[36501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:41 np0005538515.localdomain python3[36503]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 07:53:41 np0005538515.localdomain sudo[36501]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:42 np0005538515.localdomain sudo[36546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ierdwfiavyoqcyrgigejxntcqdsbgdua ; /usr/bin/python3
Nov 28 07:53:42 np0005538515.localdomain sudo[36546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:42 np0005538515.localdomain python3[36548]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:53:42 np0005538515.localdomain sudo[36546]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:42 np0005538515.localdomain sudo[36566]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-argfqkzdbbwgvngfawioxffehnyqpith ; /usr/bin/python3
Nov 28 07:53:42 np0005538515.localdomain sudo[36566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:42 np0005538515.localdomain python3[36568]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538515.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 07:53:42 np0005538515.localdomain useradd[36570]: new group: name=tripleo-admin, GID=1003
Nov 28 07:53:42 np0005538515.localdomain useradd[36570]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Nov 28 07:53:42 np0005538515.localdomain sudo[36566]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:43 np0005538515.localdomain sudo[36622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zahsktvzsyqcchewxntvzvxakzzldrvs ; /usr/bin/python3
Nov 28 07:53:43 np0005538515.localdomain sudo[36622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:43 np0005538515.localdomain python3[36624]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:53:43 np0005538515.localdomain sudo[36622]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:43 np0005538515.localdomain sudo[36665]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsdnpmcrllpphbwswgsrneprjsanrndq ; /usr/bin/python3
Nov 28 07:53:43 np0005538515.localdomain sudo[36665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:43 np0005538515.localdomain python3[36667]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764316423.0875354-66621-79331243140527/source _original_basename=tmpotbj1vh6 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:43 np0005538515.localdomain sudo[36665]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:44 np0005538515.localdomain sudo[36695]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytjtgaqwqwfntelckxavoqtsxeuxzdsz ; /usr/bin/python3
Nov 28 07:53:44 np0005538515.localdomain sudo[36695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:44 np0005538515.localdomain python3[36697]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:44 np0005538515.localdomain sudo[36695]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:44 np0005538515.localdomain sudo[36711]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lomlvtaiclheuttmcpglkkjckxhxnwlz ; /usr/bin/python3
Nov 28 07:53:44 np0005538515.localdomain sudo[36711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:44 np0005538515.localdomain python3[36713]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:44 np0005538515.localdomain sudo[36711]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:44 np0005538515.localdomain sudo[36727]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emqppdrvxhrcdhxrxchplknfqcjfqbjk ; /usr/bin/python3
Nov 28 07:53:44 np0005538515.localdomain sudo[36727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:44 np0005538515.localdomain python3[36729]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:44 np0005538515.localdomain sudo[36727]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:45 np0005538515.localdomain sudo[36743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aicuitfmqqbjbciexshziomskjwdxnfj ; /usr/bin/python3
Nov 28 07:53:45 np0005538515.localdomain sudo[36743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 07:53:45 np0005538515.localdomain python3[36745]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:53:45 np0005538515.localdomain sudo[36743]: pam_unix(sudo:session): session closed for user root
Nov 28 07:53:46 np0005538515.localdomain python3[36759]: ansible-ping Invoked with data=pong
Nov 28 07:53:57 np0005538515.localdomain sshd[36760]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:53:57 np0005538515.localdomain sshd[36760]: Accepted publickey for tripleo-admin from 192.168.122.100 port 40252 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 07:53:57 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 28 07:53:57 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 28 07:53:57 np0005538515.localdomain systemd-logind[763]: New session 28 of user tripleo-admin.
Nov 28 07:53:57 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 28 07:53:57 np0005538515.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Queued start job for default target Main User Target.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Created slice User Application Slice.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Reached target Paths.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Reached target Timers.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Starting D-Bus User Message Bus Socket...
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Starting Create User's Volatile Files and Directories...
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Listening on D-Bus User Message Bus Socket.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Reached target Sockets.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Finished Create User's Volatile Files and Directories.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Reached target Basic System.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Reached target Main User Target.
Nov 28 07:53:57 np0005538515.localdomain systemd[36764]: Startup finished in 123ms.
Nov 28 07:53:57 np0005538515.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 28 07:53:57 np0005538515.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Nov 28 07:53:57 np0005538515.localdomain sshd[36760]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 07:53:58 np0005538515.localdomain sudo[36821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdxtjnuipullxaaeomwwufqrwfgpszti ; /usr/bin/python3
Nov 28 07:53:58 np0005538515.localdomain sudo[36821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:53:58 np0005538515.localdomain python3[36823]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 07:53:58 np0005538515.localdomain sudo[36821]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:03 np0005538515.localdomain sudo[36828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:54:03 np0005538515.localdomain sudo[36828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:54:03 np0005538515.localdomain sudo[36828]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:03 np0005538515.localdomain sudo[36843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:54:03 np0005538515.localdomain sudo[36843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:54:03 np0005538515.localdomain sudo[36871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojrdeqoijzvglvbcjaurgiyzcxwjxdii ; /usr/bin/python3
Nov 28 07:54:03 np0005538515.localdomain sudo[36871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:03 np0005538515.localdomain python3[36873]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Nov 28 07:54:03 np0005538515.localdomain sudo[36871]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:04 np0005538515.localdomain sudo[36843]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:04 np0005538515.localdomain sudo[36919]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldvbpvbzgxhuvzsfxtxyknnpdbngflek ; /usr/bin/python3
Nov 28 07:54:04 np0005538515.localdomain sudo[36919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:04 np0005538515.localdomain python3[36921]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 28 07:54:04 np0005538515.localdomain sudo[36919]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:04 np0005538515.localdomain sudo[36922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:54:04 np0005538515.localdomain sudo[36922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:54:04 np0005538515.localdomain sudo[36922]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:05 np0005538515.localdomain sudo[36982]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfnouerypwgjykunsejosafmqvtsevdd ; /usr/bin/python3
Nov 28 07:54:05 np0005538515.localdomain sudo[36982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:05 np0005538515.localdomain python3[36984]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.pcvp7x4ttmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:05 np0005538515.localdomain sudo[36982]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:05 np0005538515.localdomain sudo[37012]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqkcoymorzshuizjjzcrjoplxygahuas ; /usr/bin/python3
Nov 28 07:54:05 np0005538515.localdomain sudo[37012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:05 np0005538515.localdomain python3[37014]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.pcvp7x4ttmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:05 np0005538515.localdomain sudo[37012]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:06 np0005538515.localdomain sudo[37028]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeoacdcfqtoeejtwiwarwqyxdzaweazx ; /usr/bin/python3
Nov 28 07:54:06 np0005538515.localdomain sudo[37028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:06 np0005538515.localdomain python3[37030]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.pcvp7x4ttmphosts insertbefore=BOF block=172.17.0.106 np0005538513.localdomain np0005538513
                                                         172.18.0.106 np0005538513.storage.localdomain np0005538513.storage
                                                         172.20.0.106 np0005538513.storagemgmt.localdomain np0005538513.storagemgmt
                                                         172.17.0.106 np0005538513.internalapi.localdomain np0005538513.internalapi
                                                         172.19.0.106 np0005538513.tenant.localdomain np0005538513.tenant
                                                         192.168.122.106 np0005538513.ctlplane.localdomain np0005538513.ctlplane
                                                         172.17.0.107 np0005538514.localdomain np0005538514
                                                         172.18.0.107 np0005538514.storage.localdomain np0005538514.storage
                                                         172.20.0.107 np0005538514.storagemgmt.localdomain np0005538514.storagemgmt
                                                         172.17.0.107 np0005538514.internalapi.localdomain np0005538514.internalapi
                                                         172.19.0.107 np0005538514.tenant.localdomain np0005538514.tenant
                                                         192.168.122.107 np0005538514.ctlplane.localdomain np0005538514.ctlplane
                                                         172.17.0.108 np0005538515.localdomain np0005538515
                                                         172.18.0.108 np0005538515.storage.localdomain np0005538515.storage
                                                         172.20.0.108 np0005538515.storagemgmt.localdomain np0005538515.storagemgmt
                                                         172.17.0.108 np0005538515.internalapi.localdomain np0005538515.internalapi
                                                         172.19.0.108 np0005538515.tenant.localdomain np0005538515.tenant
                                                         192.168.122.108 np0005538515.ctlplane.localdomain np0005538515.ctlplane
                                                         172.17.0.103 np0005538510.localdomain np0005538510
                                                         172.18.0.103 np0005538510.storage.localdomain np0005538510.storage
                                                         172.20.0.103 np0005538510.storagemgmt.localdomain np0005538510.storagemgmt
                                                         172.17.0.103 np0005538510.internalapi.localdomain np0005538510.internalapi
                                                         172.19.0.103 np0005538510.tenant.localdomain np0005538510.tenant
                                                         192.168.122.103 np0005538510.ctlplane.localdomain np0005538510.ctlplane
                                                         172.17.0.104 np0005538511.localdomain np0005538511
                                                         172.18.0.104 np0005538511.storage.localdomain np0005538511.storage
                                                         172.20.0.104 np0005538511.storagemgmt.localdomain np0005538511.storagemgmt
                                                         172.17.0.104 np0005538511.internalapi.localdomain np0005538511.internalapi
                                                         172.19.0.104 np0005538511.tenant.localdomain np0005538511.tenant
                                                         192.168.122.104 np0005538511.ctlplane.localdomain np0005538511.ctlplane
                                                         172.17.0.105 np0005538512.localdomain np0005538512
                                                         172.18.0.105 np0005538512.storage.localdomain np0005538512.storage
                                                         172.20.0.105 np0005538512.storagemgmt.localdomain np0005538512.storagemgmt
                                                         172.17.0.105 np0005538512.internalapi.localdomain np0005538512.internalapi
                                                         172.19.0.105 np0005538512.tenant.localdomain np0005538512.tenant
                                                         192.168.122.105 np0005538512.ctlplane.localdomain np0005538512.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.197  overcloud.storage.localdomain
                                                         172.20.0.177  overcloud.storagemgmt.localdomain
                                                         172.17.0.128  overcloud.internalapi.localdomain
                                                         172.21.0.169  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:06 np0005538515.localdomain sudo[37028]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:07 np0005538515.localdomain sudo[37044]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jltocunlqmdgxqjlifyueqdfeucjnzku ; /usr/bin/python3
Nov 28 07:54:07 np0005538515.localdomain sudo[37044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:07 np0005538515.localdomain python3[37046]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.pcvp7x4ttmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:54:07 np0005538515.localdomain sudo[37044]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:07 np0005538515.localdomain sudo[37061]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anextvorxztbzqbobpytzcjfofklgbry ; /usr/bin/python3
Nov 28 07:54:07 np0005538515.localdomain sudo[37061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:07 np0005538515.localdomain python3[37063]: ansible-file Invoked with path=/tmp/ansible.pcvp7x4ttmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:54:07 np0005538515.localdomain sudo[37061]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:08 np0005538515.localdomain sudo[37077]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyvzcalaslbdnxllvbwkjdizomqlyhxb ; /usr/bin/python3
Nov 28 07:54:08 np0005538515.localdomain sudo[37077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:08 np0005538515.localdomain python3[37079]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:54:08 np0005538515.localdomain sudo[37077]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:09 np0005538515.localdomain sudo[37094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnlsnehbqpfxqxyqdbntdcgmlovetjax ; /usr/bin/python3
Nov 28 07:54:09 np0005538515.localdomain sudo[37094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:09 np0005538515.localdomain python3[37096]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:54:12 np0005538515.localdomain sudo[37094]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:13 np0005538515.localdomain sudo[37114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxmaiewdbrtainbvpiigecizpkvrkmdk ; /usr/bin/python3
Nov 28 07:54:13 np0005538515.localdomain sudo[37114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:13 np0005538515.localdomain python3[37116]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:54:14 np0005538515.localdomain sudo[37114]: pam_unix(sudo:session): session closed for user root
Nov 28 07:54:14 np0005538515.localdomain sudo[37131]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stdksokifizjvnhniinjegxtlyztuoiu ; /usr/bin/python3
Nov 28 07:54:14 np0005538515.localdomain sudo[37131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:54:14 np0005538515.localdomain python3[37133]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:54:29 np0005538515.localdomain groupadd[37303]: group added to /etc/group: name=puppet, GID=52
Nov 28 07:54:29 np0005538515.localdomain groupadd[37303]: group added to /etc/gshadow: name=puppet
Nov 28 07:54:29 np0005538515.localdomain groupadd[37303]: new group: name=puppet, GID=52
Nov 28 07:54:30 np0005538515.localdomain useradd[37310]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Nov 28 07:55:04 np0005538515.localdomain sudo[37748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:55:04 np0005538515.localdomain sudo[37748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:55:04 np0005538515.localdomain sudo[37748]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:05 np0005538515.localdomain sudo[37764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:55:05 np0005538515.localdomain sudo[37764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:55:05 np0005538515.localdomain sudo[37764]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:06 np0005538515.localdomain sudo[37817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:55:06 np0005538515.localdomain sudo[37817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:55:06 np0005538515.localdomain sudo[37817]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  Converting 2699 SID table entries...
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:55:24 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:55:24 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 07:55:24 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:55:24 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:55:24 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:55:24 np0005538515.localdomain systemd-sysv-generator[37996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:55:24 np0005538515.localdomain systemd-rc-local-generator[37991]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:55:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:55:25 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:55:25 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:55:25 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:55:25 np0005538515.localdomain systemd[1]: run-r7192d6730b4e4c4b9d51597ca633e445.service: Deactivated successfully.
Nov 28 07:55:26 np0005538515.localdomain sudo[37131]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:27 np0005538515.localdomain sudo[38432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwakrazklpamuoawaaaomhrrugfwesvw ; /usr/bin/python3
Nov 28 07:55:27 np0005538515.localdomain sudo[38432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:27 np0005538515.localdomain python3[38434]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:28 np0005538515.localdomain sudo[38432]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:28 np0005538515.localdomain sudo[38571]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjpmcgopftseqoomszywzxhmctqczxnn ; /usr/bin/python3
Nov 28 07:55:28 np0005538515.localdomain sudo[38571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:29 np0005538515.localdomain python3[38573]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:55:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:55:30 np0005538515.localdomain systemd-rc-local-generator[38596]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:55:30 np0005538515.localdomain systemd-sysv-generator[38601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:55:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:55:30 np0005538515.localdomain sudo[38571]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:30 np0005538515.localdomain sudo[38625]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jovceghfqqufktayqbkmvdjapymgvtvi ; /usr/bin/python3
Nov 28 07:55:30 np0005538515.localdomain sudo[38625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:30 np0005538515.localdomain python3[38627]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:30 np0005538515.localdomain sudo[38625]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:31 np0005538515.localdomain sudo[38641]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgxqiyihotwdusnjrruaerfbhgfowwik ; /usr/bin/python3
Nov 28 07:55:31 np0005538515.localdomain sudo[38641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:31 np0005538515.localdomain python3[38643]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:31 np0005538515.localdomain sudo[38641]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:31 np0005538515.localdomain sudo[38658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkpkmlllvwjoyoghixtfzbqdzsmomkpi ; /usr/bin/python3
Nov 28 07:55:31 np0005538515.localdomain sudo[38658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:32 np0005538515.localdomain python3[38660]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 07:55:32 np0005538515.localdomain sudo[38658]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:32 np0005538515.localdomain sudo[38676]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naomncetocrfkjsmgdsnpxvfdjfplgfx ; /usr/bin/python3
Nov 28 07:55:32 np0005538515.localdomain sudo[38676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:33 np0005538515.localdomain python3[38678]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:33 np0005538515.localdomain sudo[38676]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:33 np0005538515.localdomain sudo[38694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgxzymvmejvhebdzshfocrtrlpjahcwr ; /usr/bin/python3
Nov 28 07:55:33 np0005538515.localdomain sudo[38694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:33 np0005538515.localdomain python3[38696]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:33 np0005538515.localdomain sudo[38694]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:34 np0005538515.localdomain sudo[38712]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srdqboiazsthnlsyorxfmwdtejbkunpp ; /usr/bin/python3
Nov 28 07:55:34 np0005538515.localdomain sudo[38712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:34 np0005538515.localdomain python3[38714]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:55:34 np0005538515.localdomain systemd[1]: Reloading Network Manager...
Nov 28 07:55:34 np0005538515.localdomain NetworkManager[5965]: <info>  [1764316534.3587] audit: op="reload" arg="0" pid=38717 uid=0 result="success"
Nov 28 07:55:34 np0005538515.localdomain NetworkManager[5965]: <info>  [1764316534.3595] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Nov 28 07:55:34 np0005538515.localdomain NetworkManager[5965]: <info>  [1764316534.3595] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 28 07:55:34 np0005538515.localdomain systemd[1]: Reloaded Network Manager.
Nov 28 07:55:34 np0005538515.localdomain sudo[38712]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:35 np0005538515.localdomain sudo[38731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnchmsmvadmpehzqrqkyleluhkaeyunj ; /usr/bin/python3
Nov 28 07:55:35 np0005538515.localdomain sudo[38731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:35 np0005538515.localdomain python3[38733]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:35 np0005538515.localdomain sudo[38731]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:36 np0005538515.localdomain sudo[38748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqdpyupnluajusaphewrcpjmswtqowod ; /usr/bin/python3
Nov 28 07:55:36 np0005538515.localdomain sudo[38748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:36 np0005538515.localdomain python3[38750]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:36 np0005538515.localdomain sudo[38748]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:36 np0005538515.localdomain sudo[38766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecswfvakmvhfwmwpfrgiammzqbqyzrys ; /usr/bin/python3
Nov 28 07:55:36 np0005538515.localdomain sudo[38766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:36 np0005538515.localdomain python3[38768]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:36 np0005538515.localdomain sudo[38766]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:36 np0005538515.localdomain sudo[38782]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-talveaxukjybnctiaqriqgrffehlhphx ; /usr/bin/python3
Nov 28 07:55:36 np0005538515.localdomain sudo[38782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:37 np0005538515.localdomain python3[38784]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:37 np0005538515.localdomain sudo[38782]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:37 np0005538515.localdomain sudo[38798]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eudqmgqxdtrerlterjplzknzapktuvgf ; /usr/bin/python3
Nov 28 07:55:37 np0005538515.localdomain sudo[38798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:37 np0005538515.localdomain python3[38800]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 07:55:37 np0005538515.localdomain sudo[38798]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:38 np0005538515.localdomain sudo[38814]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guxecugfavzjsnnvzpnjpjqnteeekrcn ; /usr/bin/python3
Nov 28 07:55:38 np0005538515.localdomain sudo[38814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:38 np0005538515.localdomain python3[38816]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:38 np0005538515.localdomain sudo[38814]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:38 np0005538515.localdomain sudo[38830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emhyqfnokbnsilkbcvbzldbmyfrzasax ; /usr/bin/python3
Nov 28 07:55:38 np0005538515.localdomain sudo[38830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:39 np0005538515.localdomain python3[38832]: ansible-blockinfile Invoked with path=/tmp/ansible.nk45s4qy block=[192.168.122.106]*,[np0005538513.ctlplane.localdomain]*,[172.17.0.106]*,[np0005538513.internalapi.localdomain]*,[172.18.0.106]*,[np0005538513.storage.localdomain]*,[172.20.0.106]*,[np0005538513.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005538513.tenant.localdomain]*,[np0005538513.localdomain]*,[np0005538513]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCToHi/c1OL/UxMWy2v/t0tcvSlMeoKa6EPBYbcu51p2Gn2UxEPgCRLM9+84Smh2pxAR4Y/5LVm2lbZ9Gf4okHGg5GLIyqzxxqbQHyR+YRljujVEOvksUPuKCptzx9fQj2Ij2t9GPGHc5klgGPIKjx0pza8T37vdz+G9y7zuK5wWI66AeN8y/6dD2hvi1Lp94VRSvTTEo+nUOFSIgsOwqQO+ZSwTgjG1pmtESBe8nkhW0I0BQPX46v9f1PN1LXDg8cN2FSVjQ91RI0uCvTaBYJ3soFBFspgiJ113zapbQCaNwg7lK7ofS0QT5WONP3QIsDAq1gSpWuOdS2DRY4NU3WMd4m5tLbj+ubiWr39rNU/zQiEl8r38aiM0OwOfuQ9S8wxO7phpVCQrbOkYCLLijdy/xTODvP+jYohTMWX8Gh6IVeVtm6SB2Tw3lDBCjpqlclCSs905Xe+mTJ6WYTaz+Q1xgflKEeemzJ0+rt+QZbrmL7u5MUdf/l/yOLAgACNsws=
                                                         [192.168.122.107]*,[np0005538514.ctlplane.localdomain]*,[172.17.0.107]*,[np0005538514.internalapi.localdomain]*,[172.18.0.107]*,[np0005538514.storage.localdomain]*,[172.20.0.107]*,[np0005538514.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005538514.tenant.localdomain]*,[np0005538514.localdomain]*,[np0005538514]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLIqwhlOevSQuHXF0nrkLOzRoSQqnWWb/cXzK4um93clqGujVOE9PUyL6ONBo/qlr4Pp+QMzSsIFwjW1T/6G+Ce2CS/TGphIUxvvB9NhBt+OJl/zDUEmjAU6bwVIx6ApqtimsXWWIap9GEtVWA5P9pcqPMyGzq1mCzwCS252Ylioij0zZxfMrxTt3RSsWrDED61vRes0ZKd8HERTLN+Lzis5t8f74zfwTesOea6CRkIHth4cUP7ua3q+KhhbhIPj+fXWN5w+qVbcTMJSYyUPsZ2ymPhR4x4db1oPk1Jg14dw1BnmAZZl3v8o4l7bUQ2Fj/PE1JbSiApxbK+V0KdZGMrG4iVbnMmzwBXPXHa6lNQGneflVd3MNEepnTnXx4hAVpaJHc8EtIREq8aPe07DW0wL9clpTKaSGU2Ma+BLXmSDPkuPh6JWLxn3iM1yybL574NnGt2MgBj6z2tiSb4NkNmaBkoG8PMHw8YUSabKBBZNiMEO2GKBpHZldSrYvOZHU=
                                                         [192.168.122.108]*,[np0005538515.ctlplane.localdomain]*,[172.17.0.108]*,[np0005538515.internalapi.localdomain]*,[172.18.0.108]*,[np0005538515.storage.localdomain]*,[172.20.0.108]*,[np0005538515.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005538515.tenant.localdomain]*,[np0005538515.localdomain]*,[np0005538515]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbwc/gBZF5hmsFU6BSK1/DT5hduj2+3ukzoCGLU6mgpBv7BiInN7vVOqXilL+QUAWOvfKTekaQe1Vv/2jpygQnlu6MEMopmac/36IfVjgt39zxCULfSWv3Gp8tLP0ATF2LfhHBWFrGX7G3Bg3AiNfIUnQIQadBaKIByl+FfA7nJ7phwBAwJaQxvByGDeMwC2CWIUPgVqKclcw1WmldPnNmwquLlCbAeMV2hHlBfnVk8BI6fsOUcBB6a05zRpJpbrl584F+qkiQX0RpZYJQdZCoLiJStJv39lYhgiAWChUOVJsCbeNQnC9/Xgs5JhmRESgXh7Tm+8UNW9DxSHN7BS5qKYPUULdjobSp2v9pFOx30MLMsNd5r3JE07pgm5PpjuviSGEvJ8DIAPTF3kUXM43wax1q9rGV4ZfoJiLAwS9CmWWDWZDg17cnC5z+3qi+K8HUKz8LxQCHI+yEtTFzUEYyXTQfQbNvkauEHI/PwFA1iC+4/2g/0UhtjkM+FO2Czwk=
                                                         [192.168.122.103]*,[np0005538510.ctlplane.localdomain]*,[172.17.0.103]*,[np0005538510.internalapi.localdomain]*,[172.18.0.103]*,[np0005538510.storage.localdomain]*,[172.20.0.103]*,[np0005538510.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005538510.tenant.localdomain]*,[np0005538510.localdomain]*,[np0005538510]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxqgPHnyGChl6yd1/HRo8ox+w8llSVhIj8iYUdDG7IquyLr4/CZguZzRkngbXi/Dq544iKS4kFL/zPKi+yuxeFs4b6fgo4vGoV8wwKNSJXx0d0hOQa9651VqB6k/trENRTgLa2fHkXgF+/g0f7HvloQfhr7qjhTBRV4l4UfJiOEpMvMxN6map/D0JuHlAZGZ5mGUoBTEMuPGEPvMWqe0kc/I8WIgsMsvijOGM2xDxsOqAYlV9a8faoyMdacWUNkeQTfPF6h+z8xdvP8qWPtrPKWHMpcGicTI6pFZ2JxOjWnMaBXs2j/CN7HFLbyOCwuAvAu9efAbxJvgtZlO++6kSlq7SHMzwv7PLP69GaQJHR+jANJ/O2BchbxL09mIkpFSzLSS0k7xXJlwqnAMciIlTaud2n5Hqnnb06WgtvD6O0nnuCLH5am7F1YDGJJgUmNbbgF1PuwzOZqQy+tA2igji/n2z87KkGZdIbrHdPU1PPIlzVGPO6aO02RhvtD+/iQM=
                                                         [192.168.122.104]*,[np0005538511.ctlplane.localdomain]*,[172.17.0.104]*,[np0005538511.internalapi.localdomain]*,[172.18.0.104]*,[np0005538511.storage.localdomain]*,[172.20.0.104]*,[np0005538511.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005538511.tenant.localdomain]*,[np0005538511.localdomain]*,[np0005538511]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDqtmgm0KAOOIJ7a8whlZPfasnwJpfcm6zVmQjiKHZZrcojE/a6oALfufKXbfWWiLjJ2VzyK9v7QPNXhIWxgAKT9J40A1lSpSAmmxMaWvy+hzzvePs0Z4Fc4bFX7V4zBGI+dAJ+eAu73z6OKNuMhxBrL46ejpRFbqjwBP3veWRiLOMbyPn+Wc+amop0p1eEzV2QHMAIC5Dwm6/tYNLixNSa/Ea0ciaY3jWii+IGhYy+wqQP+9qkoVf9bZ4Ewa+7UfXI/q4zvvic/Znb8ZpCpezLnH4ilBORLyV9r/wkkkVGY7UVgUdSoLVjzTGQAtHl2ZgA3zJ2F2ES9QcBEvrHygT4vGgtEaxQn8XFhBwhzCpPaLyXti/6d+8M36cJx+7gv1eEfDgLz3MNR+tcnFSew9N6dIN4afV0DvA/9FsWk8PTqddN4iHcZzRo0GiDJWNtB+gYVZOytTYMZm2Cyv59IthEzxaB+wTZoSdCeuEeTM0ohYspOKirIPqMPuCbGbtrJFE=
                                                         [192.168.122.105]*,[np0005538512.ctlplane.localdomain]*,[172.17.0.105]*,[np0005538512.internalapi.localdomain]*,[172.18.0.105]*,[np0005538512.storage.localdomain]*,[172.20.0.105]*,[np0005538512.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005538512.tenant.localdomain]*,[np0005538512.localdomain]*,[np0005538512]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCy9/gxqH+eMqafXwUuPf+1Clpw4qsugdFefisnCDhJ5U7Pc+eWMUQVMS0ErxabBJhneDOyPwXwIbv72cEAtmgfvHDlSuS3mt8LRzKqsv1dXTy4Zqb3JGVzrvxo0iczGRsn2MIDJUv/Zjq9YqVeCnDj2HOwV+qx+EFecEFXS797FxsnMmTw0A5z8yUtBuJEGAKQX96LpZc4k5ltq+Uy0rK85Kk7cGR4A+wrIChLC8wggxvA99NdPEBtne6Chb+3PcbYUcTGhGtV6FGzpgbWmuWT/gcANb+fJE5/4n87loLmBMsmvGhvQuN9kuJ20g6nwPJbPTpIbV6XALx4tbma68bL3RL+lcGlh3jf0pEXPfolrB/MRmJn5ggMLjRv50FrowQalnCEgWE0gtd9IGjmqFz3jP008bGotn9rcacbjC2AvE+5NEjp7TzXGnFcD6jW8+9AWiusCww4ULs/oWbi0GLkmhwU5EifitDYF2+r1CigAdlEjb6sa0wAQSmclWk6guM=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:39 np0005538515.localdomain sudo[38830]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:39 np0005538515.localdomain sudo[38846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyqbyfjqluxmquwjauloxbxavnhvblpy ; /usr/bin/python3
Nov 28 07:55:39 np0005538515.localdomain sudo[38846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:39 np0005538515.localdomain python3[38848]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.nk45s4qy' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:39 np0005538515.localdomain sudo[38846]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:39 np0005538515.localdomain sudo[38864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfwbfaoqsfxrckmygikwaapsxyortayt ; /usr/bin/python3
Nov 28 07:55:39 np0005538515.localdomain sudo[38864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:39 np0005538515.localdomain python3[38866]: ansible-file Invoked with path=/tmp/ansible.nk45s4qy state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:55:39 np0005538515.localdomain sudo[38864]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:40 np0005538515.localdomain sudo[38880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpqwkinevditycupojikmxshkkkzclub ; /usr/bin/python3
Nov 28 07:55:40 np0005538515.localdomain sudo[38880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:40 np0005538515.localdomain python3[38882]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:55:40 np0005538515.localdomain sudo[38880]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:41 np0005538515.localdomain sudo[38896]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umyqjvfkauiiajrhyurjvoezktodsmag ; /usr/bin/python3
Nov 28 07:55:41 np0005538515.localdomain sudo[38896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:41 np0005538515.localdomain python3[38898]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:41 np0005538515.localdomain sudo[38896]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:41 np0005538515.localdomain sudo[38914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymsttflzqafyceobfzpekgfykakwvvei ; /usr/bin/python3
Nov 28 07:55:41 np0005538515.localdomain sudo[38914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:41 np0005538515.localdomain python3[38916]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:41 np0005538515.localdomain sudo[38914]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:41 np0005538515.localdomain sudo[38933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucmyobjnzifpoijkwuwbontoiefojoov ; /usr/bin/python3
Nov 28 07:55:41 np0005538515.localdomain sudo[38933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:41 np0005538515.localdomain python3[38935]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Nov 28 07:55:41 np0005538515.localdomain sudo[38933]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:42 np0005538515.localdomain sudo[38949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzvadckxqlewzstzaqogbomzpqfibjvc ; /usr/bin/python3
Nov 28 07:55:42 np0005538515.localdomain sudo[38949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:42 np0005538515.localdomain sudo[38949]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:42 np0005538515.localdomain sudo[38997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emtacgstpchqmakzxceyxagdstezqhvb ; /usr/bin/python3
Nov 28 07:55:42 np0005538515.localdomain sudo[38997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:42 np0005538515.localdomain sudo[38997]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:43 np0005538515.localdomain sudo[39040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abbidmcgxeyttusniicclaguhdnvliou ; /usr/bin/python3
Nov 28 07:55:43 np0005538515.localdomain sudo[39040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:43 np0005538515.localdomain sudo[39040]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:44 np0005538515.localdomain sudo[39070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbomddzalfncoiogzgjyzadljlebkgvw ; /usr/bin/python3
Nov 28 07:55:44 np0005538515.localdomain sudo[39070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:44 np0005538515.localdomain python3[39072]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:44 np0005538515.localdomain sudo[39070]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:44 np0005538515.localdomain sudo[39087]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnklsvyphodpntjbmiiaayxgtdwkdkts ; /usr/bin/python3
Nov 28 07:55:44 np0005538515.localdomain sudo[39087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:45 np0005538515.localdomain python3[39089]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:55:48 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:55:48 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:55:48 np0005538515.localdomain systemd-rc-local-generator[39167]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:55:48 np0005538515.localdomain systemd-sysv-generator[39170]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: tuned.service: Consumed 1.722s CPU time.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:55:48 np0005538515.localdomain systemd[1]: run-r0c642f159fba4057b9d2f8270231ada6.service: Deactivated successfully.
Nov 28 07:55:50 np0005538515.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 07:55:50 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:55:50 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:55:50 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:55:50 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:55:50 np0005538515.localdomain systemd[1]: run-r1730aa72abef4569a7698fa9dabccad7.service: Deactivated successfully.
Nov 28 07:55:51 np0005538515.localdomain sudo[39087]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:51 np0005538515.localdomain sudo[39524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxwvvsiowtfmuvrlzeqpppecaedcewux ; /usr/bin/python3
Nov 28 07:55:51 np0005538515.localdomain sudo[39524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:51 np0005538515.localdomain python3[39526]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:55:51 np0005538515.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 07:55:51 np0005538515.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 28 07:55:51 np0005538515.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 07:55:51 np0005538515.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 07:55:52 np0005538515.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 07:55:52 np0005538515.localdomain sudo[39524]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:53 np0005538515.localdomain sudo[39719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkcerxtnqyerfaocvqkvtjquucpdowmi ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 28 07:55:53 np0005538515.localdomain sudo[39719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:53 np0005538515.localdomain python3[39721]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:53 np0005538515.localdomain sudo[39719]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:53 np0005538515.localdomain sudo[39736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biagxvlkrorlonbxrpgcwpompbfkevpj ; /usr/bin/python3
Nov 28 07:55:53 np0005538515.localdomain sudo[39736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:54 np0005538515.localdomain python3[39738]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 28 07:55:54 np0005538515.localdomain sudo[39736]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:54 np0005538515.localdomain sudo[39752]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gynsxjvchdwqninusfmvcieoftfzfnbw ; /usr/bin/python3
Nov 28 07:55:54 np0005538515.localdomain sudo[39752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:54 np0005538515.localdomain python3[39754]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:54 np0005538515.localdomain sudo[39752]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:54 np0005538515.localdomain sudo[39768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebeqepttwmgrlxaqnyqzavqxxofngmmi ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 28 07:55:54 np0005538515.localdomain sudo[39768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:55 np0005538515.localdomain python3[39770]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:56 np0005538515.localdomain sudo[39768]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:56 np0005538515.localdomain sudo[39788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-govrlfsrxmrofmcqilnysqosnvfruzsi ; /usr/bin/python3
Nov 28 07:55:56 np0005538515.localdomain sudo[39788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:57 np0005538515.localdomain python3[39790]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:55:57 np0005538515.localdomain sudo[39788]: pam_unix(sudo:session): session closed for user root
Nov 28 07:55:57 np0005538515.localdomain sudo[39805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihpiycaqdwfjilmurcrcomishnjetbwd ; /usr/bin/python3
Nov 28 07:55:57 np0005538515.localdomain sudo[39805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:55:57 np0005538515.localdomain python3[39807]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:55:57 np0005538515.localdomain sudo[39805]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:00 np0005538515.localdomain sudo[39821]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cebdsnkbiopsscdrhhbsevdymrposfqi ; /usr/bin/python3
Nov 28 07:56:00 np0005538515.localdomain sudo[39821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:00 np0005538515.localdomain python3[39823]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:00 np0005538515.localdomain sudo[39821]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:05 np0005538515.localdomain sudo[39837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjtfjlrgmwuhjawevpngjrvvdwqakyiz ; /usr/bin/python3
Nov 28 07:56:05 np0005538515.localdomain sudo[39837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:05 np0005538515.localdomain python3[39839]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:05 np0005538515.localdomain sudo[39837]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:05 np0005538515.localdomain sudo[39885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icwihmocexazkiftexkqocqbmamwvhvy ; /usr/bin/python3
Nov 28 07:56:05 np0005538515.localdomain sudo[39885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:05 np0005538515.localdomain python3[39887]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:05 np0005538515.localdomain sudo[39885]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:05 np0005538515.localdomain sudo[39930]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnchcytjfajywumxbhjkbitniedxtxwj ; /usr/bin/python3
Nov 28 07:56:05 np0005538515.localdomain sudo[39930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:06 np0005538515.localdomain python3[39932]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316565.3866425-71269-4171597198821/source _original_basename=tmpkd232841 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:06 np0005538515.localdomain sudo[39930]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:06 np0005538515.localdomain sudo[39971]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abobspefevwovhnjvesqpiwssxfcdahs ; /usr/bin/python3
Nov 28 07:56:06 np0005538515.localdomain sudo[39971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:06 np0005538515.localdomain sudo[39951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:56:06 np0005538515.localdomain sudo[39951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:56:06 np0005538515.localdomain sudo[39951]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:06 np0005538515.localdomain sudo[39978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:56:06 np0005538515.localdomain sudo[39978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:56:06 np0005538515.localdomain python3[39976]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:06 np0005538515.localdomain sudo[39971]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538515.localdomain sudo[39978]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538515.localdomain sudo[40069]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qundzeagqeyhpejftbefaksmvgtqykqu ; /usr/bin/python3
Nov 28 07:56:07 np0005538515.localdomain sudo[40069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:07 np0005538515.localdomain python3[40071]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:07 np0005538515.localdomain sudo[40069]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538515.localdomain sudo[40112]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvuegajocsibnhrhecyzhnknqbdsuwnd ; /usr/bin/python3
Nov 28 07:56:07 np0005538515.localdomain sudo[40112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:07 np0005538515.localdomain python3[40114]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316567.0255291-71365-161059675954738/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=f62dcfb681d1b393d0933e3027f5bdff5685b671 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:07 np0005538515.localdomain sudo[40112]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:07 np0005538515.localdomain sudo[40115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:56:07 np0005538515.localdomain sudo[40115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:56:07 np0005538515.localdomain sudo[40115]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:08 np0005538515.localdomain sudo[40189]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kretygpctskjlaeisfzteastxaogcrnf ; /usr/bin/python3
Nov 28 07:56:08 np0005538515.localdomain sudo[40189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:08 np0005538515.localdomain python3[40191]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:08 np0005538515.localdomain sudo[40189]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:08 np0005538515.localdomain sudo[40232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prsiwmmmuptssnvphoinarufsjpsdowh ; /usr/bin/python3
Nov 28 07:56:08 np0005538515.localdomain sudo[40232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:08 np0005538515.localdomain python3[40234]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316567.9728765-71498-216974929580161/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=526fa277b7a2f2320a39d589994ce8c8af83f91d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:08 np0005538515.localdomain sudo[40232]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:09 np0005538515.localdomain sudo[40294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfldzphwpufveuivxwgwbzblbaoqalvf ; /usr/bin/python3
Nov 28 07:56:09 np0005538515.localdomain sudo[40294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:09 np0005538515.localdomain python3[40296]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:09 np0005538515.localdomain sudo[40294]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:09 np0005538515.localdomain sudo[40337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fncmfjziockaqcnfmqeephlviizkwloh ; /usr/bin/python3
Nov 28 07:56:09 np0005538515.localdomain sudo[40337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:09 np0005538515.localdomain python3[40339]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316568.8568509-71498-268617579071918/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=a223df0bad6272fbaedbfa3b3952717db2fe2201 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:09 np0005538515.localdomain sudo[40337]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:10 np0005538515.localdomain sudo[40399]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbyfpyauffsuuafrvmoojurerigesmce ; /usr/bin/python3
Nov 28 07:56:10 np0005538515.localdomain sudo[40399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:10 np0005538515.localdomain python3[40401]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:10 np0005538515.localdomain sudo[40399]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:10 np0005538515.localdomain sudo[40442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpfhklozbzmpdoilodpomdmuuatxapxv ; /usr/bin/python3
Nov 28 07:56:10 np0005538515.localdomain sudo[40442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:10 np0005538515.localdomain python3[40444]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316569.7610092-71498-248799131100880/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=1bd75eeb71ad8a06f7ad5bd2e02e7279e09e867f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:10 np0005538515.localdomain sudo[40442]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:10 np0005538515.localdomain sudo[40504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uczekqrloqvkikibdeipscvmrzuhimrx ; /usr/bin/python3
Nov 28 07:56:10 np0005538515.localdomain sudo[40504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:11 np0005538515.localdomain python3[40506]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:11 np0005538515.localdomain sudo[40504]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:11 np0005538515.localdomain sudo[40547]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kccjahppnvhfayhvrebbdcpsdtmljiqq ; /usr/bin/python3
Nov 28 07:56:11 np0005538515.localdomain sudo[40547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:11 np0005538515.localdomain python3[40549]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316570.7028797-71498-156670614555741/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:11 np0005538515.localdomain sudo[40547]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:11 np0005538515.localdomain sudo[40609]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjojzrzwnikqeetptckjovtrtloonzfa ; /usr/bin/python3
Nov 28 07:56:11 np0005538515.localdomain sudo[40609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:11 np0005538515.localdomain python3[40611]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:11 np0005538515.localdomain sudo[40609]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:12 np0005538515.localdomain sudo[40652]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aschmooxehprzfsazmgficlgmraqptrt ; /usr/bin/python3
Nov 28 07:56:12 np0005538515.localdomain sudo[40652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:12 np0005538515.localdomain python3[40654]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316571.5210068-71498-175478226132879/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=62e0064aaeb633d534c066293fb50230d01591cd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:12 np0005538515.localdomain sudo[40652]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:12 np0005538515.localdomain sudo[40714]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdhoerfneyoqeatceunvnuyzlkbgduqs ; /usr/bin/python3
Nov 28 07:56:12 np0005538515.localdomain sudo[40714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:12 np0005538515.localdomain python3[40716]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:12 np0005538515.localdomain sudo[40714]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:12 np0005538515.localdomain sudo[40757]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdngixifeomltjdpxbfsckskodktdhxq ; /usr/bin/python3
Nov 28 07:56:12 np0005538515.localdomain sudo[40757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:13 np0005538515.localdomain python3[40759]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316572.383641-71498-94396251438356/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:13 np0005538515.localdomain sudo[40757]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:13 np0005538515.localdomain sudo[40819]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjzztsbnxuaepqshwmrtpshetjnjwxgf ; /usr/bin/python3
Nov 28 07:56:13 np0005538515.localdomain sudo[40819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:13 np0005538515.localdomain python3[40821]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:13 np0005538515.localdomain sudo[40819]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:13 np0005538515.localdomain sudo[40862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udpmlhqrltvjadsyhyalcmbgjqeucxxg ; /usr/bin/python3
Nov 28 07:56:13 np0005538515.localdomain sudo[40862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:14 np0005538515.localdomain python3[40864]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316573.29409-71498-259177846658353/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=8f5fcf4d1773fc71cd0863786080c50634c31bf2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:14 np0005538515.localdomain sudo[40862]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:14 np0005538515.localdomain sudo[40924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acuiptpyjegfulfntrkaohmiccofdftr ; /usr/bin/python3
Nov 28 07:56:14 np0005538515.localdomain sudo[40924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:14 np0005538515.localdomain python3[40926]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:14 np0005538515.localdomain sudo[40924]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:14 np0005538515.localdomain sudo[40967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciqunaxrqsqhjsfeoohxnbyhdbepeasp ; /usr/bin/python3
Nov 28 07:56:14 np0005538515.localdomain sudo[40967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:14 np0005538515.localdomain python3[40969]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316574.1672018-71498-280473941620089/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:14 np0005538515.localdomain sudo[40967]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:15 np0005538515.localdomain sudo[41029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjvehgzlvxgvnfpdmtfyujtukdexvdvj ; /usr/bin/python3
Nov 28 07:56:15 np0005538515.localdomain sudo[41029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:15 np0005538515.localdomain python3[41031]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:15 np0005538515.localdomain sudo[41029]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:15 np0005538515.localdomain sudo[41072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wslqhpfljdohdrbuocwftyanhoxvibau ; /usr/bin/python3
Nov 28 07:56:15 np0005538515.localdomain sudo[41072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:15 np0005538515.localdomain python3[41074]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316575.0305808-71498-156474437425396/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:15 np0005538515.localdomain sudo[41072]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:16 np0005538515.localdomain sudo[41134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arcvcolxyvdtsilvrjssoenrxjrsgiiz ; /usr/bin/python3
Nov 28 07:56:16 np0005538515.localdomain sudo[41134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:16 np0005538515.localdomain python3[41136]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:16 np0005538515.localdomain sudo[41134]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:16 np0005538515.localdomain sudo[41177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmewvruphbjkzeppucqxuvqpwlflwjkg ; /usr/bin/python3
Nov 28 07:56:16 np0005538515.localdomain sudo[41177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:16 np0005538515.localdomain python3[41179]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316575.9257348-71498-182361790183983/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=fbd352828bf2a24978bac89caf2b80ad6306db82 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:16 np0005538515.localdomain sudo[41177]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:17 np0005538515.localdomain sudo[41207]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfcnpnezspiosjuojqcivpbiecwgemhn ; /usr/bin/python3
Nov 28 07:56:17 np0005538515.localdomain sudo[41207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:17 np0005538515.localdomain python3[41209]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:56:17 np0005538515.localdomain sudo[41207]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:17 np0005538515.localdomain sudo[41255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrkkhnnjxkfgmtwfbryibqgcafmhuqgg ; /usr/bin/python3
Nov 28 07:56:17 np0005538515.localdomain sudo[41255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:17 np0005538515.localdomain python3[41257]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:56:17 np0005538515.localdomain sudo[41255]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:18 np0005538515.localdomain sudo[41298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nthizygqwhhwsvnpinqisnhrmyvwsxif ; /usr/bin/python3
Nov 28 07:56:18 np0005538515.localdomain sudo[41298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:18 np0005538515.localdomain python3[41300]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316577.6255393-72137-238168634886834/source _original_basename=tmp05ku064p follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:56:18 np0005538515.localdomain sudo[41298]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:23 np0005538515.localdomain sudo[41328]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxnhwkucoukdkttikzstvtkdvgpuilip ; /usr/bin/python3
Nov 28 07:56:23 np0005538515.localdomain sudo[41328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:23 np0005538515.localdomain python3[41330]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 07:56:23 np0005538515.localdomain sudo[41328]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:23 np0005538515.localdomain sudo[41389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chltxzrwlapowamwsmjuimejpugesreq ; /usr/bin/python3
Nov 28 07:56:23 np0005538515.localdomain sudo[41389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:23 np0005538515.localdomain python3[41391]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:28 np0005538515.localdomain sudo[41389]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:28 np0005538515.localdomain sudo[41406]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnvhqenxngdheereitukqseenfsfdzoi ; /usr/bin/python3
Nov 28 07:56:28 np0005538515.localdomain sudo[41406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:28 np0005538515.localdomain python3[41408]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:32 np0005538515.localdomain sudo[41406]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:33 np0005538515.localdomain sudo[41423]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jawineudzdwmtsrnvzlvigalwlzjrtco ; /usr/bin/python3
Nov 28 07:56:33 np0005538515.localdomain sudo[41423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:33 np0005538515.localdomain python3[41425]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:33 np0005538515.localdomain sudo[41423]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:33 np0005538515.localdomain sudo[41446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwvwyhlscaidddunfjdnjyysygfdygnc ; /usr/bin/python3
Nov 28 07:56:33 np0005538515.localdomain sudo[41446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:34 np0005538515.localdomain python3[41448]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:38 np0005538515.localdomain sudo[41446]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:38 np0005538515.localdomain sudo[41463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrvjvsanpivgdaumqmowzzexfwfcsccd ; /usr/bin/python3
Nov 28 07:56:38 np0005538515.localdomain sudo[41463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:38 np0005538515.localdomain python3[41465]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:38 np0005538515.localdomain sudo[41463]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:38 np0005538515.localdomain sudo[41486]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcbqnyxgsbimdcfekgflhcxhdhcmvnge ; /usr/bin/python3
Nov 28 07:56:38 np0005538515.localdomain sudo[41486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:39 np0005538515.localdomain python3[41488]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:43 np0005538515.localdomain sudo[41486]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:43 np0005538515.localdomain sudo[41503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-felnsfqvetehxhkzjygqyayqvzivlsqa ; /usr/bin/python3
Nov 28 07:56:43 np0005538515.localdomain sudo[41503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:43 np0005538515.localdomain python3[41505]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:47 np0005538515.localdomain sudo[41503]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:48 np0005538515.localdomain sudo[41520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrjbqmwyiiqvgjfagjdtzzlsbxkeeehy ; /usr/bin/python3
Nov 28 07:56:48 np0005538515.localdomain sudo[41520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:48 np0005538515.localdomain python3[41522]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:48 np0005538515.localdomain sudo[41520]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:48 np0005538515.localdomain sudo[41543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocmgrfepdvtglgsklehgiahkgvixulkj ; /usr/bin/python3
Nov 28 07:56:48 np0005538515.localdomain sudo[41543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:48 np0005538515.localdomain python3[41545]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:52 np0005538515.localdomain sudo[41543]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:52 np0005538515.localdomain sudo[41560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zswdjfumhrcjabnmqoetrywoqrpiyovn ; /usr/bin/python3
Nov 28 07:56:52 np0005538515.localdomain sudo[41560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:53 np0005538515.localdomain python3[41562]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:53 np0005538515.localdomain systemd[36764]: Starting Mark boot as successful...
Nov 28 07:56:53 np0005538515.localdomain systemd[36764]: Finished Mark boot as successful.
Nov 28 07:56:57 np0005538515.localdomain sudo[41560]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:57 np0005538515.localdomain sudo[41578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kykmqurcovgvfkvjtlgvlgxwiviegwbb ; /usr/bin/python3
Nov 28 07:56:57 np0005538515.localdomain sudo[41578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:57 np0005538515.localdomain python3[41580]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:56:57 np0005538515.localdomain sudo[41578]: pam_unix(sudo:session): session closed for user root
Nov 28 07:56:57 np0005538515.localdomain sudo[41601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whxfstvofrmhquzotfasddtykduizzwi ; /usr/bin/python3
Nov 28 07:56:57 np0005538515.localdomain sudo[41601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:56:58 np0005538515.localdomain python3[41603]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:02 np0005538515.localdomain sudo[41601]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:02 np0005538515.localdomain sudo[41618]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohffoiwzdawlnqozycmruwjujondmihx ; /usr/bin/python3
Nov 28 07:57:02 np0005538515.localdomain sudo[41618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:02 np0005538515.localdomain python3[41620]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:06 np0005538515.localdomain sudo[41618]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:06 np0005538515.localdomain sudo[41635]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzksewksmpgzgnurgrfswlxxyzlcpahm ; /usr/bin/python3
Nov 28 07:57:06 np0005538515.localdomain sudo[41635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:07 np0005538515.localdomain python3[41637]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:07 np0005538515.localdomain sudo[41635]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:07 np0005538515.localdomain sudo[41658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhmwlfrdfyawyvqorbmkziayuzbubeez ; /usr/bin/python3
Nov 28 07:57:07 np0005538515.localdomain sudo[41658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:07 np0005538515.localdomain python3[41660]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:07 np0005538515.localdomain sudo[41662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:57:07 np0005538515.localdomain sudo[41662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:57:07 np0005538515.localdomain sudo[41662]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:07 np0005538515.localdomain sudo[41677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:57:07 np0005538515.localdomain sudo[41677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:57:08 np0005538515.localdomain sudo[41677]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:11 np0005538515.localdomain sudo[41723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:57:11 np0005538515.localdomain sudo[41723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:57:11 np0005538515.localdomain sudo[41723]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:11 np0005538515.localdomain sudo[41658]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:11 np0005538515.localdomain sudo[41751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egnfketqknpgfxuhidxzajivdcoyzmzj ; /usr/bin/python3
Nov 28 07:57:11 np0005538515.localdomain sudo[41751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:11 np0005538515.localdomain python3[41753]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:16 np0005538515.localdomain sudo[41751]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:17 np0005538515.localdomain sudo[41768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrpajmnngswgvupiiptpngfdvteladmf ; /usr/bin/python3
Nov 28 07:57:17 np0005538515.localdomain sudo[41768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:17 np0005538515.localdomain python3[41770]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:17 np0005538515.localdomain sudo[41768]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:17 np0005538515.localdomain sudo[41816]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvcysqlnyfgeximsopfjtheeebfckzuy ; /usr/bin/python3
Nov 28 07:57:17 np0005538515.localdomain sudo[41816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:17 np0005538515.localdomain python3[41818]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:17 np0005538515.localdomain sudo[41816]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:17 np0005538515.localdomain sudo[41834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlwawzodopkmryjdkowxwsouyvwnfjia ; /usr/bin/python3
Nov 28 07:57:17 np0005538515.localdomain sudo[41834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:17 np0005538515.localdomain python3[41836]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmps_4gqnva recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:17 np0005538515.localdomain sudo[41834]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:18 np0005538515.localdomain sudo[41864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffeoqvwnchtlpruobdkgzxrfburhoagm ; /usr/bin/python3
Nov 28 07:57:18 np0005538515.localdomain sudo[41864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:18 np0005538515.localdomain python3[41866]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:18 np0005538515.localdomain sudo[41864]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:19 np0005538515.localdomain sudo[41912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpcypdrspdhzcxpbyjwbtexbuzbkcrdc ; /usr/bin/python3
Nov 28 07:57:19 np0005538515.localdomain sudo[41912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:19 np0005538515.localdomain python3[41914]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:19 np0005538515.localdomain sudo[41912]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:19 np0005538515.localdomain sudo[41930]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-digjkwsunxikeijtcsgvnwvfijnocumn ; /usr/bin/python3
Nov 28 07:57:19 np0005538515.localdomain sudo[41930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:19 np0005538515.localdomain python3[41932]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:19 np0005538515.localdomain sudo[41930]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:19 np0005538515.localdomain sudo[41992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flfvkzwqfukzvqdvbgejydgojrjywrld ; /usr/bin/python3
Nov 28 07:57:19 np0005538515.localdomain sudo[41992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:20 np0005538515.localdomain python3[41994]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:20 np0005538515.localdomain sudo[41992]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:20 np0005538515.localdomain sudo[42010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrmeldedxshntbghrbwhpadqpvemkvtz ; /usr/bin/python3
Nov 28 07:57:20 np0005538515.localdomain sudo[42010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:20 np0005538515.localdomain python3[42012]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:20 np0005538515.localdomain sudo[42010]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:20 np0005538515.localdomain sudo[42072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnwttoykskqfawphgwoupxgnycspxcwx ; /usr/bin/python3
Nov 28 07:57:20 np0005538515.localdomain sudo[42072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:20 np0005538515.localdomain python3[42074]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:20 np0005538515.localdomain sudo[42072]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:20 np0005538515.localdomain sudo[42090]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kthornedyqxufxpxviccocxgxsdewqop ; /usr/bin/python3
Nov 28 07:57:20 np0005538515.localdomain sudo[42090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:21 np0005538515.localdomain python3[42092]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:21 np0005538515.localdomain sudo[42090]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:21 np0005538515.localdomain sudo[42152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gizmmlpmaeegsqgneudewvacdfadkqii ; /usr/bin/python3
Nov 28 07:57:21 np0005538515.localdomain sudo[42152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:21 np0005538515.localdomain python3[42154]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:21 np0005538515.localdomain sudo[42152]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:21 np0005538515.localdomain sudo[42170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wskhionpscgpqiselwvzwaardbpoifgg ; /usr/bin/python3
Nov 28 07:57:21 np0005538515.localdomain sudo[42170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:22 np0005538515.localdomain python3[42172]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:22 np0005538515.localdomain sudo[42170]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:22 np0005538515.localdomain sudo[42232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgwioimuffdakyowkkxxejyrurdqmkgv ; /usr/bin/python3
Nov 28 07:57:22 np0005538515.localdomain sudo[42232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:22 np0005538515.localdomain python3[42234]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:22 np0005538515.localdomain sudo[42232]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:22 np0005538515.localdomain sudo[42250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfpbamnepqtqchdxhpyrysmroarmdziq ; /usr/bin/python3
Nov 28 07:57:22 np0005538515.localdomain sudo[42250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:22 np0005538515.localdomain python3[42252]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:22 np0005538515.localdomain sudo[42250]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:23 np0005538515.localdomain sudo[42312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfdrveybfhrrpbjyeeischbjdebkvwfo ; /usr/bin/python3
Nov 28 07:57:23 np0005538515.localdomain sudo[42312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:23 np0005538515.localdomain python3[42314]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:23 np0005538515.localdomain sudo[42312]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:23 np0005538515.localdomain sudo[42330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igwzplecmffriliewbgnmqjmnvgnbmqk ; /usr/bin/python3
Nov 28 07:57:23 np0005538515.localdomain sudo[42330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:23 np0005538515.localdomain python3[42332]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:23 np0005538515.localdomain sudo[42330]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:24 np0005538515.localdomain sudo[42392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aplygpzzkoiywbsjokqttutnxrwarxmj ; /usr/bin/python3
Nov 28 07:57:24 np0005538515.localdomain sudo[42392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538515.localdomain python3[42394]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:24 np0005538515.localdomain sudo[42392]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:24 np0005538515.localdomain sudo[42410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdzemsoamawhjvfjofnnxwbrsttpphbv ; /usr/bin/python3
Nov 28 07:57:24 np0005538515.localdomain sudo[42410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538515.localdomain python3[42412]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:24 np0005538515.localdomain sudo[42410]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:24 np0005538515.localdomain sudo[42472]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cluqgquplntleqitadykosacjjzdtplw ; /usr/bin/python3
Nov 28 07:57:24 np0005538515.localdomain sudo[42472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:24 np0005538515.localdomain python3[42474]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:24 np0005538515.localdomain sudo[42472]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:25 np0005538515.localdomain sudo[42490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spodgcgtwjvevkzntlanisdmnbnjtmuz ; /usr/bin/python3
Nov 28 07:57:25 np0005538515.localdomain sudo[42490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:25 np0005538515.localdomain python3[42492]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:25 np0005538515.localdomain sudo[42490]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:25 np0005538515.localdomain sudo[42552]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spjfqkxnlbzppjfpnmmfuuzybljxcjha ; /usr/bin/python3
Nov 28 07:57:25 np0005538515.localdomain sudo[42552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:25 np0005538515.localdomain python3[42554]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:25 np0005538515.localdomain sudo[42552]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:25 np0005538515.localdomain sudo[42570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeeysjipnpyvrjawgswdxfbfozbrjujk ; /usr/bin/python3
Nov 28 07:57:25 np0005538515.localdomain sudo[42570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:25 np0005538515.localdomain python3[42572]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:25 np0005538515.localdomain sudo[42570]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:26 np0005538515.localdomain sudo[42632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfhjcqearudeznqzwnvgktpcavrpccpu ; /usr/bin/python3
Nov 28 07:57:26 np0005538515.localdomain sudo[42632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:26 np0005538515.localdomain python3[42634]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:26 np0005538515.localdomain sudo[42632]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:26 np0005538515.localdomain sudo[42650]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvmkmogtmnsjfmkkwtpyntoiuktrkvbe ; /usr/bin/python3
Nov 28 07:57:26 np0005538515.localdomain sudo[42650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:26 np0005538515.localdomain python3[42652]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:26 np0005538515.localdomain sudo[42650]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:27 np0005538515.localdomain sudo[42712]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlpboyotadpmndjitemhryecsertxpbq ; /usr/bin/python3
Nov 28 07:57:27 np0005538515.localdomain sudo[42712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:27 np0005538515.localdomain python3[42714]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:27 np0005538515.localdomain sudo[42712]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:27 np0005538515.localdomain sudo[42730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djqymnduosuqwzqlhnhhrkthxmwemdhh ; /usr/bin/python3
Nov 28 07:57:27 np0005538515.localdomain sudo[42730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:27 np0005538515.localdomain python3[42732]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:27 np0005538515.localdomain sudo[42730]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:27 np0005538515.localdomain sudo[42760]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pajshpuxfgamxxkoaptindsccjolsxaa ; /usr/bin/python3
Nov 28 07:57:27 np0005538515.localdomain sudo[42760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:27 np0005538515.localdomain python3[42762]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:57:27 np0005538515.localdomain sudo[42760]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:28 np0005538515.localdomain sudo[42808]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmfcouuokjaxagyjjqfeggchjhlgjmro ; /usr/bin/python3
Nov 28 07:57:28 np0005538515.localdomain sudo[42808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:28 np0005538515.localdomain python3[42810]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:28 np0005538515.localdomain sudo[42808]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:28 np0005538515.localdomain sudo[42826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eicziznwqpvjhkwwbcrevxmpsqjiytks ; /usr/bin/python3
Nov 28 07:57:28 np0005538515.localdomain sudo[42826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:28 np0005538515.localdomain python3[42828]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpl_ja7pqv recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:28 np0005538515.localdomain sudo[42826]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:31 np0005538515.localdomain sudo[42856]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjlusralaxitgymgttfdljdpzvodksgt ; /usr/bin/python3
Nov 28 07:57:31 np0005538515.localdomain sudo[42856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:31 np0005538515.localdomain python3[42858]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:57:34 np0005538515.localdomain sudo[42856]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:36 np0005538515.localdomain sudo[42873]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqyjatiuejshjjhudkifloxakrsqdprh ; /usr/bin/python3
Nov 28 07:57:36 np0005538515.localdomain sudo[42873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:36 np0005538515.localdomain python3[42875]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:57:36 np0005538515.localdomain sudo[42873]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:36 np0005538515.localdomain sudo[42891]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owqjhzkfdodtagtgmisjewscfwxoscqr ; /usr/bin/python3
Nov 28 07:57:36 np0005538515.localdomain sudo[42891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:36 np0005538515.localdomain python3[42893]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:57:38 np0005538515.localdomain sudo[42891]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:38 np0005538515.localdomain sudo[42909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcpbuuowsgvjyowbpwxvfexyzfagehyq ; /usr/bin/python3
Nov 28 07:57:38 np0005538515.localdomain sudo[42909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:38 np0005538515.localdomain python3[42911]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:57:38 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:57:38 np0005538515.localdomain systemd-rc-local-generator[42936]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:57:38 np0005538515.localdomain systemd-sysv-generator[42942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:57:38 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:57:38 np0005538515.localdomain systemd[1]: Starting Netfilter Tables...
Nov 28 07:57:38 np0005538515.localdomain systemd[1]: Finished Netfilter Tables.
Nov 28 07:57:38 np0005538515.localdomain sudo[42909]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:39 np0005538515.localdomain sudo[42998]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfdlfkpbzgnafqeclwdkcpejjiayyslt ; /usr/bin/python3
Nov 28 07:57:39 np0005538515.localdomain sudo[42998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:39 np0005538515.localdomain python3[43000]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:39 np0005538515.localdomain sudo[42998]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:39 np0005538515.localdomain sudo[43041]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdbdtftrywgkvvoqvxqzrrsmizlecdyx ; /usr/bin/python3
Nov 28 07:57:39 np0005538515.localdomain sudo[43041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:39 np0005538515.localdomain python3[43043]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316659.2863007-75112-91271994407602/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:40 np0005538515.localdomain sudo[43041]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:40 np0005538515.localdomain sudo[43071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uipcysfdjsaygvmbewaxxpkgdtezoiyx ; /usr/bin/python3
Nov 28 07:57:40 np0005538515.localdomain sudo[43071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:40 np0005538515.localdomain python3[43073]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:40 np0005538515.localdomain sudo[43071]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:40 np0005538515.localdomain sudo[43089]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzrdquvzxwztyqobpssgusvnivyozzma ; /usr/bin/python3
Nov 28 07:57:40 np0005538515.localdomain sudo[43089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:40 np0005538515.localdomain python3[43091]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:40 np0005538515.localdomain sudo[43089]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:41 np0005538515.localdomain sudo[43138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myrsmjgrvyvkqxlxebabpfmfshlrsbta ; /usr/bin/python3
Nov 28 07:57:41 np0005538515.localdomain sudo[43138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:41 np0005538515.localdomain python3[43140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:41 np0005538515.localdomain sudo[43138]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:41 np0005538515.localdomain sudo[43181]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omuoiqfzpobvqrsnhqnprxwmmbecovzo ; /usr/bin/python3
Nov 28 07:57:41 np0005538515.localdomain sudo[43181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:41 np0005538515.localdomain python3[43183]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316661.031042-75291-164256847562180/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:41 np0005538515.localdomain sudo[43181]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:42 np0005538515.localdomain sudo[43243]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjuyzvceqhbsormgyowqbwycygribbug ; /usr/bin/python3
Nov 28 07:57:42 np0005538515.localdomain sudo[43243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:42 np0005538515.localdomain python3[43245]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:42 np0005538515.localdomain sudo[43243]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:42 np0005538515.localdomain sudo[43286]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhnvgwqzhknsvtdeigdyowpiobgxmjaq ; /usr/bin/python3
Nov 28 07:57:42 np0005538515.localdomain sudo[43286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:42 np0005538515.localdomain python3[43288]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316662.0064602-75353-113716215427850/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:42 np0005538515.localdomain sudo[43286]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:43 np0005538515.localdomain sudo[43348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syvjgdffdihfjhwkevyxemiwgteplfpw ; /usr/bin/python3
Nov 28 07:57:43 np0005538515.localdomain sudo[43348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:43 np0005538515.localdomain python3[43350]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:43 np0005538515.localdomain sudo[43348]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:43 np0005538515.localdomain sudo[43391]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqrgqspllunvzaipozjcmbdgyyapscbe ; /usr/bin/python3
Nov 28 07:57:43 np0005538515.localdomain sudo[43391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:43 np0005538515.localdomain python3[43393]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316663.0663264-75417-217519372544593/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:43 np0005538515.localdomain sudo[43391]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:44 np0005538515.localdomain sudo[43453]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyguehloqehbgkfuqdwsdcnbfrjirror ; /usr/bin/python3
Nov 28 07:57:44 np0005538515.localdomain sudo[43453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:44 np0005538515.localdomain python3[43455]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:44 np0005538515.localdomain sudo[43453]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:44 np0005538515.localdomain sudo[43496]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjjrmgffmgnrfugcgviqwuzulctheril ; /usr/bin/python3
Nov 28 07:57:44 np0005538515.localdomain sudo[43496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:44 np0005538515.localdomain python3[43498]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316663.9738405-75467-161828163473985/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:44 np0005538515.localdomain sudo[43496]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:45 np0005538515.localdomain sudo[43558]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdufqwxjiavgecatzlqxrtlmvvaqdzqb ; /usr/bin/python3
Nov 28 07:57:45 np0005538515.localdomain sudo[43558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:45 np0005538515.localdomain python3[43560]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:57:45 np0005538515.localdomain sudo[43558]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:45 np0005538515.localdomain sudo[43601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkdjjkqsnxeyzqzkbrloxpyvsoeqeicr ; /usr/bin/python3
Nov 28 07:57:45 np0005538515.localdomain sudo[43601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:46 np0005538515.localdomain python3[43603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316664.8835588-75499-83727017972286/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:46 np0005538515.localdomain sudo[43601]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:46 np0005538515.localdomain sudo[43631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laymbjeiubrfmfugxzsqvazxmfebdhws ; /usr/bin/python3
Nov 28 07:57:46 np0005538515.localdomain sudo[43631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:46 np0005538515.localdomain python3[43633]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:46 np0005538515.localdomain sudo[43631]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:46 np0005538515.localdomain sudo[43696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvadjgfdelwxunjihbexpuenommvpbtv ; /usr/bin/python3
Nov 28 07:57:46 np0005538515.localdomain sudo[43696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:47 np0005538515.localdomain python3[43698]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:57:47 np0005538515.localdomain sudo[43696]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:47 np0005538515.localdomain sudo[43713]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwqfeyuvvozpaaqzpfacfwisfqgaedmg ; /usr/bin/python3
Nov 28 07:57:47 np0005538515.localdomain sudo[43713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:47 np0005538515.localdomain python3[43715]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:47 np0005538515.localdomain sudo[43713]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:47 np0005538515.localdomain sudo[43730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtdgmtvzictuioslgxtdzomhidsnodnp ; /usr/bin/python3
Nov 28 07:57:47 np0005538515.localdomain sudo[43730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:47 np0005538515.localdomain python3[43732]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:48 np0005538515.localdomain sudo[43730]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:48 np0005538515.localdomain sudo[43749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjdugsooqxrnmqntmajliiatikvftzyh ; /usr/bin/python3
Nov 28 07:57:48 np0005538515.localdomain sudo[43749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:48 np0005538515.localdomain python3[43751]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:48 np0005538515.localdomain sudo[43749]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:48 np0005538515.localdomain sudo[43765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whoqtwlrtleasmkuvqulaczxtvydblgn ; /usr/bin/python3
Nov 28 07:57:48 np0005538515.localdomain sudo[43765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:48 np0005538515.localdomain python3[43767]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:48 np0005538515.localdomain sudo[43765]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:48 np0005538515.localdomain sudo[43781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdawhddosvhshwraxifaqzdvutxgnwdk ; /usr/bin/python3
Nov 28 07:57:48 np0005538515.localdomain sudo[43781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:49 np0005538515.localdomain python3[43783]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:49 np0005538515.localdomain sudo[43781]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:49 np0005538515.localdomain sudo[43797]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otqeymkskuvzggcsqbcxtuawxtymnqhj ; /usr/bin/python3
Nov 28 07:57:49 np0005538515.localdomain sudo[43797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:49 np0005538515.localdomain python3[43799]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 07:57:50 np0005538515.localdomain sudo[43797]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:50 np0005538515.localdomain sudo[43817]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krkaguskmzlyjbblrbcihrmhrfsqdssj ; /usr/bin/python3
Nov 28 07:57:50 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Nov 28 07:57:50 np0005538515.localdomain sudo[43817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:50 np0005538515.localdomain python3[43819]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:57:51 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:57:51 np0005538515.localdomain sudo[43817]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:51 np0005538515.localdomain sudo[43838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysaigzplcoxqldxcpmwffmrljakpqzyb ; /usr/bin/python3
Nov 28 07:57:51 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 28 07:57:51 np0005538515.localdomain sudo[43838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:51 np0005538515.localdomain python3[43840]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:57:52 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:57:52 np0005538515.localdomain sudo[43838]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:52 np0005538515.localdomain sudo[43859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvnqrjvugawpvwmzuthcryfgmoianeez ; /usr/bin/python3
Nov 28 07:57:52 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 28 07:57:52 np0005538515.localdomain sudo[43859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:53 np0005538515.localdomain python3[43861]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:57:53 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:57:54 np0005538515.localdomain sudo[43859]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:54 np0005538515.localdomain sudo[43880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulwvjfmraxqmlirvckwbbzxfmebjflrn ; /usr/bin/python3
Nov 28 07:57:54 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 28 07:57:54 np0005538515.localdomain sudo[43880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:54 np0005538515.localdomain python3[43882]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:54 np0005538515.localdomain sudo[43880]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:54 np0005538515.localdomain sudo[43896]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouvxxawctxpceoxemomllfpojdbvdcic ; /usr/bin/python3
Nov 28 07:57:54 np0005538515.localdomain sudo[43896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:54 np0005538515.localdomain python3[43898]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:54 np0005538515.localdomain sudo[43896]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:55 np0005538515.localdomain sudo[43912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuzesarbcsrnokwzjgixfwkrclxzwmmj ; /usr/bin/python3
Nov 28 07:57:55 np0005538515.localdomain sudo[43912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:55 np0005538515.localdomain python3[43914]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:57:55 np0005538515.localdomain sudo[43912]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:55 np0005538515.localdomain sudo[43928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crkvldndtovsdeqdrbixyuftneoiyrbo ; /usr/bin/python3
Nov 28 07:57:55 np0005538515.localdomain sudo[43928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:55 np0005538515.localdomain python3[43930]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:57:55 np0005538515.localdomain sudo[43928]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:55 np0005538515.localdomain sudo[43944]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltndwnedxijyrgyprhqhnxbxnnujsvfu ; /usr/bin/python3
Nov 28 07:57:55 np0005538515.localdomain sudo[43944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:56 np0005538515.localdomain python3[43946]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:57:56 np0005538515.localdomain sudo[43944]: pam_unix(sudo:session): session closed for user root
Nov 28 07:57:56 np0005538515.localdomain sudo[43961]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bztsqetxtgvkgoylkhfmldiukamsqgyx ; /usr/bin/python3
Nov 28 07:57:56 np0005538515.localdomain sudo[43961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:57:56 np0005538515.localdomain python3[43963]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:00 np0005538515.localdomain sudo[43961]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:00 np0005538515.localdomain sudo[43978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmwgqwatiwksunlrahohzkjhrifygriq ; /usr/bin/python3
Nov 28 07:58:00 np0005538515.localdomain sudo[43978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:00 np0005538515.localdomain python3[43980]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:00 np0005538515.localdomain sudo[43978]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:00 np0005538515.localdomain sudo[44026]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzvrdxsqerbouftkclkddzpeitelizku ; /usr/bin/python3
Nov 28 07:58:00 np0005538515.localdomain sudo[44026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:00 np0005538515.localdomain python3[44028]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:00 np0005538515.localdomain sudo[44026]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:01 np0005538515.localdomain sudo[44069]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkxqbjwvbexfxxjjyamrdlrckzdyddvi ; /usr/bin/python3
Nov 28 07:58:01 np0005538515.localdomain sudo[44069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:01 np0005538515.localdomain python3[44071]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316680.6064835-76419-180082442120474/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:01 np0005538515.localdomain sudo[44069]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:01 np0005538515.localdomain sudo[44099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jclkoazwtpqzdetkwewyphikbhfnfpdg ; /usr/bin/python3
Nov 28 07:58:01 np0005538515.localdomain sudo[44099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:02 np0005538515.localdomain python3[44101]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:58:02 np0005538515.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 07:58:02 np0005538515.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 07:58:02 np0005538515.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 07:58:02 np0005538515.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 07:58:02 np0005538515.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 28 07:58:02 np0005538515.localdomain kernel: Bridge firewalling registered
Nov 28 07:58:02 np0005538515.localdomain systemd-modules-load[44104]: Inserted module 'br_netfilter'
Nov 28 07:58:02 np0005538515.localdomain systemd-modules-load[44104]: Module 'msr' is built in
Nov 28 07:58:02 np0005538515.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 07:58:02 np0005538515.localdomain sudo[44099]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:02 np0005538515.localdomain sudo[44153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkzgjbrkggoguxqbuqaeuwodplrpbbip ; /usr/bin/python3
Nov 28 07:58:02 np0005538515.localdomain sudo[44153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:03 np0005538515.localdomain python3[44155]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:03 np0005538515.localdomain sudo[44153]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:03 np0005538515.localdomain sudo[44196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pupzmhbvmtalxnojurxzugjqlwladxln ; /usr/bin/python3
Nov 28 07:58:03 np0005538515.localdomain sudo[44196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:03 np0005538515.localdomain python3[44198]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316682.8042061-76494-187475773523860/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:03 np0005538515.localdomain sudo[44196]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:03 np0005538515.localdomain sudo[44226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toesobgobobwslqtfbixtfsytsmhzfmd ; /usr/bin/python3
Nov 28 07:58:03 np0005538515.localdomain sudo[44226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:03 np0005538515.localdomain python3[44228]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:03 np0005538515.localdomain sudo[44226]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:04 np0005538515.localdomain sudo[44243]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-purtkmukkgwxdzwbyfjhnrwqenfdkygw ; /usr/bin/python3
Nov 28 07:58:04 np0005538515.localdomain sudo[44243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:04 np0005538515.localdomain python3[44245]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:04 np0005538515.localdomain sudo[44243]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:04 np0005538515.localdomain sudo[44261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouroenhtxvfkrqleeuqdkdmqywglcdwf ; /usr/bin/python3
Nov 28 07:58:04 np0005538515.localdomain sudo[44261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:04 np0005538515.localdomain python3[44263]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:04 np0005538515.localdomain sudo[44261]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:04 np0005538515.localdomain sudo[44279]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlkseqmjhzxeyddnwltidwwtrtufvuma ; /usr/bin/python3
Nov 28 07:58:04 np0005538515.localdomain sudo[44279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:04 np0005538515.localdomain python3[44281]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:04 np0005538515.localdomain sudo[44279]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:04 np0005538515.localdomain sudo[44296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxyqtyjrllvzutpqfgrfowaszycojzfz ; /usr/bin/python3
Nov 28 07:58:04 np0005538515.localdomain sudo[44296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:05 np0005538515.localdomain python3[44298]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:05 np0005538515.localdomain sudo[44296]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:05 np0005538515.localdomain sudo[44313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deurxmsllxhtvuuozyodrhrxycofhzvc ; /usr/bin/python3
Nov 28 07:58:05 np0005538515.localdomain sudo[44313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:05 np0005538515.localdomain python3[44315]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:05 np0005538515.localdomain sudo[44313]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:05 np0005538515.localdomain sudo[44330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rahlzkkxoykgefflasaxxgikrnyyrxdj ; /usr/bin/python3
Nov 28 07:58:05 np0005538515.localdomain sudo[44330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:05 np0005538515.localdomain python3[44332]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:05 np0005538515.localdomain sudo[44330]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:05 np0005538515.localdomain sudo[44348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkypmrmgedewtdnsphfjkvgdguvsdtmz ; /usr/bin/python3
Nov 28 07:58:05 np0005538515.localdomain sudo[44348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538515.localdomain python3[44350]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538515.localdomain sudo[44348]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:06 np0005538515.localdomain sudo[44366]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jiaamzlizpmbimauzlqjhovetvqbdesr ; /usr/bin/python3
Nov 28 07:58:06 np0005538515.localdomain sudo[44366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538515.localdomain python3[44368]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538515.localdomain sudo[44366]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:06 np0005538515.localdomain sudo[44384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzjjncmxfgxxjyhycpxxboomfdhfdobg ; /usr/bin/python3
Nov 28 07:58:06 np0005538515.localdomain sudo[44384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:06 np0005538515.localdomain python3[44386]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:06 np0005538515.localdomain sudo[44384]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:06 np0005538515.localdomain sudo[44402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtsakfvfkwsuhzwdvqiuojvaglufhzzr ; /usr/bin/python3
Nov 28 07:58:06 np0005538515.localdomain sudo[44402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538515.localdomain python3[44404]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538515.localdomain sudo[44402]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:07 np0005538515.localdomain sudo[44420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbkwgtfolbldohomehkvdrzetlffwgfp ; /usr/bin/python3
Nov 28 07:58:07 np0005538515.localdomain sudo[44420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538515.localdomain python3[44422]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538515.localdomain sudo[44420]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:07 np0005538515.localdomain sudo[44438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqbqqeirjgybizmxubapzjcelswjevwq ; /usr/bin/python3
Nov 28 07:58:07 np0005538515.localdomain sudo[44438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538515.localdomain python3[44440]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538515.localdomain sudo[44438]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:07 np0005538515.localdomain sudo[44456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyswckyhaleqotyugssupydpxrkyufsp ; /usr/bin/python3
Nov 28 07:58:07 np0005538515.localdomain sudo[44456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:07 np0005538515.localdomain python3[44458]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:07 np0005538515.localdomain sudo[44456]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538515.localdomain sudo[44473]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diqidmfqbfqbrskqgqepbdvdjrillkwm ; /usr/bin/python3
Nov 28 07:58:08 np0005538515.localdomain sudo[44473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:08 np0005538515.localdomain python3[44475]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:08 np0005538515.localdomain sudo[44473]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538515.localdomain sudo[44490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggefdsqytqnvpjpscckwnwailfoljzpe ; /usr/bin/python3
Nov 28 07:58:08 np0005538515.localdomain sudo[44490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:08 np0005538515.localdomain python3[44492]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:08 np0005538515.localdomain sudo[44490]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538515.localdomain sudo[44507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqstlfufuyiqgjnoqvoohcxwryfioknh ; /usr/bin/python3
Nov 28 07:58:08 np0005538515.localdomain sudo[44507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:08 np0005538515.localdomain python3[44509]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:08 np0005538515.localdomain sudo[44507]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:08 np0005538515.localdomain sudo[44524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtmikostkujugikigtnyxbrkdchvmbyh ; /usr/bin/python3
Nov 28 07:58:08 np0005538515.localdomain sudo[44524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:09 np0005538515.localdomain python3[44526]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 07:58:09 np0005538515.localdomain sudo[44524]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:09 np0005538515.localdomain sudo[44542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgwvervmxsiyqhfosdbrrxhsacakdkbg ; /usr/bin/python3
Nov 28 07:58:09 np0005538515.localdomain sudo[44542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:09 np0005538515.localdomain python3[44544]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 07:58:09 np0005538515.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 07:58:09 np0005538515.localdomain systemd[1]: Stopped Apply Kernel Variables.
Nov 28 07:58:09 np0005538515.localdomain systemd[1]: Stopping Apply Kernel Variables...
Nov 28 07:58:09 np0005538515.localdomain systemd[1]: Starting Apply Kernel Variables...
Nov 28 07:58:09 np0005538515.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 07:58:09 np0005538515.localdomain systemd[1]: Finished Apply Kernel Variables.
Nov 28 07:58:09 np0005538515.localdomain sudo[44542]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:09 np0005538515.localdomain sudo[44562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmgyoqvnfezzcskbsdjzzawehmdbkdvl ; /usr/bin/python3
Nov 28 07:58:09 np0005538515.localdomain sudo[44562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:10 np0005538515.localdomain python3[44564]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:10 np0005538515.localdomain sudo[44562]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:10 np0005538515.localdomain sudo[44578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwgjdiropturimvwgcnigimehqynmkdt ; /usr/bin/python3
Nov 28 07:58:10 np0005538515.localdomain sudo[44578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:10 np0005538515.localdomain python3[44580]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:10 np0005538515.localdomain sudo[44578]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:10 np0005538515.localdomain sudo[44594]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjjgnlvsyrdhdmcgjuowvonghxejjgsy ; /usr/bin/python3
Nov 28 07:58:10 np0005538515.localdomain sudo[44594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:10 np0005538515.localdomain python3[44596]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:10 np0005538515.localdomain sudo[44594]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:10 np0005538515.localdomain sudo[44610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soyktmuidfmoimvbiewgpqahnjarhsti ; /usr/bin/python3
Nov 28 07:58:10 np0005538515.localdomain sudo[44610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538515.localdomain python3[44612]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:58:11 np0005538515.localdomain sudo[44610]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538515.localdomain sudo[44626]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvarjctndicqwwxwuzfvapqurozefkfk ; /usr/bin/python3
Nov 28 07:58:11 np0005538515.localdomain sudo[44626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538515.localdomain sudo[44628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:58:11 np0005538515.localdomain sudo[44628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:11 np0005538515.localdomain sudo[44628]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538515.localdomain sudo[44644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 07:58:11 np0005538515.localdomain sudo[44644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:11 np0005538515.localdomain python3[44639]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:11 np0005538515.localdomain sudo[44626]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538515.localdomain sudo[44672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwfgnfjycwojefrxbeyprnitdpsjhuwx ; /usr/bin/python3
Nov 28 07:58:11 np0005538515.localdomain sudo[44672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:11 np0005538515.localdomain python3[44674]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:11 np0005538515.localdomain sudo[44672]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538515.localdomain sudo[44644]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:11 np0005538515.localdomain sudo[44709]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbeubhaefyxxjkghkukxryhqcplrkjdi ; /usr/bin/python3
Nov 28 07:58:11 np0005538515.localdomain sudo[44709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:12 np0005538515.localdomain python3[44711]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:12 np0005538515.localdomain sudo[44709]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538515.localdomain sudo[44712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:58:12 np0005538515.localdomain sudo[44712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:12 np0005538515.localdomain sudo[44712]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538515.localdomain sudo[44727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:58:12 np0005538515.localdomain sudo[44727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:12 np0005538515.localdomain sudo[44754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvnwmhygzlaardiaxmvttzeyufxhmzis ; /usr/bin/python3
Nov 28 07:58:12 np0005538515.localdomain sudo[44754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:12 np0005538515.localdomain python3[44757]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:12 np0005538515.localdomain sudo[44754]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538515.localdomain sudo[44782]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qknypukdwremxkzgyiwqwtruglvqzcxr ; /usr/bin/python3
Nov 28 07:58:12 np0005538515.localdomain sudo[44782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:12 np0005538515.localdomain python3[44786]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:12 np0005538515.localdomain sudo[44782]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538515.localdomain sudo[44727]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:12 np0005538515.localdomain sudo[44849]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxujvwtvjcwxeaclavbevgxumbwldggi ; /usr/bin/python3
Nov 28 07:58:12 np0005538515.localdomain sudo[44849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:13 np0005538515.localdomain python3[44851]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:13 np0005538515.localdomain sudo[44849]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:13 np0005538515.localdomain sudo[44879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:58:13 np0005538515.localdomain sudo[44879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:58:13 np0005538515.localdomain sudo[44879]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:13 np0005538515.localdomain sudo[44906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdpsxphoslnjlfpqctwhzekjvnlbwvnj ; /usr/bin/python3
Nov 28 07:58:13 np0005538515.localdomain sudo[44906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:13 np0005538515.localdomain python3[44909]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316692.8450131-76841-145110471484068/source _original_basename=tmp_0jdipv_ follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:13 np0005538515.localdomain sudo[44906]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:13 np0005538515.localdomain sudo[44937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjayuvubqzozzvbvxkssygxcxhzhlycx ; /usr/bin/python3
Nov 28 07:58:13 np0005538515.localdomain sudo[44937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:13 np0005538515.localdomain python3[44939]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:14 np0005538515.localdomain sudo[44937]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:15 np0005538515.localdomain sudo[44954]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkaxawuawdwepcmqkzuclrfbwlmyopbb ; /usr/bin/python3
Nov 28 07:58:15 np0005538515.localdomain sudo[44954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:16 np0005538515.localdomain python3[44956]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:16 np0005538515.localdomain sudo[44954]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:16 np0005538515.localdomain sudo[45002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urnhokmxrzetkwhfwqgprfwallphlrkz ; /usr/bin/python3
Nov 28 07:58:16 np0005538515.localdomain sudo[45002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:16 np0005538515.localdomain python3[45004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:16 np0005538515.localdomain sudo[45002]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:16 np0005538515.localdomain sudo[45045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paiqsnaqbxmektgqcthikzpwmuamhnyt ; /usr/bin/python3
Nov 28 07:58:16 np0005538515.localdomain sudo[45045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:17 np0005538515.localdomain python3[45047]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316696.285853-77052-28241370299002/source _original_basename=tmpqkpgen63 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:17 np0005538515.localdomain sudo[45045]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:17 np0005538515.localdomain sudo[45075]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtaefspmglczqrlqcdqvcpzouydxembz ; /usr/bin/python3
Nov 28 07:58:17 np0005538515.localdomain sudo[45075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:17 np0005538515.localdomain python3[45077]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:17 np0005538515.localdomain sudo[45075]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:17 np0005538515.localdomain sudo[45091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktserljakfzdpjhnbbjjqbgizrrdqepr ; /usr/bin/python3
Nov 28 07:58:17 np0005538515.localdomain sudo[45091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:17 np0005538515.localdomain python3[45093]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:17 np0005538515.localdomain sudo[45091]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:18 np0005538515.localdomain sudo[45107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgreojabltaoewrlgdhxlurwkfjoscus ; /usr/bin/python3
Nov 28 07:58:18 np0005538515.localdomain sudo[45107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538515.localdomain python3[45109]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:18 np0005538515.localdomain sudo[45107]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:18 np0005538515.localdomain sudo[45123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usirqpmuhoqpkiyqmfdvuxmjazlmxaaq ; /usr/bin/python3
Nov 28 07:58:18 np0005538515.localdomain sudo[45123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538515.localdomain python3[45125]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:18 np0005538515.localdomain sudo[45123]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:18 np0005538515.localdomain sudo[45139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yadsipusayutzyfsmjevermteevuxwto ; /usr/bin/python3
Nov 28 07:58:18 np0005538515.localdomain sudo[45139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:18 np0005538515.localdomain python3[45141]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:18 np0005538515.localdomain sudo[45139]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538515.localdomain sudo[45155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmwhzuumoyzmlbqfdibxtyffsyfpjmcn ; /usr/bin/python3
Nov 28 07:58:19 np0005538515.localdomain sudo[45155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:19 np0005538515.localdomain python3[45157]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:19 np0005538515.localdomain sudo[45155]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538515.localdomain sudo[45171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzuflfuanzbzriroowhexzoewgrczkqa ; /usr/bin/python3
Nov 28 07:58:19 np0005538515.localdomain sudo[45171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:19 np0005538515.localdomain python3[45173]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:19 np0005538515.localdomain sudo[45171]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538515.localdomain sudo[45187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erljuwozwazoacvwhzyontortqvxzita ; /usr/bin/python3
Nov 28 07:58:19 np0005538515.localdomain sudo[45187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:19 np0005538515.localdomain python3[45189]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:19 np0005538515.localdomain sudo[45187]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:19 np0005538515.localdomain sudo[45203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kltzyamyzvxwnenvzhcurrnifbjsbhos ; /usr/bin/python3
Nov 28 07:58:19 np0005538515.localdomain sudo[45203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:20 np0005538515.localdomain python3[45205]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:20 np0005538515.localdomain sudo[45203]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:20 np0005538515.localdomain sudo[45219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkskrlhcljitwkmwuytzplveoydiplcy ; /usr/bin/python3
Nov 28 07:58:20 np0005538515.localdomain sudo[45219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:20 np0005538515.localdomain python3[45221]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Nov 28 07:58:20 np0005538515.localdomain groupadd[45222]: group added to /etc/group: name=qemu, GID=107
Nov 28 07:58:20 np0005538515.localdomain groupadd[45222]: group added to /etc/gshadow: name=qemu
Nov 28 07:58:20 np0005538515.localdomain groupadd[45222]: new group: name=qemu, GID=107
Nov 28 07:58:20 np0005538515.localdomain sudo[45219]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:20 np0005538515.localdomain sudo[45241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azsscthdranewayixiklvitvuntkwbto ; /usr/bin/python3
Nov 28 07:58:20 np0005538515.localdomain sudo[45241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:20 np0005538515.localdomain python3[45243]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538515.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 07:58:20 np0005538515.localdomain useradd[45245]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Nov 28 07:58:21 np0005538515.localdomain sudo[45241]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:21 np0005538515.localdomain sudo[45265]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpjcsdmlxokemjoiemwjemfkvlptzjby ; /usr/bin/python3
Nov 28 07:58:21 np0005538515.localdomain sudo[45265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:21 np0005538515.localdomain python3[45267]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Nov 28 07:58:21 np0005538515.localdomain sudo[45265]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:21 np0005538515.localdomain sudo[45281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgdgwshdtwevjtwkraoidqaqmddorrus ; /usr/bin/python3
Nov 28 07:58:21 np0005538515.localdomain sudo[45281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:21 np0005538515.localdomain python3[45283]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:21 np0005538515.localdomain sudo[45281]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:22 np0005538515.localdomain sudo[45330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvzjrkyhgwontlkwyfdqnnxvgpwyooej ; /usr/bin/python3
Nov 28 07:58:22 np0005538515.localdomain sudo[45330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:22 np0005538515.localdomain python3[45332]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:22 np0005538515.localdomain sudo[45330]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:22 np0005538515.localdomain sudo[45373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rozisikfnpbplgntphtwttmciliqihdx ; /usr/bin/python3
Nov 28 07:58:22 np0005538515.localdomain sudo[45373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:22 np0005538515.localdomain python3[45375]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316702.0181687-77376-265437569415924/source _original_basename=tmpu0rjcyih follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:22 np0005538515.localdomain sudo[45373]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:22 np0005538515.localdomain sudo[45403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niydhomilqmvtoyugztnzrddttjmagup ; /usr/bin/python3
Nov 28 07:58:22 np0005538515.localdomain sudo[45403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:23 np0005538515.localdomain python3[45405]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 07:58:23 np0005538515.localdomain sudo[45403]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:24 np0005538515.localdomain sudo[45495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnkckdjbiworvtmvqxfahwdxuzrmfcek ; /usr/bin/python3
Nov 28 07:58:24 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 28 07:58:24 np0005538515.localdomain sudo[45495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:24 np0005538515.localdomain python3[45497]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:24 np0005538515.localdomain sudo[45495]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:24 np0005538515.localdomain sudo[45511]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftrhowiookdgveoxdctcybccjurkufeh ; /usr/bin/python3
Nov 28 07:58:24 np0005538515.localdomain sudo[45511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:24 np0005538515.localdomain python3[45513]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:24 np0005538515.localdomain sudo[45511]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:24 np0005538515.localdomain sudo[45527]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pllpufozygaikexjcdyzudqbcjcizelm ; /usr/bin/python3
Nov 28 07:58:24 np0005538515.localdomain sudo[45527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:25 np0005538515.localdomain python3[45529]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Nov 28 07:58:25 np0005538515.localdomain sudo[45527]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:26 np0005538515.localdomain sudo[45548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-midcjhcawkpbzkxrgxhwdxyhcbdaesrn ; /usr/bin/python3
Nov 28 07:58:26 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 28 07:58:26 np0005538515.localdomain sudo[45548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:26 np0005538515.localdomain python3[45550]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:29 np0005538515.localdomain sudo[45548]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:29 np0005538515.localdomain sudo[45565]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqqvmuvmdrsyoyvkcqnelzikhgzdamrd ; /usr/bin/python3
Nov 28 07:58:29 np0005538515.localdomain sudo[45565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:29 np0005538515.localdomain python3[45567]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 07:58:29 np0005538515.localdomain sudo[45565]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:30 np0005538515.localdomain sudo[45626]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysvxtbaoifquwmquwjbrijgwztpuwyfw ; /usr/bin/python3
Nov 28 07:58:30 np0005538515.localdomain sudo[45626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:30 np0005538515.localdomain python3[45628]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:30 np0005538515.localdomain sudo[45626]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:30 np0005538515.localdomain sudo[45642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxutaqfihfertlhsdxkjtuxmjjtoxgeo ; /usr/bin/python3
Nov 28 07:58:30 np0005538515.localdomain sudo[45642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:30 np0005538515.localdomain python3[45644]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:30 np0005538515.localdomain sudo[45642]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:31 np0005538515.localdomain sudo[45702]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjhkwolkiplrisczugtzqlztuuzmkovk ; /usr/bin/python3
Nov 28 07:58:31 np0005538515.localdomain sudo[45702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:31 np0005538515.localdomain python3[45704]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:31 np0005538515.localdomain sudo[45702]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:31 np0005538515.localdomain sudo[45745]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txdntioqjoicsokczznbmtkkbezbsttk ; /usr/bin/python3
Nov 28 07:58:31 np0005538515.localdomain sudo[45745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:31 np0005538515.localdomain python3[45747]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316711.00285-77866-91826337183377/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=2f5a399fbfa982ef0876ce5d0ff30a44474c412f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:31 np0005538515.localdomain sudo[45745]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:32 np0005538515.localdomain sudo[45807]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioqmoejsqlcnghlubhwocbpenoujlamf ; /usr/bin/python3
Nov 28 07:58:32 np0005538515.localdomain sudo[45807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:32 np0005538515.localdomain python3[45809]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:32 np0005538515.localdomain sudo[45807]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:32 np0005538515.localdomain sudo[45852]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbucozudxkptpdtoroghdymoauexldzf ; /usr/bin/python3
Nov 28 07:58:32 np0005538515.localdomain sudo[45852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:32 np0005538515.localdomain python3[45854]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316711.9728777-77904-47598495393176/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:32 np0005538515.localdomain sudo[45852]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538515.localdomain sudo[45882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwcqbbphcqujsxaproxyzkrcqspoidvv ; /usr/bin/python3
Nov 28 07:58:33 np0005538515.localdomain sudo[45882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:33 np0005538515.localdomain python3[45884]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:33 np0005538515.localdomain sudo[45882]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538515.localdomain sudo[45898]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxfqpdwqphkycpbcxuvieglqkdshfmun ; /usr/bin/python3
Nov 28 07:58:33 np0005538515.localdomain sudo[45898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:33 np0005538515.localdomain python3[45900]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:33 np0005538515.localdomain sudo[45898]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538515.localdomain sudo[45914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqheyxmiupfcqaehqvvisqofawyztyio ; /usr/bin/python3
Nov 28 07:58:33 np0005538515.localdomain sudo[45914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:33 np0005538515.localdomain python3[45916]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:33 np0005538515.localdomain sudo[45914]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:33 np0005538515.localdomain sudo[45930]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnxjtnjorbxhoyjiujwcowyucftyydxr ; /usr/bin/python3
Nov 28 07:58:33 np0005538515.localdomain sudo[45930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:34 np0005538515.localdomain python3[45932]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:58:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3405 writes, 16K keys, 3405 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3405 writes, 206 syncs, 16.53 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3405 writes, 16K keys, 3405 commit groups, 1.0 writes per commit group, ingest: 15.31 MB, 0.03 MB/s
                                                          Interval WAL: 3405 writes, 206 syncs, 16.53 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:58:34 np0005538515.localdomain sudo[45930]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:34 np0005538515.localdomain sudo[45978]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oinemdwvemuqzgxdwndcbppvwzlknumn ; /usr/bin/python3
Nov 28 07:58:34 np0005538515.localdomain sudo[45978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:35 np0005538515.localdomain python3[45980]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:35 np0005538515.localdomain sudo[45978]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:35 np0005538515.localdomain sudo[46021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmdnqryxwjwjpszzdeykldrdffmdlktx ; /usr/bin/python3
Nov 28 07:58:35 np0005538515.localdomain sudo[46021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:35 np0005538515.localdomain python3[46023]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316714.6777782-78040-191918879047508/source _original_basename=tmp_nbd7txt follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:35 np0005538515.localdomain sudo[46021]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:35 np0005538515.localdomain sudo[46051]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdnksaarkmpthzjdumotdailcptomewa ; /usr/bin/python3
Nov 28 07:58:35 np0005538515.localdomain sudo[46051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:35 np0005538515.localdomain python3[46053]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:35 np0005538515.localdomain sudo[46051]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:35 np0005538515.localdomain sudo[46067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xozhdaknppemfqklggdahpmtlmcewlhq ; /usr/bin/python3
Nov 28 07:58:35 np0005538515.localdomain sudo[46067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:36 np0005538515.localdomain python3[46069]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:36 np0005538515.localdomain sudo[46067]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:36 np0005538515.localdomain sudo[46083]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sljgchcupqmetmtxsewfvyhlxwygwmyz ; /usr/bin/python3
Nov 28 07:58:36 np0005538515.localdomain sudo[46083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:36 np0005538515.localdomain python3[46085]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 07:58:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Cumulative writes: 3247 writes, 16K keys, 3247 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3247 writes, 139 syncs, 23.36 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3247 writes, 16K keys, 3247 commit groups, 1.0 writes per commit group, ingest: 14.62 MB, 0.02 MB/s
                                                          Interval WAL: 3247 writes, 139 syncs, 23.36 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 07:58:39 np0005538515.localdomain sudo[46083]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:40 np0005538515.localdomain sudo[46132]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvkzwpwvggbasdmwhxcfeedaonnivbou ; /usr/bin/python3
Nov 28 07:58:40 np0005538515.localdomain sudo[46132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:40 np0005538515.localdomain python3[46134]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:40 np0005538515.localdomain sudo[46132]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:40 np0005538515.localdomain sudo[46177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftuisfgyxhcgamzfpmgvcfsuiuvwlzvn ; /usr/bin/python3
Nov 28 07:58:40 np0005538515.localdomain sudo[46177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:41 np0005538515.localdomain python3[46179]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316720.3253143-78346-266047300254133/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:41 np0005538515.localdomain sudo[46177]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:41 np0005538515.localdomain sudo[46208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmhzoowcnzcvuipwrvxamlugxvajshtx ; /usr/bin/python3
Nov 28 07:58:41 np0005538515.localdomain sudo[46208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:41 np0005538515.localdomain python3[46210]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 07:58:41 np0005538515.localdomain sshd[1128]: Received signal 15; terminating.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: sshd.service: Consumed 14.274s CPU time, read 2.1M from disk, written 428.0K to disk.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 07:58:41 np0005538515.localdomain sshd[46214]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:58:41 np0005538515.localdomain sshd[46214]: Server listening on 0.0.0.0 port 22.
Nov 28 07:58:41 np0005538515.localdomain sshd[46214]: Server listening on :: port 22.
Nov 28 07:58:41 np0005538515.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 07:58:41 np0005538515.localdomain sudo[46208]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:41 np0005538515.localdomain sudo[46228]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipzhsenvxybclqqyyzaawyyqvyvkecry ; /usr/bin/python3
Nov 28 07:58:41 np0005538515.localdomain sudo[46228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:42 np0005538515.localdomain python3[46230]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:42 np0005538515.localdomain sudo[46228]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:43 np0005538515.localdomain sudo[46246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvvsqchbiesvahdnavajhsbnkxkqebar ; /usr/bin/python3
Nov 28 07:58:43 np0005538515.localdomain sudo[46246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:43 np0005538515.localdomain python3[46248]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:43 np0005538515.localdomain sudo[46246]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:44 np0005538515.localdomain sudo[46264]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttfdeyvkfndjyvvkxtvautscgwqyhnrl ; /usr/bin/python3
Nov 28 07:58:44 np0005538515.localdomain sudo[46264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:44 np0005538515.localdomain python3[46266]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:58:46 np0005538515.localdomain sudo[46264]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:47 np0005538515.localdomain sudo[46313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfmcbiglnmcoyvtwgftyfmzsxbzmqmzm ; /usr/bin/python3
Nov 28 07:58:47 np0005538515.localdomain sudo[46313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:47 np0005538515.localdomain python3[46315]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:47 np0005538515.localdomain sudo[46313]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:48 np0005538515.localdomain sudo[46331]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igiggneerphishpsmaxglhoulbnkmzqc ; /usr/bin/python3
Nov 28 07:58:48 np0005538515.localdomain sudo[46331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:48 np0005538515.localdomain python3[46333]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:48 np0005538515.localdomain sudo[46331]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:48 np0005538515.localdomain sudo[46361]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqzktywjawwtdixjgjansmdpaiwmmcho ; /usr/bin/python3
Nov 28 07:58:48 np0005538515.localdomain sudo[46361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:49 np0005538515.localdomain python3[46363]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:58:49 np0005538515.localdomain sudo[46361]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:49 np0005538515.localdomain sudo[46411]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiaikimscfzxdexectwtqerkltprqfyi ; /usr/bin/python3
Nov 28 07:58:49 np0005538515.localdomain sudo[46411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:49 np0005538515.localdomain python3[46413]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:49 np0005538515.localdomain sudo[46411]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:49 np0005538515.localdomain sudo[46429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrwbbqvyxnivpnolrsajdaepvvwukgws ; /usr/bin/python3
Nov 28 07:58:49 np0005538515.localdomain sudo[46429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:49 np0005538515.localdomain python3[46431]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:49 np0005538515.localdomain sudo[46429]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:50 np0005538515.localdomain sudo[46459]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzifbjaeianhvlvdwlokswahsbmhtfnx ; /usr/bin/python3
Nov 28 07:58:50 np0005538515.localdomain sudo[46459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:50 np0005538515.localdomain python3[46461]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 07:58:50 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:58:50 np0005538515.localdomain systemd-sysv-generator[46489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:58:50 np0005538515.localdomain systemd-rc-local-generator[46486]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:58:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:58:50 np0005538515.localdomain systemd[1]: Starting chronyd online sources service...
Nov 28 07:58:50 np0005538515.localdomain chronyc[46500]: 200 OK
Nov 28 07:58:50 np0005538515.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Nov 28 07:58:50 np0005538515.localdomain systemd[1]: Finished chronyd online sources service.
Nov 28 07:58:50 np0005538515.localdomain sudo[46459]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:51 np0005538515.localdomain sudo[46514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfkgupnyvibyguuttxtsrzroplgsbnda ; /usr/bin/python3
Nov 28 07:58:51 np0005538515.localdomain sudo[46514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:51 np0005538515.localdomain python3[46516]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:51 np0005538515.localdomain chronyd[26579]: System clock was stepped by -0.000044 seconds
Nov 28 07:58:51 np0005538515.localdomain sudo[46514]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:51 np0005538515.localdomain sudo[46531]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcqstqufaewtpseuoadjgztrqoggjqpc ; /usr/bin/python3
Nov 28 07:58:51 np0005538515.localdomain sudo[46531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:51 np0005538515.localdomain python3[46533]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:51 np0005538515.localdomain sudo[46531]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:51 np0005538515.localdomain sudo[46548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfgcdroqthxojtshztvkaytxrqnadpxr ; /usr/bin/python3
Nov 28 07:58:51 np0005538515.localdomain sudo[46548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:52 np0005538515.localdomain python3[46550]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:52 np0005538515.localdomain chronyd[26579]: System clock was stepped by 0.000000 seconds
Nov 28 07:58:52 np0005538515.localdomain sudo[46548]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:52 np0005538515.localdomain sudo[46565]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abqqlrzqlxfxolfklbrizllirjlwetnp ; /usr/bin/python3
Nov 28 07:58:52 np0005538515.localdomain sudo[46565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:52 np0005538515.localdomain python3[46567]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:52 np0005538515.localdomain sudo[46565]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:52 np0005538515.localdomain sudo[46582]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irhohdghdvaavgoypeyyiroqglqannea ; /usr/bin/python3
Nov 28 07:58:52 np0005538515.localdomain sudo[46582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:52 np0005538515.localdomain python3[46584]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 28 07:58:52 np0005538515.localdomain systemd[1]: Starting Time & Date Service...
Nov 28 07:58:52 np0005538515.localdomain systemd[1]: Started Time & Date Service.
Nov 28 07:58:52 np0005538515.localdomain sudo[46582]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:53 np0005538515.localdomain sudo[46602]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djwxedjljswudwvlnnlomwazcqsmodqo ; /usr/bin/python3
Nov 28 07:58:53 np0005538515.localdomain sudo[46602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:53 np0005538515.localdomain python3[46604]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:54 np0005538515.localdomain sudo[46602]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:54 np0005538515.localdomain sudo[46619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udomdfvcsmhtnjszukxjkgohgycixkpi ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Nov 28 07:58:54 np0005538515.localdomain sudo[46619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:54 np0005538515.localdomain python3[46621]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:54 np0005538515.localdomain sudo[46619]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:55 np0005538515.localdomain sudo[46636]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdfgcpbcxcbnlyiqszpuzdfschumwvst ; /usr/bin/python3
Nov 28 07:58:55 np0005538515.localdomain sudo[46636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:55 np0005538515.localdomain python3[46638]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 28 07:58:55 np0005538515.localdomain sudo[46636]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:55 np0005538515.localdomain sudo[46652]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmrhdxpotctqzdgftzwqszmslsartfyj ; /usr/bin/python3
Nov 28 07:58:55 np0005538515.localdomain sudo[46652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:55 np0005538515.localdomain python3[46654]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:58:55 np0005538515.localdomain sudo[46652]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:56 np0005538515.localdomain sudo[46668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aduklbmmmhaynhdguiwtgmoxfkotcfim ; /usr/bin/python3
Nov 28 07:58:56 np0005538515.localdomain sudo[46668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:56 np0005538515.localdomain python3[46670]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:56 np0005538515.localdomain sudo[46668]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:56 np0005538515.localdomain sudo[46684]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovikecrblqcpxcjwsgmfbeydsjtufeej ; /usr/bin/python3
Nov 28 07:58:56 np0005538515.localdomain sudo[46684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:56 np0005538515.localdomain python3[46686]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:56 np0005538515.localdomain sudo[46684]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:56 np0005538515.localdomain sudo[46732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkgirhahpmlphggmzjglqvdskalbcjjy ; /usr/bin/python3
Nov 28 07:58:56 np0005538515.localdomain sudo[46732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:56 np0005538515.localdomain python3[46734]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:56 np0005538515.localdomain sudo[46732]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:57 np0005538515.localdomain sudo[46775]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrlafzwmkwapsgptnmrpsofumkrhuysh ; /usr/bin/python3
Nov 28 07:58:57 np0005538515.localdomain sudo[46775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:57 np0005538515.localdomain python3[46777]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316736.6169143-79361-80936644226304/source _original_basename=tmpz63bned5 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:57 np0005538515.localdomain sudo[46775]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:57 np0005538515.localdomain sudo[46837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdnhtmtutpodxxpyalgpezlnssnkhifx ; /usr/bin/python3
Nov 28 07:58:57 np0005538515.localdomain sudo[46837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:57 np0005538515.localdomain python3[46839]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:58:57 np0005538515.localdomain sudo[46837]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:57 np0005538515.localdomain sudo[46880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdooduxlwjmhckmdfwsdcsqgubtgbeyc ; /usr/bin/python3
Nov 28 07:58:57 np0005538515.localdomain sudo[46880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:58 np0005538515.localdomain python3[46882]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316737.4807377-79416-164637156526161/source _original_basename=tmpkuxti0_7 follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:58:58 np0005538515.localdomain sudo[46880]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:58 np0005538515.localdomain sudo[46910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfodfwglzxtpmfjngwngqxcnnlqupkok ; /usr/bin/python3
Nov 28 07:58:58 np0005538515.localdomain sudo[46910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:58 np0005538515.localdomain python3[46912]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 07:58:58 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:58:58 np0005538515.localdomain systemd-sysv-generator[46942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:58:58 np0005538515.localdomain systemd-rc-local-generator[46937]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:58:58 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:58:58 np0005538515.localdomain sudo[46910]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:59 np0005538515.localdomain sudo[46964]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwyrzjwhhvwwcprqxlsraakixvyfloyb ; /usr/bin/python3
Nov 28 07:58:59 np0005538515.localdomain sudo[46964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:59 np0005538515.localdomain python3[46966]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:58:59 np0005538515.localdomain sudo[46964]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:59 np0005538515.localdomain sudo[46980]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trhnctcfonyrfadnywabylqysmuwiibg ; /usr/bin/python3
Nov 28 07:58:59 np0005538515.localdomain sudo[46980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:58:59 np0005538515.localdomain python3[46982]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:58:59 np0005538515.localdomain systemd[36764]: Created slice User Background Tasks Slice.
Nov 28 07:58:59 np0005538515.localdomain sudo[46980]: pam_unix(sudo:session): session closed for user root
Nov 28 07:58:59 np0005538515.localdomain systemd[36764]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 07:58:59 np0005538515.localdomain systemd[36764]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 07:58:59 np0005538515.localdomain sudo[46998]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyuhfocinxeglvmgxtbrtvcqjkuyuybx ; /usr/bin/python3
Nov 28 07:58:59 np0005538515.localdomain sudo[46998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:00 np0005538515.localdomain python3[47000]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:59:00 np0005538515.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Nov 28 07:59:00 np0005538515.localdomain sudo[46998]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:00 np0005538515.localdomain sudo[47015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aljkckbngmkbiycdsbgusrliwklqunfl ; /usr/bin/python3
Nov 28 07:59:00 np0005538515.localdomain sudo[47015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:00 np0005538515.localdomain python3[47017]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:59:00 np0005538515.localdomain sudo[47015]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:00 np0005538515.localdomain sudo[47031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csfainlvgqxypykvfkoxjapxiscgkuno ; /usr/bin/python3
Nov 28 07:59:00 np0005538515.localdomain sudo[47031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:00 np0005538515.localdomain python3[47033]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:00 np0005538515.localdomain sudo[47031]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:01 np0005538515.localdomain sudo[47079]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwfjmydefltacwlcyvzurkcwxnfinubs ; /usr/bin/python3
Nov 28 07:59:01 np0005538515.localdomain sudo[47079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:01 np0005538515.localdomain python3[47081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:59:01 np0005538515.localdomain sudo[47079]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:01 np0005538515.localdomain sudo[47122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjnysxdaaholptmbnlnxvlmwzqtqgpkl ; /usr/bin/python3
Nov 28 07:59:01 np0005538515.localdomain sudo[47122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:01 np0005538515.localdomain python3[47124]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316740.9649713-79678-67147584158396/source _original_basename=tmp5iut2dii follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:01 np0005538515.localdomain sudo[47122]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:02 np0005538515.localdomain sshd[47139]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 07:59:02 np0005538515.localdomain sshd[47139]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 07:59:02 np0005538515.localdomain sshd[47139]: Connection closed by 80.94.92.186 port 57748
Nov 28 07:59:13 np0005538515.localdomain sudo[47140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 07:59:13 np0005538515.localdomain sudo[47140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:59:13 np0005538515.localdomain sudo[47140]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:13 np0005538515.localdomain sudo[47155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 07:59:13 np0005538515.localdomain sudo[47155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:59:14 np0005538515.localdomain sudo[47155]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:15 np0005538515.localdomain sudo[47202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 07:59:15 np0005538515.localdomain sudo[47202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 07:59:16 np0005538515.localdomain sudo[47202]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:22 np0005538515.localdomain sudo[47230]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilhpzmpmcogivgodwsurusenqvdgmvhq ; /usr/bin/python3
Nov 28 07:59:22 np0005538515.localdomain sudo[47230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:22 np0005538515.localdomain python3[47232]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:22 np0005538515.localdomain sudo[47230]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:23 np0005538515.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 07:59:23 np0005538515.localdomain sudo[47248]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqctbabuiptcrxfzfyaebpwoazmlknse ; /usr/bin/python3
Nov 28 07:59:23 np0005538515.localdomain sudo[47248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:23 np0005538515.localdomain python3[47250]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Nov 28 07:59:23 np0005538515.localdomain sudo[47248]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:23 np0005538515.localdomain sudo[47264]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgclotxflrfulcjikqbzrilyimvmlsdc ; /usr/bin/python3
Nov 28 07:59:23 np0005538515.localdomain sudo[47264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:23 np0005538515.localdomain python3[47266]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:23 np0005538515.localdomain sudo[47264]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:23 np0005538515.localdomain sudo[47280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jstdzcxkuuufphuexjoxqvqsmxkpsviq ; /usr/bin/python3
Nov 28 07:59:23 np0005538515.localdomain sudo[47280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:23 np0005538515.localdomain python3[47282]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:24 np0005538515.localdomain sudo[47280]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:24 np0005538515.localdomain sudo[47296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcfjnjyupwbmdsfcwnkieqlwtoqmrdut ; /usr/bin/python3
Nov 28 07:59:24 np0005538515.localdomain sudo[47296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:24 np0005538515.localdomain python3[47298]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:24 np0005538515.localdomain sudo[47296]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:24 np0005538515.localdomain sudo[47312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrdeewylsulvitghyhfvyfdbwicihsfq ; /usr/bin/python3
Nov 28 07:59:24 np0005538515.localdomain sudo[47312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:24 np0005538515.localdomain python3[47314]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  Converting 2706 SID table entries...
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:59:25 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:59:25 np0005538515.localdomain sudo[47312]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:25 np0005538515.localdomain sudo[47337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkgcaqdghssyykktipsterxthsxjwpyl ; /usr/bin/python3
Nov 28 07:59:25 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 28 07:59:25 np0005538515.localdomain sudo[47337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:26 np0005538515.localdomain python3[47339]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 07:59:26 np0005538515.localdomain sudo[47337]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:26 np0005538515.localdomain sudo[47353]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xilgehivlggaiiyudbzgfrthttzemzpg ; /usr/bin/python3
Nov 28 07:59:26 np0005538515.localdomain sudo[47353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:26 np0005538515.localdomain sudo[47353]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:26 np0005538515.localdomain sudo[47401]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-panapfizsqtfgqpnuhtbifwkmhvdzlbf ; /usr/bin/python3
Nov 28 07:59:26 np0005538515.localdomain sudo[47401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:26 np0005538515.localdomain sudo[47401]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:27 np0005538515.localdomain sudo[47444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpgvqegstguqbcvxjlckhwyuuqqecehh ; /usr/bin/python3
Nov 28 07:59:27 np0005538515.localdomain sudo[47444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:27 np0005538515.localdomain sudo[47444]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:27 np0005538515.localdomain sudo[47474]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwkanwxaokialmxepakemkmkzzmpxvpc ; /usr/bin/python3
Nov 28 07:59:27 np0005538515.localdomain sudo[47474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:27 np0005538515.localdomain python3[47476]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Nov 28 07:59:27 np0005538515.localdomain sudo[47474]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:28 np0005538515.localdomain sudo[47490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbramxunnxavvbxlmwtzppvjcvbnnzkc ; /usr/bin/python3
Nov 28 07:59:28 np0005538515.localdomain rsyslogd[758]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Nov 28 07:59:28 np0005538515.localdomain sudo[47490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:28 np0005538515.localdomain python3[47492]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:28 np0005538515.localdomain sudo[47490]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:28 np0005538515.localdomain sudo[47506]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdgwljxroeqrcbhtmgjgfjuuxdaqijnj ; /usr/bin/python3
Nov 28 07:59:28 np0005538515.localdomain sudo[47506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:28 np0005538515.localdomain python3[47508]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:28 np0005538515.localdomain sudo[47506]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:29 np0005538515.localdomain sudo[47522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlrmakimpwmyqnfhjulmcqpdmvzddpcd ; /usr/bin/python3
Nov 28 07:59:29 np0005538515.localdomain sudo[47522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:29 np0005538515.localdomain python3[47524]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Nov 28 07:59:29 np0005538515.localdomain sudo[47522]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:35 np0005538515.localdomain sudo[47570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsevbwprzuqqlyuhzoshzwuhogjcxfkg ; /usr/bin/python3
Nov 28 07:59:35 np0005538515.localdomain sudo[47570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:35 np0005538515.localdomain python3[47572]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 07:59:35 np0005538515.localdomain sudo[47570]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:35 np0005538515.localdomain sudo[47613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvtnngyglqosdxmtbbshhcarrqywughm ; /usr/bin/python3
Nov 28 07:59:35 np0005538515.localdomain sudo[47613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:35 np0005538515.localdomain python3[47615]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316774.8966718-81139-96789904088545/source _original_basename=tmpi28phjyh follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 07:59:35 np0005538515.localdomain sudo[47613]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:35 np0005538515.localdomain sudo[47643]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deprjvvowytixpjbgsoawmeanfjcfflo ; /usr/bin/python3
Nov 28 07:59:35 np0005538515.localdomain sudo[47643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:36 np0005538515.localdomain python3[47645]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 07:59:36 np0005538515.localdomain sudo[47643]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:37 np0005538515.localdomain sudo[47693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gamwdqzwzfwjrzhoiqdvkgrgnfzixzsi ; /usr/bin/python3
Nov 28 07:59:37 np0005538515.localdomain sudo[47693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:37 np0005538515.localdomain sudo[47693]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:37 np0005538515.localdomain sudo[47736]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqjjksquuqfphznjlbtlnzlhwcctjofk ; /usr/bin/python3
Nov 28 07:59:37 np0005538515.localdomain sudo[47736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:37 np0005538515.localdomain sudo[47736]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:38 np0005538515.localdomain sudo[47766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjffmbledamfllqydwzhrvzcxqsgsyso ; /usr/bin/python3
Nov 28 07:59:38 np0005538515.localdomain sudo[47766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:38 np0005538515.localdomain python3[47768]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 07:59:38 np0005538515.localdomain sudo[47766]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:38 np0005538515.localdomain sudo[47814]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgxikobdgwhqrbymshgsujcvwhebutsx ; /usr/bin/python3
Nov 28 07:59:38 np0005538515.localdomain sudo[47814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:38 np0005538515.localdomain sudo[47814]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:39 np0005538515.localdomain sudo[47857]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyzsdgtiorubfjqvbirmyiicvxgyizfq ; /usr/bin/python3
Nov 28 07:59:39 np0005538515.localdomain sudo[47857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:39 np0005538515.localdomain sudo[47857]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:39 np0005538515.localdomain sudo[47887]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sadvkwbqandjrcakodwwvdrxpbncynvi ; /usr/bin/python3
Nov 28 07:59:39 np0005538515.localdomain sudo[47887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:40 np0005538515.localdomain python3[47889]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 07:59:40 np0005538515.localdomain sudo[47887]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:41 np0005538515.localdomain sudo[47903]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncpzgoorfqtmebpdqxshmofphsgwylyt ; /usr/bin/python3
Nov 28 07:59:41 np0005538515.localdomain sudo[47903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:42 np0005538515.localdomain python3[47905]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:59:42 np0005538515.localdomain sudo[47903]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:43 np0005538515.localdomain sudo[47920]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orpgmtosnlkbyjendxfjqonxyyxixumh ; /usr/bin/python3
Nov 28 07:59:43 np0005538515.localdomain sudo[47920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:43 np0005538515.localdomain python3[47922]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 07:59:46 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:46 np0005538515.localdomain dbus-broker-launch[14507]: Noticed file-system modification, trigger reload.
Nov 28 07:59:46 np0005538515.localdomain dbus-broker-launch[14507]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 07:59:46 np0005538515.localdomain dbus-broker-launch[14507]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 07:59:46 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:47 np0005538515.localdomain systemd[1]: Reexecuting.
Nov 28 07:59:47 np0005538515.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 07:59:47 np0005538515.localdomain systemd[1]: Detected virtualization kvm.
Nov 28 07:59:47 np0005538515.localdomain systemd[1]: Detected architecture x86-64.
Nov 28 07:59:47 np0005538515.localdomain systemd-rc-local-generator[47977]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:59:47 np0005538515.localdomain systemd-sysv-generator[47983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:59:47 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  Converting 2706 SID table entries...
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 07:59:55 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 07:59:55 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:55 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 28 07:59:55 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 07:59:56 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:59:56 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 07:59:56 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:59:56 np0005538515.localdomain systemd-rc-local-generator[48049]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:59:56 np0005538515.localdomain systemd-sysv-generator[48052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:59:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:59:56 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 07:59:56 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:59:56 np0005538515.localdomain systemd-journald[618]: Journal stopped
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Stopping Journal Service...
Nov 28 07:59:57 np0005538515.localdomain systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Stopped Journal Service.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: systemd-journald.service: Consumed 2.545s CPU time.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Starting Journal Service...
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: systemd-udevd.service: Consumed 3.147s CPU time.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 07:59:57 np0005538515.localdomain systemd-journald[48427]: Journal started
Nov 28 07:59:57 np0005538515.localdomain systemd-journald[48427]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 12.8M, max 314.7M, 301.9M free.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Started Journal Service.
Nov 28 07:59:57 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 28 07:59:57 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 07:59:57 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:59:57 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 07:59:57 np0005538515.localdomain systemd-udevd[48433]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 07:59:57 np0005538515.localdomain systemd-rc-local-generator[49050]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 07:59:57 np0005538515.localdomain systemd-sysv-generator[49054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.288s CPU time.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: run-r9e51df2d70664b349fb2ee3c2b2c60c0.service: Deactivated successfully.
Nov 28 07:59:57 np0005538515.localdomain systemd[1]: run-rc26f846339ab422d9c3850f3df48cfb5.service: Deactivated successfully.
Nov 28 07:59:59 np0005538515.localdomain sudo[47920]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:59 np0005538515.localdomain sudo[49415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aurkiepofggctxecmpdwoesjegbyyvfn ; /usr/bin/python3
Nov 28 07:59:59 np0005538515.localdomain sudo[49415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:59 np0005538515.localdomain python3[49417]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Nov 28 07:59:59 np0005538515.localdomain sudo[49415]: pam_unix(sudo:session): session closed for user root
Nov 28 07:59:59 np0005538515.localdomain sudo[49434]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajlpymrgqikduclcxyzdakvwkcbbjmgw ; /usr/bin/python3
Nov 28 07:59:59 np0005538515.localdomain sudo[49434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 07:59:59 np0005538515.localdomain python3[49436]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 07:59:59 np0005538515.localdomain sudo[49434]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:00 np0005538515.localdomain sudo[49452]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpfahguvtawgmfewofuwikuwhztahvdg ; /usr/bin/python3
Nov 28 08:00:00 np0005538515.localdomain sudo[49452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:00 np0005538515.localdomain python3[49454]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:00 np0005538515.localdomain python3[49454]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Nov 28 08:00:00 np0005538515.localdomain python3[49454]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Nov 28 08:00:16 np0005538515.localdomain sudo[49530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:00:16 np0005538515.localdomain sudo[49530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:16 np0005538515.localdomain sudo[49530]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:16 np0005538515.localdomain sudo[49545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:00:16 np0005538515.localdomain sudo[49545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:17 np0005538515.localdomain podman[49466]: 2025-11-28 08:00:00.939634721 +0000 UTC m=+0.030707961 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:00:17 np0005538515.localdomain python3[49454]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Nov 28 08:00:17 np0005538515.localdomain sudo[49452]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:17 np0005538515.localdomain sudo[49677]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kllsdwgozteeddkeyzdolxvjynbqsqlk ; /usr/bin/python3
Nov 28 08:00:17 np0005538515.localdomain sudo[49677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:17 np0005538515.localdomain systemd[1]: tmp-crun.iIYSCi.mount: Deactivated successfully.
Nov 28 08:00:17 np0005538515.localdomain podman[49652]: 2025-11-28 08:00:17.409383502 +0000 UTC m=+0.094884278 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 08:00:17 np0005538515.localdomain python3[49681]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:17 np0005538515.localdomain python3[49681]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Nov 28 08:00:17 np0005538515.localdomain podman[49652]: 2025-11-28 08:00:17.537401966 +0000 UTC m=+0.222902782 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, release=553, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55)
Nov 28 08:00:17 np0005538515.localdomain python3[49681]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Nov 28 08:00:17 np0005538515.localdomain sudo[49545]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:17 np0005538515.localdomain sudo[49759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:00:17 np0005538515.localdomain sudo[49759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:17 np0005538515.localdomain sudo[49759]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:17 np0005538515.localdomain sudo[49774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:00:17 np0005538515.localdomain sudo[49774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:18 np0005538515.localdomain sudo[49774]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:21 np0005538515.localdomain sudo[49858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:00:21 np0005538515.localdomain sudo[49858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:00:21 np0005538515.localdomain sudo[49858]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:25 np0005538515.localdomain podman[49729]: 2025-11-28 08:00:17.687997817 +0000 UTC m=+0.042470256 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:00:25 np0005538515.localdomain python3[49681]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Nov 28 08:00:25 np0005538515.localdomain sudo[49677]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:26 np0005538515.localdomain sudo[49922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijsszwhovvuczadmqvvmlciazqjlveki ; /usr/bin/python3
Nov 28 08:00:26 np0005538515.localdomain sudo[49922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:26 np0005538515.localdomain python3[49924]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:26 np0005538515.localdomain python3[49924]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Nov 28 08:00:26 np0005538515.localdomain python3[49924]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Nov 28 08:00:43 np0005538515.localdomain podman[49937]: 2025-11-28 08:00:26.376281055 +0000 UTC m=+0.043795087 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:00:43 np0005538515.localdomain python3[49924]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Nov 28 08:00:43 np0005538515.localdomain sudo[49922]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:43 np0005538515.localdomain sudo[50297]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrqivdsxqbcxpbyvpmwsuaqfvvanidxi ; /usr/bin/python3
Nov 28 08:00:43 np0005538515.localdomain sudo[50297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:43 np0005538515.localdomain python3[50299]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:43 np0005538515.localdomain python3[50299]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Nov 28 08:00:43 np0005538515.localdomain python3[50299]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Nov 28 08:00:57 np0005538515.localdomain podman[50312]: 2025-11-28 08:00:44.008463072 +0000 UTC m=+0.040243856 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:00:57 np0005538515.localdomain python3[50299]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Nov 28 08:00:57 np0005538515.localdomain sudo[50297]: pam_unix(sudo:session): session closed for user root
Nov 28 08:00:57 np0005538515.localdomain sudo[50393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soqmxlziythbrbxmltrwvynuptqfxilf ; /usr/bin/python3
Nov 28 08:00:57 np0005538515.localdomain sudo[50393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:00:57 np0005538515.localdomain python3[50395]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:00:57 np0005538515.localdomain python3[50395]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Nov 28 08:00:57 np0005538515.localdomain python3[50395]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Nov 28 08:01:01 np0005538515.localdomain CROND[50638]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 08:01:01 np0005538515.localdomain run-parts[50641]: (/etc/cron.hourly) starting 0anacron
Nov 28 08:01:01 np0005538515.localdomain run-parts[50647]: (/etc/cron.hourly) finished 0anacron
Nov 28 08:01:01 np0005538515.localdomain CROND[50637]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 08:01:05 np0005538515.localdomain podman[50408]: 2025-11-28 08:00:57.995379575 +0000 UTC m=+0.031002361 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:01:05 np0005538515.localdomain python3[50395]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Nov 28 08:01:05 np0005538515.localdomain sudo[50393]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:05 np0005538515.localdomain sudo[50756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hykoiuyjkgkejfnibjgwunsimuqiwofv ; /usr/bin/python3
Nov 28 08:01:05 np0005538515.localdomain sudo[50756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:05 np0005538515.localdomain python3[50758]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:05 np0005538515.localdomain python3[50758]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Nov 28 08:01:05 np0005538515.localdomain python3[50758]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Nov 28 08:01:11 np0005538515.localdomain podman[50770]: 2025-11-28 08:01:05.99363467 +0000 UTC m=+0.044023584 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:01:11 np0005538515.localdomain python3[50758]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Nov 28 08:01:11 np0005538515.localdomain sudo[50756]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:11 np0005538515.localdomain sudo[50847]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btnuohalulbehhvnjeyrmzwschgvwdnc ; /usr/bin/python3
Nov 28 08:01:11 np0005538515.localdomain sudo[50847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:11 np0005538515.localdomain python3[50849]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:11 np0005538515.localdomain python3[50849]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Nov 28 08:01:11 np0005538515.localdomain python3[50849]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Nov 28 08:01:13 np0005538515.localdomain podman[50862]: 2025-11-28 08:01:11.815459746 +0000 UTC m=+0.047358561 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:01:13 np0005538515.localdomain python3[50849]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Nov 28 08:01:13 np0005538515.localdomain sudo[50847]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:14 np0005538515.localdomain sudo[50939]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogciwdqotpbtgikriwszxkmhnkgybqjl ; /usr/bin/python3
Nov 28 08:01:14 np0005538515.localdomain sudo[50939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:14 np0005538515.localdomain python3[50941]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:14 np0005538515.localdomain python3[50941]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Nov 28 08:01:14 np0005538515.localdomain python3[50941]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Nov 28 08:01:16 np0005538515.localdomain podman[50955]: 2025-11-28 08:01:14.354340683 +0000 UTC m=+0.045486536 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:01:16 np0005538515.localdomain python3[50941]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Nov 28 08:01:16 np0005538515.localdomain sudo[50939]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:16 np0005538515.localdomain sudo[51031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bctrugqczhfavdagvzlqzjksazzulxsn ; /usr/bin/python3
Nov 28 08:01:16 np0005538515.localdomain sudo[51031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:16 np0005538515.localdomain python3[51033]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:16 np0005538515.localdomain python3[51033]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Nov 28 08:01:16 np0005538515.localdomain python3[51033]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Nov 28 08:01:19 np0005538515.localdomain podman[51046]: 2025-11-28 08:01:16.891433899 +0000 UTC m=+0.043991104 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 08:01:19 np0005538515.localdomain python3[51033]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Nov 28 08:01:19 np0005538515.localdomain sudo[51031]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:19 np0005538515.localdomain sudo[51123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvuqktenxwfgpxzqwfpshpwyrpnhlhfv ; /usr/bin/python3
Nov 28 08:01:19 np0005538515.localdomain sudo[51123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:19 np0005538515.localdomain python3[51125]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:19 np0005538515.localdomain python3[51125]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Nov 28 08:01:19 np0005538515.localdomain python3[51125]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Nov 28 08:01:21 np0005538515.localdomain sudo[51176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:01:21 np0005538515.localdomain sudo[51176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:01:21 np0005538515.localdomain sudo[51176]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:21 np0005538515.localdomain sudo[51191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:01:21 np0005538515.localdomain sudo[51191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:01:22 np0005538515.localdomain podman[51138]: 2025-11-28 08:01:19.571884204 +0000 UTC m=+0.047978150 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:01:22 np0005538515.localdomain python3[51125]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Nov 28 08:01:22 np0005538515.localdomain sudo[51123]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:23 np0005538515.localdomain sudo[51287]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eezwoqpffjdgrvbumvwubtsopimkkico ; /usr/bin/python3
Nov 28 08:01:23 np0005538515.localdomain sudo[51287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:23 np0005538515.localdomain sudo[51191]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:23 np0005538515.localdomain python3[51289]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 08:01:23 np0005538515.localdomain python3[51289]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Nov 28 08:01:23 np0005538515.localdomain python3[51289]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Nov 28 08:01:23 np0005538515.localdomain sudo[51315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:01:23 np0005538515.localdomain sudo[51315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:01:23 np0005538515.localdomain sudo[51315]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:25 np0005538515.localdomain podman[51302]: 2025-11-28 08:01:23.393047174 +0000 UTC m=+0.045160517 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:01:25 np0005538515.localdomain python3[51289]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Nov 28 08:01:25 np0005538515.localdomain sudo[51287]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:25 np0005538515.localdomain sudo[51392]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpyoseplqezhwegsikczvjpwoqmvbupw ; /usr/bin/python3
Nov 28 08:01:25 np0005538515.localdomain sudo[51392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:25 np0005538515.localdomain python3[51394]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:01:25 np0005538515.localdomain sudo[51392]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:26 np0005538515.localdomain sudo[51442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhifqdvskyqbccbeoqsvktaaviilpptp ; /usr/bin/python3
Nov 28 08:01:26 np0005538515.localdomain sudo[51442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:26 np0005538515.localdomain sudo[51442]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:26 np0005538515.localdomain sudo[51460]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdngyuxsmdklqwmppwoeebbnpmpccdji ; /usr/bin/python3
Nov 28 08:01:26 np0005538515.localdomain sudo[51460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:26 np0005538515.localdomain sudo[51460]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:27 np0005538515.localdomain sudo[51564]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuuxcmvxwhglttnqzamykdktddxdvful ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316886.9211266-84042-242228584962139/async_wrapper.py 4007624472 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316886.9211266-84042-242228584962139/AnsiballZ_command.py _
Nov 28 08:01:27 np0005538515.localdomain sudo[51564]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:01:27 np0005538515.localdomain ansible-async_wrapper.py[51566]: Invoked with 4007624472 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316886.9211266-84042-242228584962139/AnsiballZ_command.py _
Nov 28 08:01:27 np0005538515.localdomain ansible-async_wrapper.py[51569]: Starting module and watcher
Nov 28 08:01:27 np0005538515.localdomain ansible-async_wrapper.py[51569]: Start watching 51570 (3600)
Nov 28 08:01:27 np0005538515.localdomain ansible-async_wrapper.py[51570]: Start module (51570)
Nov 28 08:01:27 np0005538515.localdomain ansible-async_wrapper.py[51566]: Return async_wrapper task started.
Nov 28 08:01:27 np0005538515.localdomain sudo[51564]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:27 np0005538515.localdomain sudo[51585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kidaaxnvtldofjwoqqyvvfeknwwadgym ; /usr/bin/python3
Nov 28 08:01:27 np0005538515.localdomain sudo[51585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:27 np0005538515.localdomain python3[51587]: ansible-ansible.legacy.async_status Invoked with jid=4007624472.51566 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:01:27 np0005538515.localdomain sudo[51585]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    (file & line not available)
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    (file & line not available)
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.12 seconds
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Notice: Applied catalog in 0.05 seconds
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Application:
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    Initial environment: production
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    Converged environment: production
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:          Run mode: user
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Changes:
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:             Total: 3
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Events:
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:           Success: 3
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:             Total: 3
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Resources:
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:           Changed: 3
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:       Out of sync: 3
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:             Total: 10
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Time:
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:          Schedule: 0.00
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:              File: 0.00
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:              Exec: 0.01
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:            Augeas: 0.02
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    Transaction evaluation: 0.05
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    Catalog application: 0.05
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:    Config retrieval: 0.16
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:          Last run: 1764316891
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:        Filebucket: 0.00
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:             Total: 0.05
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]: Version:
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:            Config: 1764316891
Nov 28 08:01:31 np0005538515.localdomain puppet-user[51590]:            Puppet: 7.10.0
Nov 28 08:01:31 np0005538515.localdomain ansible-async_wrapper.py[51570]: Module complete (51570)
Nov 28 08:01:32 np0005538515.localdomain ansible-async_wrapper.py[51569]: Done in kid B.
Nov 28 08:01:37 np0005538515.localdomain sudo[51939]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpupeaiubkeefsubhiwgfxahtciqhbuz ; /usr/bin/python3
Nov 28 08:01:37 np0005538515.localdomain sudo[51939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:38 np0005538515.localdomain python3[51941]: ansible-ansible.legacy.async_status Invoked with jid=4007624472.51566 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:01:38 np0005538515.localdomain sudo[51939]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:38 np0005538515.localdomain sudo[51955]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atoplttphkfszijsuazggyaonjhkaslo ; /usr/bin/python3
Nov 28 08:01:38 np0005538515.localdomain sudo[51955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:38 np0005538515.localdomain python3[51957]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:01:38 np0005538515.localdomain sudo[51955]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:38 np0005538515.localdomain sudo[51971]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmgvpfninovvgapqhsgrhsgyvfxpkllb ; /usr/bin/python3
Nov 28 08:01:38 np0005538515.localdomain sudo[51971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:39 np0005538515.localdomain python3[51973]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:01:39 np0005538515.localdomain sudo[51971]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:39 np0005538515.localdomain sudo[52019]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jawpgscnlknenujawkoiutxbemplkykz ; /usr/bin/python3
Nov 28 08:01:39 np0005538515.localdomain sudo[52019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:39 np0005538515.localdomain python3[52021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:01:39 np0005538515.localdomain sudo[52019]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:39 np0005538515.localdomain sudo[52062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvhbkbaynalfpdazupevbjhozzmjweiu ; /usr/bin/python3
Nov 28 08:01:39 np0005538515.localdomain sudo[52062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:40 np0005538515.localdomain python3[52064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316899.2812352-84370-231143398356466/source _original_basename=tmpn8xo65m6 follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:01:40 np0005538515.localdomain sudo[52062]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:40 np0005538515.localdomain sudo[52092]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wityjjwivygoqdbnyamawhjnmjczjhtj ; /usr/bin/python3
Nov 28 08:01:40 np0005538515.localdomain sudo[52092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:40 np0005538515.localdomain python3[52094]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:01:40 np0005538515.localdomain sudo[52092]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:40 np0005538515.localdomain sudo[52108]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crespwsuoxgsgmikoldmkcaockfetbnp ; /usr/bin/python3
Nov 28 08:01:40 np0005538515.localdomain sudo[52108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:41 np0005538515.localdomain sudo[52108]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:41 np0005538515.localdomain sudo[52196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmhxvnkgkqykmgfizgosqozptnxmawqj ; /usr/bin/python3
Nov 28 08:01:41 np0005538515.localdomain sudo[52196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:41 np0005538515.localdomain python3[52198]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:01:41 np0005538515.localdomain sudo[52196]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:42 np0005538515.localdomain sudo[52215]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgmktwfnvulafsxdeooyrfhptnkvojjt ; /usr/bin/python3
Nov 28 08:01:42 np0005538515.localdomain sudo[52215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:42 np0005538515.localdomain python3[52217]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 08:01:42 np0005538515.localdomain sudo[52215]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:42 np0005538515.localdomain sudo[52231]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxszndssexbsazwipcxcplcyojqrsfao ; /usr/bin/python3
Nov 28 08:01:42 np0005538515.localdomain sudo[52231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:42 np0005538515.localdomain python3[52233]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005538515 step=1 update_config_hash_only=False
Nov 28 08:01:42 np0005538515.localdomain sudo[52231]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:43 np0005538515.localdomain sudo[52247]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eseuifvesavkoruyzaexibivteqyrupf ; /usr/bin/python3
Nov 28 08:01:43 np0005538515.localdomain sudo[52247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:43 np0005538515.localdomain python3[52249]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:01:43 np0005538515.localdomain sudo[52247]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:43 np0005538515.localdomain sudo[52263]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmxpiatcoqrcvdwenydvunstuenofxgj ; /usr/bin/python3
Nov 28 08:01:43 np0005538515.localdomain sudo[52263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:43 np0005538515.localdomain python3[52265]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:01:43 np0005538515.localdomain sudo[52263]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:44 np0005538515.localdomain sudo[52279]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifsyxhijuecdhhzxytscnzxvglmatblz ; /usr/bin/python3
Nov 28 08:01:44 np0005538515.localdomain sudo[52279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:44 np0005538515.localdomain python3[52281]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 08:01:44 np0005538515.localdomain sudo[52279]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:45 np0005538515.localdomain sudo[52319]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjgwxabdruqbittnlbvlweeaunrgzwgh ; /usr/bin/python3
Nov 28 08:01:45 np0005538515.localdomain sudo[52319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:45 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:01:45 np0005538515.localdomain podman[52501]: 2025-11-28 08:01:45.742058736 +0000 UTC m=+0.069480866 container create 8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true)
Nov 28 08:01:45 np0005538515.localdomain podman[52490]: 2025-11-28 08:01:45.777781737 +0000 UTC m=+0.119662449 container create d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, release=1761123044, container_name=container-puppet-nova_libvirt, build-date=2025-11-19T00:35:22Z, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:01:45 np0005538515.localdomain systemd[1]: Started libpod-conmon-8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10.scope.
Nov 28 08:01:45 np0005538515.localdomain podman[52490]: 2025-11-28 08:01:45.701494003 +0000 UTC m=+0.043374695 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:01:45 np0005538515.localdomain podman[52526]: 2025-11-28 08:01:45.805164866 +0000 UTC m=+0.112371137 container create 2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=container-puppet-metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:01:45 np0005538515.localdomain podman[52501]: 2025-11-28 08:01:45.707426306 +0000 UTC m=+0.034848456 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:01:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:45 np0005538515.localdomain systemd[1]: Started libpod-conmon-d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b.scope.
Nov 28 08:01:45 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2465f602934c9b49eec4e0598b6266084474df0f2da0f1de92a72390c7a9be21/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:45 np0005538515.localdomain systemd[1]: Started libpod-conmon-2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447.scope.
Nov 28 08:01:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:45 np0005538515.localdomain podman[52531]: 2025-11-28 08:01:45.733849207 +0000 UTC m=+0.029341287 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:01:45 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8beb7bd0728a5185ae08edcb0afeede0750b5c1acd8c5a453f776b712778919/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:45 np0005538515.localdomain podman[52526]: 2025-11-28 08:01:45.73775594 +0000 UTC m=+0.044962231 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:01:45 np0005538515.localdomain podman[52501]: 2025-11-28 08:01:45.837708495 +0000 UTC m=+0.165130625 container init 8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=container-puppet-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_puppet_step1, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:01:45 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d267351eb91c27e496fa400ef9055b36048428ec01962767ba6b671d1258ac4/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:45 np0005538515.localdomain podman[52526]: 2025-11-28 08:01:45.842317008 +0000 UTC m=+0.149523279 container init 2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, vcs-type=git, container_name=container-puppet-metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 28 08:01:45 np0005538515.localdomain podman[52501]: 2025-11-28 08:01:45.848903431 +0000 UTC m=+0.176325551 container start 8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, io.openshift.expose-services=, release=1761123044, config_id=tripleo_puppet_step1, name=rhosp17/openstack-collectd, container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.)
Nov 28 08:01:45 np0005538515.localdomain podman[52526]: 2025-11-28 08:01:45.851564689 +0000 UTC m=+0.158770960 container start 2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, container_name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, release=1761123044, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z)
Nov 28 08:01:45 np0005538515.localdomain podman[52526]: 2025-11-28 08:01:45.852353682 +0000 UTC m=+0.159559953 container attach 2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, tcib_managed=true, container_name=container-puppet-metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:01:45 np0005538515.localdomain podman[52501]: 2025-11-28 08:01:45.849126598 +0000 UTC m=+0.176548738 container attach 8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=container-puppet-collectd, version=17.1.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': 
['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 28 08:01:45 np0005538515.localdomain podman[52538]: 2025-11-28 08:01:45.769518746 +0000 UTC m=+0.051169052 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:01:46 np0005538515.localdomain systemd[1]: tmp-crun.jsdx86.mount: Deactivated successfully.
Nov 28 08:01:47 np0005538515.localdomain podman[52538]: 2025-11-28 08:01:47.187143726 +0000 UTC m=+1.468794062 container create ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, container_name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 28 08:01:47 np0005538515.localdomain podman[52490]: 2025-11-28 08:01:47.21202581 +0000 UTC m=+1.553906542 container init d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, container_name=container-puppet-nova_libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_puppet_step1)
Nov 28 08:01:47 np0005538515.localdomain podman[52490]: 2025-11-28 08:01:47.222193407 +0000 UTC m=+1.564074129 container start d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 08:01:47 np0005538515.localdomain podman[52490]: 2025-11-28 08:01:47.222899557 +0000 UTC m=+1.564780339 container attach d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude 
tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git)
Nov 28 08:01:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0.scope.
Nov 28 08:01:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:47 np0005538515.localdomain podman[52531]: 2025-11-28 08:01:47.256708583 +0000 UTC m=+1.552200693 container create 3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com)
Nov 28 08:01:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29e7cf6abe6a2bbfc58462ae307ac9362023c413708070730336bba274ac12e7/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29e7cf6abe6a2bbfc58462ae307ac9362023c413708070730336bba274ac12e7/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:47 np0005538515.localdomain podman[52538]: 2025-11-28 08:01:47.277344105 +0000 UTC m=+1.558994401 container init ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_puppet_step1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Nov 28 08:01:47 np0005538515.localdomain podman[52538]: 2025-11-28 08:01:47.286288816 +0000 UTC m=+1.567939142 container start ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, vcs-type=git, container_name=container-puppet-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044)
Nov 28 08:01:47 np0005538515.localdomain podman[52538]: 2025-11-28 08:01:47.287000806 +0000 UTC m=+1.568651182 container attach ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 08:01:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751.scope.
Nov 28 08:01:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77782651c01fa3d8af8a79c02d3312e7fed09a9087964da1a7c959a65a9214b8/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:47 np0005538515.localdomain podman[52531]: 2025-11-28 08:01:47.366113072 +0000 UTC m=+1.661605132 container init 3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container)
Nov 28 08:01:47 np0005538515.localdomain podman[52531]: 2025-11-28 08:01:47.372988233 +0000 UTC m=+1.668480293 container start 3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-cron, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, 
maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container)
Nov 28 08:01:47 np0005538515.localdomain podman[52531]: 2025-11-28 08:01:47.374079785 +0000 UTC m=+1.669571875 container attach 3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, container_name=container-puppet-crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=)
Nov 28 08:01:48 np0005538515.localdomain podman[52409]: 2025-11-28 08:01:45.58992135 +0000 UTC m=+0.041891822 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 28 08:01:48 np0005538515.localdomain podman[52721]: 2025-11-28 08:01:48.183582225 +0000 UTC m=+0.060071772 container create 904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, version=17.1.12, vcs-type=git, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, architecture=x86_64, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:01:48 np0005538515.localdomain systemd[1]: Started libpod-conmon-904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273.scope.
Nov 28 08:01:48 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:48 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af4441ca58e5a3dae70e850402577fe72fc0370c205d9690db9c04c01d30a59b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:48 np0005538515.localdomain podman[52721]: 2025-11-28 08:01:48.243549973 +0000 UTC m=+0.120039550 container init 904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ceilometer-central-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044)
Nov 28 08:01:48 np0005538515.localdomain podman[52721]: 2025-11-28 08:01:48.150256493 +0000 UTC m=+0.026746070 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 28 08:01:48 np0005538515.localdomain podman[52721]: 2025-11-28 08:01:48.252559626 +0000 UTC m=+0.129049183 container start 904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, tcib_managed=true, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., container_name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 
'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Nov 28 08:01:48 np0005538515.localdomain podman[52721]: 2025-11-28 08:01:48.253127222 +0000 UTC m=+0.129616799 container attach 904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, description=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.12, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-ceilometer-central-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, container_name=container-puppet-ceilometer, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-central)
Nov 28 08:01:48 np0005538515.localdomain ovs-vsctl[52785]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: Accepting previously invalid value for target type 'Integer'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.12 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}8b21629c3c588c101e32eb798e9e14b646a0cfd6fc622da2fa0b582fa1678bbf'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Notice: Applied catalog in 0.02 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Application:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    Initial environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    Converged environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:          Run mode: user
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Changes:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:             Total: 7
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Events:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:           Success: 7
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:             Total: 7
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Resources:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:           Skipped: 13
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:           Changed: 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:       Out of sync: 5
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:             Total: 20
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Time:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:              File: 0.01
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    Transaction evaluation: 0.02
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    Catalog application: 0.02
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:    Config retrieval: 0.16
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:          Last run: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:             Total: 0.02
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]: Version:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:            Config: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52636]:            Puppet: 7.10.0
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    (file & line not available)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.07 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.10 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Nov 28 08:01:49 np0005538515.localdomain crontab[53077]: (root) LIST (root)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Nov 28 08:01:49 np0005538515.localdomain crontab[53078]: (root) REPLACE (root)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Notice: Applied catalog in 0.04 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Application:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    Initial environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    Converged environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:          Run mode: user
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Changes:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:             Total: 2
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Events:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:           Success: 2
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:             Total: 2
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Resources:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:           Changed: 2
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:       Out of sync: 2
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:           Skipped: 7
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:             Total: 9
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Time:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:              File: 0.00
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:              Cron: 0.01
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    Transaction evaluation: 0.04
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    Catalog application: 0.04
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:    Config retrieval: 0.09
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:          Last run: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:             Total: 0.04
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]: Version:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:            Config: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52689]:            Puppet: 7.10.0
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: in a future release. Use nova::cinder::os_region_name instead
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: in a future release. Use nova::cinder::catalog_info instead
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: libpod-2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: libpod-2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447.scope: Consumed 2.192s CPU time.
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.37 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Nov 28 08:01:49 np0005538515.localdomain podman[53135]: 2025-11-28 08:01:49.561972189 +0000 UTC m=+0.035483855 container died 2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, container_name=container-puppet-metrics_qdr, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd)
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: tmp-crun.c2NAS2.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: libpod-3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: libpod-3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751.scope: Consumed 2.038s CPU time.
Nov 28 08:01:49 np0005538515.localdomain podman[52531]: 2025-11-28 08:01:49.621141764 +0000 UTC m=+3.916633874 container died 3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, container_name=container-puppet-crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Nov 28 08:01:49 np0005538515.localdomain podman[53135]: 2025-11-28 08:01:49.676167429 +0000 UTC m=+0.149679085 container cleanup 2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, tcib_managed=true, container_name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: libpod-conmon-2b4ea332b462b808784dc8956e5a9441745ce5065c88da8e21e37d1ee9bf6447.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52657]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Notice: Applied catalog in 0.43 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Application:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    Initial environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    Converged environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:          Run mode: user
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Changes:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:             Total: 4
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Events:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:           Success: 4
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:             Total: 4
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Resources:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:           Changed: 4
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:       Out of sync: 4
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:           Skipped: 8
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:             Total: 13
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Time:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:              File: 0.00
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:              Exec: 0.05
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    Config retrieval: 0.12
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:            Augeas: 0.38
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    Transaction evaluation: 0.43
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:    Catalog application: 0.43
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:          Last run: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:             Total: 0.43
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]: Version:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:            Config: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52675]:            Puppet: 7.10.0
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain podman[53164]: 2025-11-28 08:01:49.737227239 +0000 UTC m=+0.106415053 container cleanup 3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:49:32Z, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, container_name=container-puppet-crond, url=https://www.redhat.com, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron)
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: libpod-conmon-3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751.scope: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-77782651c01fa3d8af8a79c02d3312e7fed09a9087964da1a7c959a65a9214b8-merged.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3139c22a6742d68c52f0403d373ba7e8f851d424310f3264e609af6544368751-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c8beb7bd0728a5185ae08edcb0afeede0750b5c1acd8c5a453f776b712778919-merged.mount: Deactivated successfully.
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}8dd3769945b86c38433504b97f7851a931eb3c94b667298d10a9796a3d020595'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Notice: Applied catalog in 0.26 seconds
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Application:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    Initial environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    Converged environment: production
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:          Run mode: user
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Changes:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:             Total: 43
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Events:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:           Success: 43
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:             Total: 43
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Resources:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:           Skipped: 14
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:           Changed: 38
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:       Out of sync: 38
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:             Total: 82
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Time:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    Concat fragment: 0.00
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:       Concat file: 0.00
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:              File: 0.09
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    Transaction evaluation: 0.25
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    Catalog application: 0.26
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:    Config retrieval: 0.44
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:          Last run: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:             Total: 0.26
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]: Version:
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:            Config: 1764316909
Nov 28 08:01:49 np0005538515.localdomain puppet-user[52638]:            Puppet: 7.10.0
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: libpod-ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0.scope: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: libpod-ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0.scope: Consumed 2.510s CPU time.
Nov 28 08:01:50 np0005538515.localdomain podman[52538]: 2025-11-28 08:01:50.070759433 +0000 UTC m=+4.352409719 container died ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, config_id=tripleo_puppet_step1, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:01:50 np0005538515.localdomain podman[53295]: 2025-11-28 08:01:50.094424312 +0000 UTC m=+0.085411311 container create 9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z)
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: Started libpod-conmon-9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5.scope.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:50 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69bca6b1ae1a510e610471f91dc39084eac5a14908c47996b36473212637590d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:50 np0005538515.localdomain podman[53295]: 2025-11-28 08:01:50.14098579 +0000 UTC m=+0.131972779 container init 9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, build-date=2025-11-18T22:49:49Z, config_id=tripleo_puppet_step1)
Nov 28 08:01:50 np0005538515.localdomain podman[53295]: 2025-11-28 08:01:50.06210494 +0000 UTC m=+0.053091929 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:01:50 np0005538515.localdomain podman[53295]: 2025-11-28 08:01:50.166502114 +0000 UTC m=+0.157489103 container start 9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=container-puppet-rsyslog, vcs-type=git)
Nov 28 08:01:50 np0005538515.localdomain podman[53295]: 2025-11-28 08:01:50.166825483 +0000 UTC m=+0.157812492 container attach 9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, release=1761123044, container_name=container-puppet-rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., tcib_managed=true)
Nov 28 08:01:50 np0005538515.localdomain podman[53337]: 2025-11-28 08:01:50.17323541 +0000 UTC m=+0.092262001 container cleanup ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, container_name=container-puppet-iscsid, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: libpod-conmon-ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0.scope: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: libpod-8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10.scope: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: libpod-8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10.scope: Consumed 2.713s CPU time.
Nov 28 08:01:50 np0005538515.localdomain podman[52501]: 2025-11-28 08:01:50.209716983 +0000 UTC m=+4.537139113 container died 8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Nov 28 08:01:50 np0005538515.localdomain podman[53343]: 2025-11-28 08:01:50.223043372 +0000 UTC m=+0.129379333 container create ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: Started libpod-conmon-ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09.scope.
Nov 28 08:01:50 np0005538515.localdomain podman[53418]: 2025-11-28 08:01:50.267533199 +0000 UTC m=+0.052398139 container cleanup 8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, distribution-scope=public, version=17.1.12)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 1.20 seconds
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: libpod-conmon-8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10.scope: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:50 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:01:50 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:50 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:50 np0005538515.localdomain podman[53343]: 2025-11-28 08:01:50.180352837 +0000 UTC m=+0.086688808 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:01:50 np0005538515.localdomain podman[53343]: 2025-11-28 08:01:50.285558835 +0000 UTC m=+0.191894786 container init ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, container_name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:01:50 np0005538515.localdomain podman[53343]: 2025-11-28 08:01:50.291986592 +0000 UTC m=+0.198322543 container start ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., tcib_managed=true, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:01:50 np0005538515.localdomain podman[53343]: 2025-11-28 08:01:50.292178768 +0000 UTC m=+0.198514729 container attach ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]:    (file & line not available)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]:    (file & line not available)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}e8f4c9c311633f219a6b4c8a97d1389467ae0d86e6640d015eb10a4c73ac6b8b'
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Warning: Empty environment setting 'TLS_PASSWORD'
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}ae9c4ab6bedd07e63d6f2c3a5743334d26ea3ed4d1f695ab855f72927fdb71bc'
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.36 seconds
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-29e7cf6abe6a2bbfc58462ae307ac9362023c413708070730336bba274ac12e7-merged.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed41a5a8171688821f2c74039d5b07173339f60e4defb1c8fdfea4439a6b75c0-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2465f602934c9b49eec4e0598b6266084474df0f2da0f1de92a72390c7a9be21-merged.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8834e3582a8fa1c94bce5b1886005ace73c6931a23b894267022dfd791337f10-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Nov 28 08:01:50 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Notice: Applied catalog in 0.44 seconds
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Application:
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:    Initial environment: production
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:    Converged environment: production
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:          Run mode: user
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Changes:
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:             Total: 31
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Events:
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:           Success: 31
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:             Total: 31
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Resources:
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:           Skipped: 22
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:           Changed: 31
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:       Out of sync: 31
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:             Total: 151
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Time:
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:           Package: 0.02
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:    Ceilometer config: 0.34
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:    Transaction evaluation: 0.43
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:    Config retrieval: 0.44
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:    Catalog application: 0.44
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:          Last run: 1764316911
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:         Resources: 0.00
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:             Total: 0.44
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]: Version:
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:            Config: 1764316910
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52767]:            Puppet: 7.10.0
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain systemd[1]: libpod-904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273.scope: Deactivated successfully.
Nov 28 08:01:51 np0005538515.localdomain systemd[1]: libpod-904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273.scope: Consumed 2.956s CPU time.
Nov 28 08:01:51 np0005538515.localdomain podman[52721]: 2025-11-28 08:01:51.6403046 +0000 UTC m=+3.516794187 container died 904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-af4441ca58e5a3dae70e850402577fe72fc0370c205d9690db9c04c01d30a59b-merged.mount: Deactivated successfully.
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain podman[53640]: 2025-11-28 08:01:51.745198948 +0000 UTC m=+0.094996390 container cleanup 904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-central-container, config_id=tripleo_puppet_step1, vcs-type=git, name=rhosp17/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:59Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central)
Nov 28 08:01:51 np0005538515.localdomain systemd[1]: libpod-conmon-904f45bf1db4e9f2cbbbf5ea95d42519ada7d6f43d6be47e85c0a4d6697b9273.scope: Deactivated successfully.
Nov 28 08:01:51 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Nov 28 08:01:51 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    (file & line not available)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    (file & line not available)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]:    (file & line not available)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]:    (file & line not available)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.23 seconds
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}af00b55795dabd7a8ca15fb762e773701eb5c91ea4ae135b9bcdde564d7077dd'
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.30 seconds
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}6044185b1da867517684b275c4d283584d91a27b22c4084e92ff9a2cc819bcca'
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Notice: Applied catalog in 0.11 seconds
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Application:
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    Initial environment: production
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    Converged environment: production
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:          Run mode: user
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Changes:
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:             Total: 3
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Events:
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:           Success: 3
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:             Total: 3
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Resources:
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:           Skipped: 11
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:           Changed: 3
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:       Out of sync: 3
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:             Total: 25
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Time:
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:       Concat file: 0.00
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    Concat fragment: 0.00
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:              File: 0.01
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    Transaction evaluation: 0.10
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    Catalog application: 0.11
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:    Config retrieval: 0.29
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:          Last run: 1764316912
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:             Total: 0.11
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]: Version:
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:            Config: 1764316912
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53396]:            Puppet: 7.10.0
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53796]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53798]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53814]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.108
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53819]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005538515.localdomain
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005538515.novalocal' to 'np0005538515.localdomain'
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53833]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53837]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain systemd[1]: libpod-9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5.scope: Deactivated successfully.
Nov 28 08:01:52 np0005538515.localdomain systemd[1]: libpod-9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5.scope: Consumed 2.420s CPU time.
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53844]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain podman[53295]: 2025-11-28 08:01:52.754840403 +0000 UTC m=+2.745827402 container died 9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z)
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53858]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53864]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53866]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53868]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:72:ce:0c
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53870]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53872]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Nov 28 08:01:52 np0005538515.localdomain ovs-vsctl[53874]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Nov 28 08:01:52 np0005538515.localdomain puppet-user[53521]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Notice: Applied catalog in 0.58 seconds
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Application:
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:    Initial environment: production
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:    Converged environment: production
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:          Run mode: user
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Changes:
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:             Total: 14
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Events:
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:           Success: 14
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:             Total: 14
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Resources:
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:           Skipped: 12
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:           Changed: 14
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:       Out of sync: 14
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:             Total: 29
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Time:
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:              Exec: 0.02
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:    Config retrieval: 0.33
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:         Vs config: 0.47
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:    Transaction evaluation: 0.57
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:    Catalog application: 0.58
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:          Last run: 1764316913
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:             Total: 0.58
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]: Version:
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:            Config: 1764316912
Nov 28 08:01:53 np0005538515.localdomain puppet-user[53521]:            Puppet: 7.10.0
Nov 28 08:01:53 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: tmp-crun.YsuVt6.mount: Deactivated successfully.
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-69bca6b1ae1a510e610471f91dc39084eac5a14908c47996b36473212637590d-merged.mount: Deactivated successfully.
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: libpod-ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09.scope: Deactivated successfully.
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: libpod-ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09.scope: Consumed 2.986s CPU time.
Nov 28 08:01:53 np0005538515.localdomain podman[53343]: 2025-11-28 08:01:53.528017093 +0000 UTC m=+3.434353044 container died ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, vcs-type=git, tcib_managed=true, container_name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 08:01:53 np0005538515.localdomain podman[53851]: 2025-11-28 08:01:53.7878827 +0000 UTC m=+1.025521769 container cleanup 9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-rsyslog, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container)
Nov 28 08:01:53 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: libpod-conmon-9c924b4b35e410ba8e6e6caa706b1f19805626094797b80857403544aece1ad5.scope: Deactivated successfully.
Nov 28 08:01:53 np0005538515.localdomain podman[53914]: 2025-11-28 08:01:53.804263407 +0000 UTC m=+0.265044648 container cleanup ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=container-puppet-ovn_controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 08:01:53 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Nov 28 08:01:53 np0005538515.localdomain systemd[1]: libpod-conmon-ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09.scope: Deactivated successfully.
Nov 28 08:01:53 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:01:53 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:53 np0005538515.localdomain podman[53554]: 2025-11-28 08:01:50.469904669 +0000 UTC m=+0.037463953 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 28 08:01:53 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Nov 28 08:01:53 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain podman[53994]: 2025-11-28 08:01:54.053642707 +0000 UTC m=+0.063545573 container create 8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, 
config_id=tripleo_puppet_step1, build-date=2025-11-19T00:23:27Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-neutron-server-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:01:54 np0005538515.localdomain systemd[1]: Started libpod-conmon-8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1.scope.
Nov 28 08:01:54 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:01:54 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c29bfa5a0679179b90046634e87037ab6ff6f22b5fa7106d9841b0f8caae33b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain podman[53994]: 2025-11-28 08:01:54.113125681 +0000 UTC m=+0.123028587 container init 8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-neutron-server, architecture=x86_64, 
version=17.1.12, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:23:27Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, container_name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:01:54 np0005538515.localdomain podman[53994]: 2025-11-28 08:01:54.12062557 +0000 UTC m=+0.130528476 container start 8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, name=rhosp17/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=container-puppet-neutron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-server-container)
Nov 28 08:01:54 np0005538515.localdomain podman[53994]: 2025-11-28 08:01:54.120989291 +0000 UTC m=+0.130892187 container attach 8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-server-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:23:27Z, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, container_name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 
'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, config_id=tripleo_puppet_step1)
Nov 28 08:01:54 np0005538515.localdomain podman[53994]: 2025-11-28 08:01:54.023434357 +0000 UTC m=+0.033337243 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5-merged.mount: Deactivated successfully.
Nov 28 08:01:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ebec36eeb114e633393c34c9aea56cfd1cf07c9c3281b2bb9885574c17cfdc09-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}66a7ab6cc1a19ea5002a5aaa2cfb2f196778c89c859d0afac926fe3fac9c75a4'
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Notice: Applied catalog in 4.36 seconds
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Application:
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Initial environment: production
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Converged environment: production
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:          Run mode: user
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Changes:
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:             Total: 183
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Events:
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:           Success: 183
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:             Total: 183
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Resources:
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:           Changed: 183
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:       Out of sync: 183
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:           Skipped: 57
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:             Total: 487
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Time:
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Concat fragment: 0.00
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:            Anchor: 0.00
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:         File line: 0.00
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Virtlogd config: 0.00
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Virtqemud config: 0.02
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:              Exec: 0.02
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Virtsecretd config: 0.02
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Virtstoraged config: 0.02
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:              File: 0.03
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Virtproxyd config: 0.03
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:           Package: 0.03
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Virtnodedevd config: 0.05
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:            Augeas: 1.02
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Config retrieval: 1.43
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:          Last run: 1764316914
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:       Nova config: 2.93
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Transaction evaluation: 4.34
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:    Catalog application: 4.36
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:         Resources: 0.00
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:       Concat file: 0.00
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:             Total: 4.36
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]: Version:
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:            Config: 1764316909
Nov 28 08:01:54 np0005538515.localdomain puppet-user[52657]:            Puppet: 7.10.0
Nov 28 08:01:55 np0005538515.localdomain systemd[1]: libpod-d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b.scope: Deactivated successfully.
Nov 28 08:01:55 np0005538515.localdomain systemd[1]: libpod-d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b.scope: Consumed 8.260s CPU time.
Nov 28 08:01:55 np0005538515.localdomain podman[52490]: 2025-11-28 08:01:55.730317768 +0000 UTC m=+10.072198540 container died d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, vcs-type=git, container_name=container-puppet-nova_libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:01:55 np0005538515.localdomain systemd[1]: tmp-crun.NfEF9L.mount: Deactivated successfully.
Nov 28 08:01:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4d267351eb91c27e496fa400ef9055b36048428ec01962767ba6b671d1258ac4-merged.mount: Deactivated successfully.
Nov 28 08:01:55 np0005538515.localdomain puppet-user[54024]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Nov 28 08:01:55 np0005538515.localdomain podman[54067]: 2025-11-28 08:01:55.943245315 +0000 UTC m=+0.199649511 container cleanup d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, container_name=container-puppet-nova_libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Nov 28 08:01:55 np0005538515.localdomain systemd[1]: libpod-conmon-d184a5420aa167537c4418fe72018d93cc08508bdda98c15877ff895cf99cb9b.scope: Deactivated successfully.
Nov 28 08:01:55 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]:    (file & line not available)
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]:    (file & line not available)
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.67 seconds
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Nov 28 08:01:56 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Notice: Applied catalog in 0.44 seconds
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Application:
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Initial environment: production
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Converged environment: production
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:          Run mode: user
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Changes:
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:             Total: 33
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Events:
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:           Success: 33
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:             Total: 33
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Resources:
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:           Skipped: 21
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:           Changed: 33
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:       Out of sync: 33
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:             Total: 155
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Time:
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:         Resources: 0.00
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Ovn metadata agent config: 0.02
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Neutron config: 0.37
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Transaction evaluation: 0.44
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Catalog application: 0.44
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:    Config retrieval: 0.74
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:          Last run: 1764316917
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:             Total: 0.45
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]: Version:
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:            Config: 1764316916
Nov 28 08:01:57 np0005538515.localdomain puppet-user[54024]:            Puppet: 7.10.0
Nov 28 08:01:57 np0005538515.localdomain systemd[1]: libpod-8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1.scope: Deactivated successfully.
Nov 28 08:01:57 np0005538515.localdomain systemd[1]: libpod-8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1.scope: Consumed 3.676s CPU time.
Nov 28 08:01:57 np0005538515.localdomain podman[54208]: 2025-11-28 08:01:57.875844997 +0000 UTC m=+0.050612376 container died 8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., build-date=2025-11-19T00:23:27Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-server, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, container_name=container-puppet-neutron)
Nov 28 08:01:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1-userdata-shm.mount: Deactivated successfully.
Nov 28 08:01:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7c29bfa5a0679179b90046634e87037ab6ff6f22b5fa7106d9841b0f8caae33b-merged.mount: Deactivated successfully.
Nov 28 08:01:57 np0005538515.localdomain podman[54208]: 2025-11-28 08:01:57.958408073 +0000 UTC m=+0.133175412 container cleanup 8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:23:27Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, container_name=container-puppet-neutron)
Nov 28 08:01:57 np0005538515.localdomain systemd[1]: libpod-conmon-8692e42b842ef6461ddaf8f87dcd08c54fe8471c3dd5454dde35d88654a795a1.scope: Deactivated successfully.
Nov 28 08:01:57 np0005538515.localdomain python3[52321]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538515 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538515', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Nov 28 08:01:58 np0005538515.localdomain sudo[52319]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:58 np0005538515.localdomain sudo[54258]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-albttwgfwntmzgvxcdsloztkluoavuym ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:01:58 np0005538515.localdomain sudo[54258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:58 np0005538515.localdomain python3[54260]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:01:58 np0005538515.localdomain sudo[54258]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:59 np0005538515.localdomain sudo[54274]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyscqjmtdkrmfgpnaeldwiwjsilkgnmj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:01:59 np0005538515.localdomain sudo[54274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:59 np0005538515.localdomain sudo[54274]: pam_unix(sudo:session): session closed for user root
Nov 28 08:01:59 np0005538515.localdomain sudo[54290]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igsmwfxmkydjsljxunwylhczfihqfxqa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:01:59 np0005538515.localdomain sudo[54290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:01:59 np0005538515.localdomain python3[54292]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:01:59 np0005538515.localdomain sudo[54290]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:00 np0005538515.localdomain sudo[54340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzrimzzimsvoxpvzydthqwikuegbmtld ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:00 np0005538515.localdomain sudo[54340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:00 np0005538515.localdomain python3[54342]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:00 np0005538515.localdomain sudo[54340]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:00 np0005538515.localdomain sudo[54383]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-endbngjyxxatuubdkogyhlqdohnfqphs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:00 np0005538515.localdomain sudo[54383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:00 np0005538515.localdomain python3[54385]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316920.050109-84865-133296093000290/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:00 np0005538515.localdomain sudo[54383]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:01 np0005538515.localdomain sudo[54445]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssqastngfbgskfcslduuuiskhogvqghr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:01 np0005538515.localdomain sudo[54445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:01 np0005538515.localdomain python3[54447]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:01 np0005538515.localdomain sudo[54445]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:02 np0005538515.localdomain sudo[54488]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdkyrgdzivlxklvfsymjpdhuspehwezf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:02 np0005538515.localdomain sudo[54488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:02 np0005538515.localdomain python3[54490]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316920.912403-84865-205667006347731/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:02 np0005538515.localdomain sudo[54488]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:02 np0005538515.localdomain sudo[54550]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bccwkyhnejgqyydofqznxhrvlonwdtnf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:02 np0005538515.localdomain sudo[54550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:02 np0005538515.localdomain python3[54552]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:02 np0005538515.localdomain sudo[54550]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:03 np0005538515.localdomain sudo[54593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlstcjpadzmbnidfgxzqpblcrlplvyrn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:03 np0005538515.localdomain sudo[54593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:03 np0005538515.localdomain python3[54595]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316922.5342379-85024-154985672508809/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:03 np0005538515.localdomain sudo[54593]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:03 np0005538515.localdomain sudo[54655]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccqsqwlkwbkzehsraobtpjdvtldibszy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:03 np0005538515.localdomain sudo[54655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:03 np0005538515.localdomain python3[54657]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:03 np0005538515.localdomain sudo[54655]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:03 np0005538515.localdomain sudo[54698]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rytzuagzxlwgebbfhxhyxnmobnymeaja ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:03 np0005538515.localdomain sudo[54698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:04 np0005538515.localdomain python3[54700]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316923.4300764-85054-14470321130955/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:04 np0005538515.localdomain sudo[54698]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:04 np0005538515.localdomain sudo[54728]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwvygzahestuiutwbktfpubupmclbvea ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:04 np0005538515.localdomain sudo[54728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:04 np0005538515.localdomain python3[54730]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:02:04 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:02:04 np0005538515.localdomain systemd-rc-local-generator[54756]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:04 np0005538515.localdomain systemd-sysv-generator[54760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:04 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:02:05 np0005538515.localdomain systemd-rc-local-generator[54792]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:05 np0005538515.localdomain systemd-sysv-generator[54795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:05 np0005538515.localdomain systemd[1]: Starting TripleO Container Shutdown...
Nov 28 08:02:05 np0005538515.localdomain systemd[1]: Finished TripleO Container Shutdown.
Nov 28 08:02:05 np0005538515.localdomain sudo[54728]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:05 np0005538515.localdomain sudo[54851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayqbowaaphxhlhhuxdpvsiqltuccqhic ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:05 np0005538515.localdomain sudo[54851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:05 np0005538515.localdomain python3[54853]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:05 np0005538515.localdomain sudo[54851]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:05 np0005538515.localdomain sudo[54894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blvmolrvugoamaqhmazrswqixipndwgp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:05 np0005538515.localdomain sudo[54894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:06 np0005538515.localdomain python3[54896]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316925.3613842-85176-54340264996382/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:06 np0005538515.localdomain sudo[54894]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:06 np0005538515.localdomain sudo[54956]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpboskrwrsiekbxslskhxfniacsnmyoo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:06 np0005538515.localdomain sudo[54956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:06 np0005538515.localdomain python3[54958]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:02:06 np0005538515.localdomain sudo[54956]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:06 np0005538515.localdomain sudo[54999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njsuqzbozomzpwlguoksvaeszpxoikwo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:06 np0005538515.localdomain sudo[54999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:07 np0005538515.localdomain python3[55001]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316926.291196-85234-14471150599337/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:07 np0005538515.localdomain sudo[54999]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:07 np0005538515.localdomain sudo[55029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbxtqzoqdmzljlqkmxbxjgggrjyrndtz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:07 np0005538515.localdomain sudo[55029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:07 np0005538515.localdomain python3[55031]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:02:07 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:02:07 np0005538515.localdomain systemd-sysv-generator[55062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:07 np0005538515.localdomain systemd-rc-local-generator[55057]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:07 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:07 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:02:07 np0005538515.localdomain systemd-sysv-generator[55097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:07 np0005538515.localdomain systemd-rc-local-generator[55093]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:07 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:08 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:02:08 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:02:08 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:02:08 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:02:08 np0005538515.localdomain sudo[55029]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:08 np0005538515.localdomain sudo[55122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqqwreztrluqmutyskyhjkrvipsjiumv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:08 np0005538515.localdomain sudo[55122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 6e6d33b0e4909c73f2f7adca3bc870a0
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: d31718fcd17fdeee6489534105191c7a
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 18a2751501986164e709168f53ab57c8
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: f62921da3a3d0eed1be38a46b3ed6ac3
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 185ba876a5902dbf87b8591344afd39d
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 185ba876a5902dbf87b8591344afd39d
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 08c21dad54d1ba598c6e2fae6b853aba
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain python3[55124]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: bbb5ea37891e3118676a78b59837de90
Nov 28 08:02:08 np0005538515.localdomain sudo[55122]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:08 np0005538515.localdomain sudo[55138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izfvbkucldnredqzzoxeqkfqiffsqwhf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:08 np0005538515.localdomain sudo[55138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:09 np0005538515.localdomain sudo[55138]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:09 np0005538515.localdomain sudo[55178]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuahgzntocruvvwjsjlmpsldxbkkecqs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:09 np0005538515.localdomain sudo[55178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:09 np0005538515.localdomain python3[55180]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:02:10 np0005538515.localdomain podman[55218]: 2025-11-28 08:02:10.263095861 +0000 UTC m=+0.079726573 container create 325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, container_name=metrics_qdr_init_logs, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:02:10 np0005538515.localdomain systemd[1]: Started libpod-conmon-325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601.scope.
Nov 28 08:02:10 np0005538515.localdomain podman[55218]: 2025-11-28 08:02:10.217522983 +0000 UTC m=+0.034153725 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:02:10 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:02:10 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92a670f87546e9222dc3530777cbcbb6bd2a424665ad22aef150e174bea9c765/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:02:10 np0005538515.localdomain podman[55218]: 2025-11-28 08:02:10.360538218 +0000 UTC m=+0.177168930 container init 325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr_init_logs, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team)
Nov 28 08:02:10 np0005538515.localdomain podman[55218]: 2025-11-28 08:02:10.371757067 +0000 UTC m=+0.188387779 container start 325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044)
Nov 28 08:02:10 np0005538515.localdomain podman[55218]: 2025-11-28 08:02:10.372136489 +0000 UTC m=+0.188767211 container attach 325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, container_name=metrics_qdr_init_logs, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, release=1761123044)
Nov 28 08:02:10 np0005538515.localdomain systemd[1]: libpod-325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601.scope: Deactivated successfully.
Nov 28 08:02:10 np0005538515.localdomain podman[55218]: 2025-11-28 08:02:10.381601224 +0000 UTC m=+0.198231936 container died 325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, container_name=metrics_qdr_init_logs, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:02:10 np0005538515.localdomain podman[55237]: 2025-11-28 08:02:10.468826722 +0000 UTC m=+0.078311799 container cleanup 325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr_init_logs)
Nov 28 08:02:10 np0005538515.localdomain systemd[1]: libpod-conmon-325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601.scope: Deactivated successfully.
Nov 28 08:02:10 np0005538515.localdomain python3[55180]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Nov 28 08:02:10 np0005538515.localdomain podman[55312]: 2025-11-28 08:02:10.935399734 +0000 UTC m=+0.078553587 container create 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, release=1761123044, build-date=2025-11-18T22:49:46Z, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 08:02:10 np0005538515.localdomain systemd[1]: Started libpod-conmon-9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.scope.
Nov 28 08:02:10 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:02:10 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/876187b8bc68a02fc79261d7a49dfade5cc37ca730d23b4f758fcf788c522d06/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:02:10 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/876187b8bc68a02fc79261d7a49dfade5cc37ca730d23b4f758fcf788c522d06/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:02:10 np0005538515.localdomain podman[55312]: 2025-11-28 08:02:10.8955865 +0000 UTC m=+0.038740393 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:02:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:02:11 np0005538515.localdomain podman[55312]: 2025-11-28 08:02:11.025833529 +0000 UTC m=+0.168987372 container init 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 08:02:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:02:11 np0005538515.localdomain sudo[55333]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:02:11 np0005538515.localdomain sudo[55333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Nov 28 08:02:11 np0005538515.localdomain podman[55312]: 2025-11-28 08:02:11.058866498 +0000 UTC m=+0.202020341 container start 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:02:11 np0005538515.localdomain python3[55180]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6e6d33b0e4909c73f2f7adca3bc870a0 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 08:02:11 np0005538515.localdomain sudo[55333]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:11 np0005538515.localdomain podman[55334]: 2025-11-28 08:02:11.161914125 +0000 UTC m=+0.093662584 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:02:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-92a670f87546e9222dc3530777cbcbb6bd2a424665ad22aef150e174bea9c765-merged.mount: Deactivated successfully.
Nov 28 08:02:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-325abc01ba4485ffe3dd4f572ea163a2b9aaa7bcf66a88a3ab110fbd81332601-userdata-shm.mount: Deactivated successfully.
Nov 28 08:02:11 np0005538515.localdomain sudo[55178]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:11 np0005538515.localdomain podman[55334]: 2025-11-28 08:02:11.390161028 +0000 UTC m=+0.321909497 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:02:11 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:02:11 np0005538515.localdomain sudo[55405]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzkxrxyibsixcfwevswtrqrvlfbwypbi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:11 np0005538515.localdomain sudo[55405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:11 np0005538515.localdomain python3[55407]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:11 np0005538515.localdomain sudo[55405]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:11 np0005538515.localdomain sudo[55421]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dflotjhvshiulhemdoxubreoilzgqwyk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:11 np0005538515.localdomain sudo[55421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:11 np0005538515.localdomain python3[55423]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:02:11 np0005538515.localdomain sudo[55421]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:12 np0005538515.localdomain sudo[55482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpbhdjwvunvrhwvhxaeptbbrgeclmobm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:12 np0005538515.localdomain sudo[55482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:12 np0005538515.localdomain python3[55484]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316932.0419774-85354-154070737403857/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:12 np0005538515.localdomain sudo[55482]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:12 np0005538515.localdomain sudo[55498]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsnutnqatfjqhrkreouhkhuqaeezfyiz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:12 np0005538515.localdomain sudo[55498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:12 np0005538515.localdomain python3[55500]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:02:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:02:13 np0005538515.localdomain systemd-rc-local-generator[55521]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:13 np0005538515.localdomain systemd-sysv-generator[55526]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:13 np0005538515.localdomain sudo[55498]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:13 np0005538515.localdomain sudo[55549]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkstmxlrjijbjpxhiavrkhjwajgujwhd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:02:13 np0005538515.localdomain sudo[55549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:13 np0005538515.localdomain python3[55551]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:02:13 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:02:14 np0005538515.localdomain systemd-rc-local-generator[55578]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:02:14 np0005538515.localdomain systemd-sysv-generator[55583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:02:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:02:14 np0005538515.localdomain systemd[1]: Starting dnf makecache...
Nov 28 08:02:14 np0005538515.localdomain systemd[1]: Starting metrics_qdr container...
Nov 28 08:02:14 np0005538515.localdomain systemd[1]: Started metrics_qdr container.
Nov 28 08:02:14 np0005538515.localdomain sudo[55549]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:14 np0005538515.localdomain dnf[55591]: Updating Subscription Management repositories.
Nov 28 08:02:14 np0005538515.localdomain sudo[55630]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzitgzixqlbtpppgombkrswdwahxolfx ; /usr/bin/python3
Nov 28 08:02:14 np0005538515.localdomain sudo[55630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:14 np0005538515.localdomain python3[55632]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:14 np0005538515.localdomain sudo[55630]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:15 np0005538515.localdomain sudo[55678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqkghxbpwmotypnnvrnoinxewcbfyjgf ; /usr/bin/python3
Nov 28 08:02:15 np0005538515.localdomain sudo[55678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:15 np0005538515.localdomain sudo[55678]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:15 np0005538515.localdomain sudo[55721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdhotcklubqjcvdheiihsjkawjscjcux ; /usr/bin/python3
Nov 28 08:02:15 np0005538515.localdomain sudo[55721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:15 np0005538515.localdomain sudo[55721]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:15 np0005538515.localdomain sudo[55751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfgexqbqibamcksnwnjujnhbcabsprfd ; /usr/bin/python3
Nov 28 08:02:15 np0005538515.localdomain sudo[55751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:16 np0005538515.localdomain python3[55753]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005538515 step=1 update_config_hash_only=False
Nov 28 08:02:16 np0005538515.localdomain sudo[55751]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:16 np0005538515.localdomain dnf[55591]: Failed determining last makecache time.
Nov 28 08:02:16 np0005538515.localdomain sudo[55768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjtppnakgdpicrmhqyxmhwzwzzupolgi ; /usr/bin/python3
Nov 28 08:02:16 np0005538515.localdomain sudo[55768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:16 np0005538515.localdomain python3[55770]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:02:16 np0005538515.localdomain sudo[55768]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:16 np0005538515.localdomain sudo[55784]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkdakepvrvamhoejpluegszhsomdowxh ; /usr/bin/python3
Nov 28 08:02:16 np0005538515.localdomain sudo[55784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:02:17 np0005538515.localdomain python3[55786]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:02:17 np0005538515.localdomain sudo[55784]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:21 np0005538515.localdomain dnf[55591]: Fast Datapath for RHEL 9 x86_64 (RPMs)          793  B/s | 4.0 kB     00:05
Nov 28 08:02:23 np0005538515.localdomain sudo[55788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:02:23 np0005538515.localdomain sudo[55788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:02:23 np0005538515.localdomain sudo[55788]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:23 np0005538515.localdomain sudo[55803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:02:23 np0005538515.localdomain sudo[55803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:02:24 np0005538515.localdomain sudo[55803]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:24 np0005538515.localdomain dnf[55591]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_ 1.2 kB/s | 4.0 kB     00:03
Nov 28 08:02:25 np0005538515.localdomain dnf[55591]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  30 kB/s | 4.5 kB     00:00
Nov 28 08:02:25 np0005538515.localdomain dnf[55591]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  14 kB/s | 4.5 kB     00:00
Nov 28 08:02:29 np0005538515.localdomain dnf[55591]: Red Hat Enterprise Linux 9 for x86_64 - High Av 1.1 kB/s | 4.0 kB     00:03
Nov 28 08:02:29 np0005538515.localdomain dnf[55591]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   41 kB/s | 4.1 kB     00:00
Nov 28 08:02:30 np0005538515.localdomain dnf[55591]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS  3.2 kB/s | 4.1 kB     00:01
Nov 28 08:02:30 np0005538515.localdomain dnf[55591]: Metadata cache created.
Nov 28 08:02:31 np0005538515.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 08:02:31 np0005538515.localdomain systemd[1]: Finished dnf makecache.
Nov 28 08:02:31 np0005538515.localdomain systemd[1]: dnf-makecache.service: Consumed 2.911s CPU time.
Nov 28 08:02:35 np0005538515.localdomain sudo[55855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:02:35 np0005538515.localdomain sudo[55855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:02:35 np0005538515.localdomain sudo[55855]: pam_unix(sudo:session): session closed for user root
Nov 28 08:02:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:02:41 np0005538515.localdomain podman[55870]: 2025-11-28 08:02:41.980507567 +0000 UTC m=+0.084101925 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:02:42 np0005538515.localdomain podman[55870]: 2025-11-28 08:02:42.217541905 +0000 UTC m=+0.321136233 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr)
Nov 28 08:02:42 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:03:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:03:12 np0005538515.localdomain podman[55900]: 2025-11-28 08:03:12.967030257 +0000 UTC m=+0.078730412 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 28 08:03:13 np0005538515.localdomain podman[55900]: 2025-11-28 08:03:13.159385235 +0000 UTC m=+0.271085440 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:03:13 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:03:35 np0005538515.localdomain sudo[55930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:03:35 np0005538515.localdomain sudo[55930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:03:35 np0005538515.localdomain sudo[55930]: pam_unix(sudo:session): session closed for user root
Nov 28 08:03:35 np0005538515.localdomain sudo[55945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:03:35 np0005538515.localdomain sudo[55945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:03:36 np0005538515.localdomain sudo[55945]: pam_unix(sudo:session): session closed for user root
Nov 28 08:03:36 np0005538515.localdomain sudo[55991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:03:36 np0005538515.localdomain sudo[55991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:03:36 np0005538515.localdomain sudo[55991]: pam_unix(sudo:session): session closed for user root
Nov 28 08:03:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:03:43 np0005538515.localdomain systemd[1]: tmp-crun.LwrBSE.mount: Deactivated successfully.
Nov 28 08:03:43 np0005538515.localdomain podman[56006]: 2025-11-28 08:03:43.978204803 +0000 UTC m=+0.089933631 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, architecture=x86_64)
Nov 28 08:03:44 np0005538515.localdomain podman[56006]: 2025-11-28 08:03:44.200504955 +0000 UTC m=+0.312233773 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com)
Nov 28 08:03:44 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:04:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:04:14 np0005538515.localdomain systemd[1]: tmp-crun.8Z6lC3.mount: Deactivated successfully.
Nov 28 08:04:14 np0005538515.localdomain podman[56035]: 2025-11-28 08:04:14.980705371 +0000 UTC m=+0.086260487 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:04:15 np0005538515.localdomain podman[56035]: 2025-11-28 08:04:15.176421465 +0000 UTC m=+0.281976551 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:04:15 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:04:37 np0005538515.localdomain sudo[56064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:04:37 np0005538515.localdomain sudo[56064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:04:37 np0005538515.localdomain sudo[56064]: pam_unix(sudo:session): session closed for user root
Nov 28 08:04:37 np0005538515.localdomain sudo[56079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:04:37 np0005538515.localdomain sudo[56079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:04:37 np0005538515.localdomain sudo[56079]: pam_unix(sudo:session): session closed for user root
Nov 28 08:04:38 np0005538515.localdomain sudo[56125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:04:38 np0005538515.localdomain sudo[56125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:04:38 np0005538515.localdomain sudo[56125]: pam_unix(sudo:session): session closed for user root
Nov 28 08:04:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:04:45 np0005538515.localdomain podman[56140]: 2025-11-28 08:04:45.975760774 +0000 UTC m=+0.086256757 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 
17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:04:46 np0005538515.localdomain podman[56140]: 2025-11-28 08:04:46.194641627 +0000 UTC m=+0.305137580 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:04:46 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:05:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:05:17 np0005538515.localdomain podman[56170]: 2025-11-28 08:05:17.005439372 +0000 UTC m=+0.086691751 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:05:17 np0005538515.localdomain podman[56170]: 2025-11-28 08:05:17.196518926 +0000 UTC m=+0.277771355 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true)
Nov 28 08:05:17 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:05:30 np0005538515.localdomain sshd[56199]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:05:30 np0005538515.localdomain sshd[56200]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:05:30 np0005538515.localdomain sshd[56200]: error: kex_exchange_identification: read: Connection reset by peer
Nov 28 08:05:30 np0005538515.localdomain sshd[56200]: Connection reset by 45.140.17.97 port 27682
Nov 28 08:05:38 np0005538515.localdomain sudo[56202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:05:38 np0005538515.localdomain sudo[56202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:05:38 np0005538515.localdomain sudo[56202]: pam_unix(sudo:session): session closed for user root
Nov 28 08:05:38 np0005538515.localdomain sudo[56217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:05:38 np0005538515.localdomain sudo[56217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:05:39 np0005538515.localdomain sudo[56217]: pam_unix(sudo:session): session closed for user root
Nov 28 08:05:39 np0005538515.localdomain sudo[56263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:05:39 np0005538515.localdomain sudo[56263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:05:39 np0005538515.localdomain sudo[56263]: pam_unix(sudo:session): session closed for user root
Nov 28 08:05:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:05:47 np0005538515.localdomain podman[56278]: 2025-11-28 08:05:47.978647087 +0000 UTC m=+0.089827515 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:05:48 np0005538515.localdomain podman[56278]: 2025-11-28 08:05:48.158870572 +0000 UTC m=+0.270050940 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:05:48 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:06:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:06:18 np0005538515.localdomain podman[56308]: 2025-11-28 08:06:18.976099431 +0000 UTC m=+0.085246898 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Nov 28 08:06:19 np0005538515.localdomain podman[56308]: 2025-11-28 08:06:19.157524411 +0000 UTC m=+0.266671898 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1)
Nov 28 08:06:19 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:06:39 np0005538515.localdomain sudo[56338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:06:39 np0005538515.localdomain sudo[56338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:06:39 np0005538515.localdomain sudo[56338]: pam_unix(sudo:session): session closed for user root
Nov 28 08:06:39 np0005538515.localdomain sudo[56353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:06:39 np0005538515.localdomain sudo[56353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:06:40 np0005538515.localdomain sudo[56353]: pam_unix(sudo:session): session closed for user root
Nov 28 08:06:41 np0005538515.localdomain sudo[56401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:06:41 np0005538515.localdomain sudo[56401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:06:41 np0005538515.localdomain sudo[56401]: pam_unix(sudo:session): session closed for user root
Nov 28 08:06:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:06:49 np0005538515.localdomain systemd[1]: tmp-crun.FReXdP.mount: Deactivated successfully.
Nov 28 08:06:49 np0005538515.localdomain podman[56416]: 2025-11-28 08:06:49.977263797 +0000 UTC m=+0.088380318 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_id=tripleo_step1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12)
Nov 28 08:06:50 np0005538515.localdomain podman[56416]: 2025-11-28 08:06:50.175685737 +0000 UTC m=+0.286802298 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:06:50 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:06:51 np0005538515.localdomain sshd[56445]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:06:52 np0005538515.localdomain sshd[56445]: Invalid user sol from 80.94.92.186 port 33604
Nov 28 08:06:52 np0005538515.localdomain sshd[56445]: Connection closed by invalid user sol 80.94.92.186 port 33604 [preauth]
Nov 28 08:07:06 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [4,5,3] r=0 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:07:08 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 21 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [4,5,3] r=0 lpr=20 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:07:10 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,4,0] r=1 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:12 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [3,4,5] r=1 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:15 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,3,4] r=2 lpr=26 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:07:20 np0005538515.localdomain podman[56447]: 2025-11-28 08:07:20.97263299 +0000 UTC m=+0.078729099 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, distribution-scope=public, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr)
Nov 28 08:07:21 np0005538515.localdomain podman[56447]: 2025-11-28 08:07:21.163528355 +0000 UTC m=+0.269624444 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:07:21 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:07:29 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 32 pg[6.0( empty local-lis/les=0/0 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [0,4,2] r=1 lpr=32 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:07:29 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 33 pg[7.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [1,5,3] r=0 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:07:30 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 34 pg[7.0( empty local-lis/les=33/34 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [1,5,3] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:07:35 np0005538515.localdomain sudo[56478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:07:35 np0005538515.localdomain sudo[56478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:07:35 np0005538515.localdomain sudo[56478]: pam_unix(sudo:session): session closed for user root
Nov 28 08:07:37 np0005538515.localdomain sudo[56493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:07:37 np0005538515.localdomain sudo[56493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:07:37 np0005538515.localdomain sudo[56493]: pam_unix(sudo:session): session closed for user root
Nov 28 08:07:37 np0005538515.localdomain sudo[56508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:07:37 np0005538515.localdomain sudo[56508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:07:37 np0005538515.localdomain sudo[56508]: pam_unix(sudo:session): session closed for user root
Nov 28 08:07:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:07:51 np0005538515.localdomain systemd[1]: tmp-crun.6HtFSN.mount: Deactivated successfully.
Nov 28 08:07:51 np0005538515.localdomain podman[56523]: 2025-11-28 08:07:51.974249246 +0000 UTC m=+0.081695858 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_id=tripleo_step1, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, release=1761123044, architecture=x86_64)
Nov 28 08:07:52 np0005538515.localdomain podman[56523]: 2025-11-28 08:07:52.184854429 +0000 UTC m=+0.292301051 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible)
Nov 28 08:07:52 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:07:59 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 38 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=13.071694374s) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active pruub 1173.604248047s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:07:59 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 38 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=13.071694374s) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown pruub 1173.604248047s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.19( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.5( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.8( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.3( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.4( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.2( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.6( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.7( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.9( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.10( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.11( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.12( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.14( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.13( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.15( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.16( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.18( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.17( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.0( empty local-lis/les=38/39 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:00 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 39 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:01 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=15.074197769s) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 1177.627685547s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,5], acting [3,4,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:01 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 40 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40 pruub=13.075235367s) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active pruub 1175.630004883s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,0], acting [5,4,0] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:01 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=15.072606087s) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1177.627685547s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:01 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 40 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40 pruub=13.072304726s) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1175.630004883s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.18( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.4( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.2( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.3( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.5( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.6( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.7( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.8( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.b( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.a( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.c( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.d( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.e( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.10( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.f( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.11( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.13( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.15( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.12( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.15( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.14( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.17( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.16( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.19( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1a( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[4.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=1 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1d( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1b( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1f( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1e( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.1c( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 41 pg[3.9( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=1 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.0 scrub starts
Nov 28 08:08:02 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.0 scrub ok
Nov 28 08:08:02 np0005538515.localdomain sudo[56565]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkpqlxtuejpbuqwnjhpoylztqitxaelt ; /usr/bin/python3
Nov 28 08:08:02 np0005538515.localdomain sudo[56565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:02 np0005538515.localdomain python3[56567]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:02 np0005538515.localdomain sudo[56565]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:03 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 42 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42 pruub=15.902339935s) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active pruub 1180.485839844s@ mbc={}] start_peering_interval up [2,3,4] -> [2,3,4], acting [2,3,4] -> [2,3,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:03 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=13.960138321s) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 active pruub 1178.544433594s@ mbc={}] start_peering_interval up [0,4,2] -> [0,4,2], acting [0,4,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:03 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=13.955951691s) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.544433594s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:03 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 42 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42 pruub=15.897210121s) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1180.485839844s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.10( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.13( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.11( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.12( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.12( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.13( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.10( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1f( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.17( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.15( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.16( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.14( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.16( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.11( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.17( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.14( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.8( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.15( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.9( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.a( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.8( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.9( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.c( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.7( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.b( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.4( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.6( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.5( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.e( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.2( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.3( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.3( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.5( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.4( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.6( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.2( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.d( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1e( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.f( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1d( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1c( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.7( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.18( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1b( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.1a( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.19( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.19( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[6.1b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=1 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 43 pg[5.18( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=2 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Nov 28 08:08:04 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Nov 28 08:08:04 np0005538515.localdomain sudo[56581]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvoozhhozrwmhpnzizwmuddmezdbdqdo ; /usr/bin/python3
Nov 28 08:08:04 np0005538515.localdomain sudo[56581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:04 np0005538515.localdomain python3[56583]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:04 np0005538515.localdomain sudo[56581]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:05 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 44 pg[7.0( v 36'39 (0'0,36'39] local-lis/les=33/34 n=22 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44 pruub=12.875659943s) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 36'38 mlcod 36'38 active pruub 1184.027221680s@ mbc={}] start_peering_interval up [1,5,3] -> [1,5,3], acting [1,5,3] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:05 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 44 pg[7.0( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44 pruub=12.875659943s) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 36'38 mlcod 0'0 unknown pruub 1184.027221680s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.f( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.e( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.c( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.d( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.9( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.6( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.4( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.3( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.5( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.8( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.7( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.b( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.2( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.a( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.0( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 36'38 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 45 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=0 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Nov 28 08:08:06 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Nov 28 08:08:06 np0005538515.localdomain sudo[56597]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usviuvxpsyvdahspubvexianmgjbownh ; /usr/bin/python3
Nov 28 08:08:06 np0005538515.localdomain sudo[56597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:06 np0005538515.localdomain python3[56599]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:06 np0005538515.localdomain sudo[56597]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:10 np0005538515.localdomain sudo[56645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqhqawoonykiubmlnzkhsbzhybrpdkjk ; /usr/bin/python3
Nov 28 08:08:10 np0005538515.localdomain sudo[56645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:10 np0005538515.localdomain python3[56647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:10 np0005538515.localdomain sudo[56645]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:10 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.d deep-scrub starts
Nov 28 08:08:10 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.d deep-scrub ok
Nov 28 08:08:10 np0005538515.localdomain sudo[56688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttyybmqurshijwzgwkprzdkwhgxipimb ; /usr/bin/python3
Nov 28 08:08:10 np0005538515.localdomain sudo[56688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:10 np0005538515.localdomain python3[56690]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317289.9633682-92600-5183222995555/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=98ffd20e3b9db1cae39a950d9da1f69e92796658 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:10 np0005538515.localdomain sudo[56688]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:12 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Nov 28 08:08:12 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.11( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.16( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.10( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,5,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977840424s) [0,1,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792602539s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,2], acting [5,4,0] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.13( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830454826s) [4,5,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645263672s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,3], acting [2,3,4] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.14( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977735519s) [0,1,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830454826s) [4,5,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.645263672s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831009865s) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.646118164s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,0], acting [2,3,4] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831009865s) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.646118164s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.16( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825892448s) [3,2,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641235352s@ mbc={}] start_peering_interval up [0,4,2] -> [3,2,1], acting [0,4,2] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825799942s) [3,2,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641235352s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976946831s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792480469s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976910591s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792480469s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976987839s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792724609s@ mbc={}] start_peering_interval up [5,4,0] -> [2,1,0], acting [5,4,0] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976950645s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.980078697s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.795776367s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825941086s) [0,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641723633s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.8( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.980021477s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.795776367s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825866699s) [0,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641723633s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825681686s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641601562s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,5], acting [0,4,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828049660s) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643920898s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833456039s) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649414062s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.d( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825644493s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641601562s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833456039s) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.649414062s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828049660s) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.643920898s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828238487s) [2,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644287109s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,1], acting [2,3,4] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828203201s) [2,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.644287109s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.2( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977394104s) [2,4,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793579102s@ mbc={}] start_peering_interval up [5,4,0] -> [2,4,0], acting [5,4,0] -> [2,4,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977326393s) [2,4,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793579102s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832056046s) [4,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.648559570s@ mbc={}] start_peering_interval up [0,4,2] -> [4,0,2], acting [0,4,2] -> [4,0,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832056046s) [4,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.648559570s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831913948s) [0,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.648559570s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.f( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977123260s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793701172s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831879616s) [0,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.648559570s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977123260s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.793701172s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827555656s) [2,4,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644287109s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,0], acting [2,3,4] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.1c( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827482224s) [2,4,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.644287109s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832635880s) [4,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649536133s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832635880s) [4,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.649536133s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825080872s) [2,0,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.642089844s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,4], acting [2,3,4] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.826766968s) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643798828s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.19( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.826766968s) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.643798828s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825041771s) [2,0,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.642089844s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977373123s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794067383s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976415634s) [0,4,2] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793701172s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,2], acting [5,4,0] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.1b( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976384163s) [0,4,2] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793701172s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.977373123s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.794067383s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832133293s) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649658203s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832133293s) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.649658203s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976243973s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794067383s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.976193428s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794067383s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823978424s) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641967773s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832106590s) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650024414s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823978424s) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.641967773s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825306892s) [0,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643432617s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832106590s) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.650024414s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975559235s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793701172s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825263977s) [0,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.643432617s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975559235s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.793701172s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975432396s) [4,0,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793823242s@ mbc={}] start_peering_interval up [5,4,0] -> [4,0,5], acting [5,4,0] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832025528s) [4,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650390625s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.824101448s) [0,4,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.642456055s@ mbc={}] start_peering_interval up [2,3,4] -> [0,4,5], acting [2,3,4] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975432396s) [4,0,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.793823242s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832025528s) [4,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.650390625s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.824048996s) [0,4,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.642456055s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825001717s) [4,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643676758s@ mbc={}] start_peering_interval up [2,3,4] -> [4,0,2], acting [2,3,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825001717s) [4,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.643676758s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831455231s) [2,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650146484s@ mbc={}] start_peering_interval up [0,4,2] -> [2,1,3], acting [0,4,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831419945s) [2,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650146484s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825596809s) [4,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644287109s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825596809s) [4,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.644287109s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827201843s) [2,4,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645996094s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,3], acting [2,3,4] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.826667786s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645507812s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,5], acting [2,3,4] -> [0,1,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831649780s) [0,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650634766s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,2], acting [0,4,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827166557s) [2,4,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645996094s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.826630592s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645507812s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831589699s) [0,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650634766s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831677437s) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650756836s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975507736s) [0,1,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794677734s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,5], acting [5,4,0] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975453377s) [0,1,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794677734s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.826570511s) [4,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645751953s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.826570511s) [4,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.645751953s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831312180s) [5,1,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650756836s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,0], acting [0,4,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831275940s) [5,1,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650756836s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.822189331s) [3,5,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641845703s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,4], acting [0,4,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030885696s) [0,4,2] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.822154045s) [3,5,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641845703s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030819893s) [0,4,2] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968659401s) [2,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.788574219s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,1], acting [3,4,5] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968625069s) [2,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.788574219s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974594116s) [3,2,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794677734s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,4], acting [5,4,0] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030447006s) [3,5,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974533081s) [3,2,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794677734s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030405998s) [3,5,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831677437s) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.650756836s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967097282s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.787475586s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967097282s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.787475586s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853073120s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.204345703s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853013039s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.204345703s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.856533051s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.208007812s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.856444359s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.208007812s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.851760864s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.203857422s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.851696968s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.203857422s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.852090836s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.204223633s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.857297897s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.208618164s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.855573654s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.208129883s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.855964661s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.208618164s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.17( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,5,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.851481438s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.204223633s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.855342865s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.208129883s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.850686073s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.204101562s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.850599289s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.204101562s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830129623s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650634766s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830075264s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650634766s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854302406s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.208007812s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974110603s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794799805s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853870392s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.208007812s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029905319s) [2,4,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974070549s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794799805s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029850960s) [2,4,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968106270s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.788940430s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029664040s) [2,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,0,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029624939s) [2,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968106270s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.788940430s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968004227s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.789306641s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968004227s) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.789306641s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.5( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973605156s) [5,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794555664s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,0,2] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.1a( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,5,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.829088211s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650634766s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973084450s) [5,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794555664s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823925018s) [1,0,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645507812s@ mbc={}] start_peering_interval up [2,3,4] -> [1,0,2], acting [2,3,4] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.829054832s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650634766s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823860168s) [1,0,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645507812s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971156120s) [5,3,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792968750s@ mbc={}] start_peering_interval up [5,4,0] -> [5,3,4], acting [5,4,0] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028826714s) [5,4,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971123695s) [5,3,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792968750s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028784752s) [5,4,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968725204s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.790527344s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823025703s) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645019531s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,0], acting [2,3,4] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828413963s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650390625s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,3], acting [0,4,2] -> [5,1,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968675613s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.790527344s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823025703s) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.645019531s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828380585s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650390625s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967222214s) [5,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.789428711s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967191696s) [5,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.789428711s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027956009s) [1,5,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850219727s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027852058s) [1,5,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850219727s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968965530s) [2,3,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791503906s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,4], acting [3,4,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968930244s) [2,3,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791503906s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.822300911s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645019531s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,5], acting [2,3,4] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.822244644s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645019531s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827562332s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650390625s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.821415901s) [0,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644165039s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.821235657s) [0,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.644165039s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969354630s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792480469s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027047157s) [3,4,2] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850097656s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969314575s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792480469s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827419281s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650390625s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026980400s) [3,4,2] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850097656s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966959000s) [2,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.790039062s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966921806s) [2,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.790039062s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827233315s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650512695s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.827199936s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650512695s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967514992s) [4,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.790893555s@ mbc={}] start_peering_interval up [3,4,5] -> [4,5,0], acting [3,4,5] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970689774s) [5,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794067383s@ mbc={}] start_peering_interval up [5,4,0] -> [5,1,3], acting [5,4,0] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967514992s) [4,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.790893555s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026292801s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849853516s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970632553s) [5,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794067383s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970430374s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793945312s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026234627s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849853516s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026973724s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966407776s) [2,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.790039062s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970390320s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793945312s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966372490s) [2,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.790039062s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970038414s) [3,5,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793701172s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,1], acting [5,4,0] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026905060s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969979286s) [3,5,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793701172s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025691032s) [4,3,5] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849487305s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025691032s) [4,3,5] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1185.849487305s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967890739s) [1,5,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791870117s@ mbc={}] start_peering_interval up [3,4,5] -> [1,5,0], acting [3,4,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967849731s) [1,5,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791870117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825885773s) [3,4,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650024414s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825831413s) [3,4,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650024414s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.818105698s) [5,3,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.642333984s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966425896s) [0,5,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.790527344s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,1], acting [3,4,5] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.818045616s) [5,3,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.642333984s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026214600s) [3,2,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850585938s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966373444s) [0,5,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.790527344s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026171684s) [3,2,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850585938s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967329979s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791870117s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967271805s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791870117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025094986s) [4,2,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849853516s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966789246s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791503906s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,0], acting [3,4,5] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025094986s) [4,2,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1185.849853516s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025369644s) [1,0,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850219727s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966738701s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791503906s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025326729s) [1,0,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850219727s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969273567s) [3,5,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794067383s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,4], acting [5,4,0] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825081825s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.650146484s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024597168s) [3,2,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849487305s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969213486s) [3,5,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794067383s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024558067s) [3,2,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849487305s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024641991s) [3,5,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849853516s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965794563s) [0,5,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791015625s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024593353s) [3,5,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849853516s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.825022697s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.650146484s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.824617386s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649902344s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.815999985s) [3,4,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641723633s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.815927505s) [3,4,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641723633s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.824579239s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.649902344s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.817840576s) [3,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643188477s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,2], acting [2,3,4] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964966774s) [5,3,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791259766s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,4], acting [3,4,5] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964921951s) [5,3,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791259766s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.816976547s) [3,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.643188477s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823229790s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649658203s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967538834s) [2,0,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793945312s@ mbc={}] start_peering_interval up [5,4,0] -> [2,0,4], acting [5,4,0] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967495918s) [2,0,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793945312s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.823106766s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.649658203s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023468971s) [3,4,5] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.850341797s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023432732s) [3,4,5] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.850341797s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965711594s) [3,2,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792724609s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965720177s) [0,5,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791015625s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965650558s) [3,2,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964346886s) [4,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791503906s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,2], acting [3,4,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967173576s) [3,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794433594s@ mbc={}] start_peering_interval up [5,4,0] -> [3,4,5], acting [5,4,0] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.022138596s) [2,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849365234s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964346886s) [4,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.791503906s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967098236s) [3,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794433594s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.022038460s) [2,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849365234s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963990211s) [4,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791625977s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.021915436s) [5,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849487305s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963990211s) [4,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.791625977s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.021776199s) [5,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849487305s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.816356659s) [3,4,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644287109s@ mbc={}] start_peering_interval up [2,3,4] -> [3,4,2], acting [2,3,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.816319466s) [3,4,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.644287109s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.821342468s) [1,2,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649414062s@ mbc={}] start_peering_interval up [0,4,2] -> [1,2,3], acting [0,4,2] -> [1,2,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.821489334s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.649658203s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.821275711s) [1,2,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.649414062s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.821254730s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.649658203s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965167999s) [1,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793579102s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,3], acting [5,4,0] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965078354s) [1,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793579102s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.020599365s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849243164s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.020244598s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849243164s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.816335678s) [5,0,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645507812s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,4], acting [2,3,4] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.020224571s) [5,1,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849487305s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.816245079s) [5,0,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645507812s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.962602615s) [1,0,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791870117s@ mbc={}] start_peering_interval up [3,4,5] -> [1,0,2], acting [3,4,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963671684s) [0,1,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793212891s@ mbc={}] start_peering_interval up [3,4,5] -> [0,1,5], acting [3,4,5] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.962430954s) [1,0,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791870117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.814314842s) [0,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643798828s@ mbc={}] start_peering_interval up [2,3,4] -> [0,2,4], acting [2,3,4] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963333130s) [0,1,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793212891s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964268684s) [1,5,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.794433594s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,0], acting [5,4,0] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.019411087s) [5,1,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849487305s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964202881s) [1,5,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.794433594s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.813126564s) [1,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643798828s@ mbc={}] start_peering_interval up [2,3,4] -> [1,5,0], acting [2,3,4] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.813075066s) [1,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.643798828s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961013794s) [5,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791870117s@ mbc={}] start_peering_interval up [3,4,5] -> [5,4,3], acting [3,4,5] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.018561363s) [3,2,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849243164s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.018381119s) [3,2,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849243164s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960960388s) [5,4,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791870117s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.817350388s) [3,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.648437500s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,2], acting [0,4,2] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.817280769s) [3,1,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.648437500s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.017602921s) [2,4,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.848876953s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.017548561s) [2,4,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.848876953s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960121155s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791748047s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960069656s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791748047s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.812532425s) [3,5,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644287109s@ mbc={}] start_peering_interval up [2,3,4] -> [3,5,4], acting [2,3,4] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.812440872s) [3,5,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.644287109s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960086823s) [5,0,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791992188s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.017263412s) [2,0,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.849243164s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.017198563s) [2,0,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.849243164s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.959957123s) [5,0,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791992188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961544037s) [1,5,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793701172s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,3], acting [5,4,0] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961502075s) [1,5,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793701172s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960978508s) [3,1,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.793212891s@ mbc={}] start_peering_interval up [3,4,5] -> [3,1,5], acting [3,4,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.016392708s) [4,3,2] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.848754883s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.016392708s) [4,3,2] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1185.848754883s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.811491013s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.643920898s@ mbc={}] start_peering_interval up [2,3,4] -> [1,3,2], acting [2,3,4] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.811398506s) [1,3,2] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.643920898s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960863113s) [3,1,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.793212891s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.958947182s) [5,0,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791992188s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.015252113s) [5,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.848388672s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.808784485s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641967773s@ mbc={}] start_peering_interval up [0,4,2] -> [5,0,1], acting [0,4,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.015178680s) [5,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.848388672s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.958866119s) [5,0,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791992188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.013576508s) [2,4,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.848388672s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,3], acting [4,5,3] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.813790321s) [0,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.643798828s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.808738708s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641967773s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.957107544s) [5,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791992188s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,1], acting [3,4,5] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.013529778s) [2,4,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.848388672s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.957038879s) [5,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791992188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.810080528s) [3,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645263672s@ mbc={}] start_peering_interval up [2,3,4] -> [3,2,4], acting [2,3,4] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.810009956s) [3,2,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645263672s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.008739471s) [4,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.844116211s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956622124s) [0,5,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.791992188s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.008739471s) [4,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1185.844116211s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956571579s) [0,5,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.791992188s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.810297012s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645751953s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,1], acting [2,3,4] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.810237885s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645751953s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.957045555s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792724609s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956451416s) [4,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792114258s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956451416s) [4,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.792114258s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956871033s) [1,2,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792724609s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,0], acting [5,4,0] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956832886s) [1,2,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.809107780s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.645019531s@ mbc={}] start_peering_interval up [2,3,4] -> [5,1,3], acting [2,3,4] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.809051514s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.645019531s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.805438042s) [5,4,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641479492s@ mbc={}] start_peering_interval up [0,4,2] -> [5,4,0], acting [0,4,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.012266159s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.848266602s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.805400848s) [5,4,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641479492s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.012229919s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.848266602s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.955913544s) [3,2,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792114258s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956340790s) [1,3,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792602539s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,5], acting [5,4,0] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.808458328s) [1,2,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.644775391s@ mbc={}] start_peering_interval up [2,3,4] -> [1,2,0], acting [2,3,4] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.955863953s) [3,2,4] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792114258s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956303596s) [1,3,5] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792602539s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.808383942s) [1,2,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.644775391s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.011260033s) [1,5,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.847778320s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.011223793s) [1,5,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.847778320s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.804635048s) [5,3,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1189.641357422s@ mbc={}] start_peering_interval up [0,4,2] -> [5,3,4], acting [0,4,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.955525398s) [3,4,2] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792236328s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,2], acting [3,4,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.804577827s) [5,3,4] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.641357422s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.007039070s) [5,1,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.843750000s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.955476761s) [3,4,2] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792236328s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.006999969s) [5,1,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.843750000s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.955180168s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1187.792114258s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.955126762s) [0,4,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792114258s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.956981659s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.792724609s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.010449409s) [5,0,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1185.848022461s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.010396957s) [5,0,4] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.848022461s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.b( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.7( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 46 pg[7.d( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Nov 28 08:08:13 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.15( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.1b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.9( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,0,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.d( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.9( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,1,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.b( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.12( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,3,1] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.14( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,0,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.17( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,1,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.18( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.4( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.1f( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,1,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.2( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,5,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.4( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,5,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.b( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,1,5] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,1] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.1e( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.16( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.19( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,1,2] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.3( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.18( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.6( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.4( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[2.11( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,3,2] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.b( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.d( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.c( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[2.7( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,2,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[2.14( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[3.c( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[3.3( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,0,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,2] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.e( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[3.a( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[3.5( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[2.3( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,3,5] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.18( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.1b( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.1a( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.13( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,2,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.1d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.f( v 36'39 lc 36'1 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(1+2)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.3( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 mlcod 0'0 active+degraded m=2 mbc={255={(1+2)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[4.13( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.1a( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.18( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.f( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.2( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.1c( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.5( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.e( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.19( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,1] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.a( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.7( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,1,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.f( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.2( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.7( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[4.5( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,5,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.3( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,3,1] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.5( v 36'39 lc 36'11 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(1+2)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.7( v 36'39 lc 36'18 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.1( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[6.15( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[7.d( v 36'39 lc 36'13 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=0 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(1+2)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[6.1( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,1,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 46 pg[5.8( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,0,1] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.1c( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[2.2( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,0,2] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[4.a( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,0,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.d( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[2.8( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[5.16( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.13( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[2.1a( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,5,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[6.8( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,2,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.14( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[2.16( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[5.11( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,2,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.15( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.1f( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 47 pg[5.10( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.10( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,5,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[2.17( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,5,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[3.16( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,5] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[5.9( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,5,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[5.1b( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,0,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:14 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 47 pg[6.19( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,3,2] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.c scrub starts
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.800663948s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1196.204589844s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.800559998s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1196.204589844s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.804156303s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1196.208251953s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.804106712s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1196.208251953s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.804538727s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1196.209350586s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.803477287s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1196.208496094s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.804100990s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1196.209350586s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 48 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.803343773s) [3,5,1] r=2 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1196.208496094s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:15 np0005538515.localdomain sudo[56751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdvzmbexuuszvmxwnnteqxleramjydvn ; /usr/bin/python3
Nov 28 08:08:15 np0005538515.localdomain sudo[56751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:15 np0005538515.localdomain python3[56753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:15 np0005538515.localdomain sudo[56751]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:16 np0005538515.localdomain sudo[56794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjycksudxcrhuxnnjlwdjlrovnwhibyu ; /usr/bin/python3
Nov 28 08:08:16 np0005538515.localdomain sudo[56794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:16 np0005538515.localdomain python3[56796]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317295.4785428-92600-114468586821601/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=880d8421ed22fd6e089f5c7c842f51482074b0c0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:16 np0005538515.localdomain sudo[56794]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:20 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.8 deep-scrub starts
Nov 28 08:08:21 np0005538515.localdomain sudo[56856]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koearghtjjsmvspjdncrzlsywvtreeav ; /usr/bin/python3
Nov 28 08:08:21 np0005538515.localdomain sudo[56856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:21 np0005538515.localdomain python3[56858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:21 np0005538515.localdomain sudo[56856]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:21 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Nov 28 08:08:21 np0005538515.localdomain sudo[56899]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nitlttbxkicvtxxkyyrlbahpdomkxkso ; /usr/bin/python3
Nov 28 08:08:21 np0005538515.localdomain sudo[56899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:21 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Nov 28 08:08:21 np0005538515.localdomain python3[56901]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317300.9094434-92600-79932590728993/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=3f1634d98b90f8c800fba4d3a33fb1546a043fff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:21 np0005538515.localdomain sudo[56899]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:08:22 np0005538515.localdomain podman[56916]: 2025-11-28 08:08:22.977933337 +0000 UTC m=+0.084439330 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64)
Nov 28 08:08:23 np0005538515.localdomain podman[56916]: 2025-11-28 08:08:23.155501304 +0000 UTC m=+0.262007337 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, name=rhosp17/openstack-qdrouterd, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044)
Nov 28 08:08:23 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.f deep-scrub starts
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.779549599s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 active pruub 1199.847290039s@ mbc={255={}}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.779484749s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1199.847290039s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.774587631s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 active pruub 1199.842651367s@ mbc={255={}}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.774202347s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 active pruub 1199.842407227s@ mbc={255={}}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.774164200s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1199.842407227s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.774456978s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1199.842651367s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=14.769097328s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 active pruub 1199.837524414s@ mbc={255={}}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:23 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 50 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=14.769072533s) [3,4,2] r=1 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1199.837524414s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:25 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Nov 28 08:08:25 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Nov 28 08:08:25 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 52 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=4 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.558876038s) [0,1,2] r=1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1204.208496094s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:25 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 52 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=4 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.558786392s) [0,1,2] r=1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1204.208496094s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:25 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 52 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.559266090s) [0,1,2] r=1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1204.208618164s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:25 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 52 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.558312416s) [0,1,2] r=1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1204.208618164s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:25 np0005538515.localdomain sudo[56991]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsgehrsqyvpwujsowbstmxgmjmdthvrq ; /usr/bin/python3
Nov 28 08:08:25 np0005538515.localdomain sudo[56991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:26 np0005538515.localdomain python3[56993]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:26 np0005538515.localdomain sudo[56991]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:26 np0005538515.localdomain sudo[57036]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivgwxfwhkpiqnvwzqgwfbpxomqvleogi ; /usr/bin/python3
Nov 28 08:08:26 np0005538515.localdomain sudo[57036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:26 np0005538515.localdomain python3[57038]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317305.8341537-93172-12382793399610/source _original_basename=tmpj2x10uya follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:26 np0005538515.localdomain sudo[57036]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:27 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Nov 28 08:08:27 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Nov 28 08:08:27 np0005538515.localdomain sudo[57098]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsjsyougwokecqkxjujuonnjugrkedtr ; /usr/bin/python3
Nov 28 08:08:27 np0005538515.localdomain sudo[57098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:27 np0005538515.localdomain python3[57100]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:27 np0005538515.localdomain sudo[57098]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:28 np0005538515.localdomain sudo[57141]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jijvgtmqawyiimfmodndzpmwtragepbm ; /usr/bin/python3
Nov 28 08:08:28 np0005538515.localdomain sudo[57141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:28 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.d scrub starts
Nov 28 08:08:28 np0005538515.localdomain python3[57143]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317307.5709486-93261-107695719738191/source _original_basename=tmpob9jdbub follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:28 np0005538515.localdomain sudo[57141]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:28 np0005538515.localdomain sudo[57171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enqdqzqwighidwexxbpznntamclbimpg ; /usr/bin/python3
Nov 28 08:08:28 np0005538515.localdomain sudo[57171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:29 np0005538515.localdomain python3[57173]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Nov 28 08:08:29 np0005538515.localdomain crontab[57174]: (root) LIST (root)
Nov 28 08:08:29 np0005538515.localdomain crontab[57175]: (root) REPLACE (root)
Nov 28 08:08:29 np0005538515.localdomain sudo[57171]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:29 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.8 deep-scrub starts
Nov 28 08:08:29 np0005538515.localdomain sudo[57189]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgtgmzjdmdxowuourlncmukbswkoosyd ; /usr/bin/python3
Nov 28 08:08:29 np0005538515.localdomain sudo[57189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:29 np0005538515.localdomain python3[57191]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:08:29 np0005538515.localdomain sudo[57189]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:30 np0005538515.localdomain sudo[57239]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnlkfvwjkpsyqvwgjryriijocnqrqmuu ; /usr/bin/python3
Nov 28 08:08:30 np0005538515.localdomain sudo[57239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:30 np0005538515.localdomain sudo[57239]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:30 np0005538515.localdomain sudo[57257]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnebmfeaoujkzzpunoyxwwjwjhvhhqjk ; /usr/bin/python3
Nov 28 08:08:30 np0005538515.localdomain sudo[57257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:30 np0005538515.localdomain sudo[57257]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:31 np0005538515.localdomain sudo[57361]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auvdlswghwpvrzgrlfdtnarbhlmkzucf ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.6999605-93349-104102807179937/async_wrapper.py 289175210079 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.6999605-93349-104102807179937/AnsiballZ_command.py _
Nov 28 08:08:31 np0005538515.localdomain sudo[57361]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:08:31 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Nov 28 08:08:31 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Nov 28 08:08:31 np0005538515.localdomain ansible-async_wrapper.py[57363]: Invoked with 289175210079 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.6999605-93349-104102807179937/AnsiballZ_command.py _
Nov 28 08:08:31 np0005538515.localdomain ansible-async_wrapper.py[57366]: Starting module and watcher
Nov 28 08:08:31 np0005538515.localdomain ansible-async_wrapper.py[57366]: Start watching 57367 (3600)
Nov 28 08:08:31 np0005538515.localdomain ansible-async_wrapper.py[57367]: Start module (57367)
Nov 28 08:08:31 np0005538515.localdomain ansible-async_wrapper.py[57363]: Return async_wrapper task started.
Nov 28 08:08:31 np0005538515.localdomain sudo[57361]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:31 np0005538515.localdomain sudo[57382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scwkllilrwfmxlnsknrgmytlkjecbunv ; /usr/bin/python3
Nov 28 08:08:31 np0005538515.localdomain sudo[57382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:31 np0005538515.localdomain python3[57387]: ansible-ansible.legacy.async_status Invoked with jid=289175210079.57363 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:08:31 np0005538515.localdomain sudo[57382]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:32 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 54 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.456579208s) [2,0,4] r=2 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 active pruub 1207.847656250s@ mbc={255={}}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:32 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 54 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.456820488s) [2,0,4] r=2 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 active pruub 1207.847900391s@ mbc={255={}}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:32 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 54 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.456375122s) [2,0,4] r=2 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1207.847656250s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:32 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 54 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.456663132s) [2,0,4] r=2 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1207.847900391s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:33 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4047 writes, 19K keys, 4047 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4047 writes, 325 syncs, 12.45 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 642 writes, 2559 keys, 642 commit groups, 1.0 writes per commit group, ingest: 1.32 MB, 0.00 MB/s
                                                          Interval WAL: 642 writes, 119 syncs, 5.39 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 56 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=48/49 n=2 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.463187218s) [0,4,5] r=-1 lpr=56 pi=[48,56)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1214.438842773s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 56 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=48/49 n=2 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.463090897s) [0,4,5] r=-1 lpr=56 pi=[48,56)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1214.438842773s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 56 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.463495255s) [0,4,5] r=-1 lpr=56 pi=[48,56)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1214.439086914s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:34 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 56 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.463070869s) [0,4,5] r=-1 lpr=56 pi=[48,56)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1214.439086914s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    (file & line not available)
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    (file & line not available)
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.15 seconds
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Notice: Applied catalog in 0.04 seconds
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Application:
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    Initial environment: production
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    Converged environment: production
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:          Run mode: user
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Changes:
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Events:
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Resources:
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:             Total: 10
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Time:
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:          Schedule: 0.00
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:              File: 0.00
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:              Exec: 0.01
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:            Augeas: 0.01
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    Transaction evaluation: 0.03
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    Catalog application: 0.04
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:    Config retrieval: 0.19
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:          Last run: 1764317315
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:        Filebucket: 0.00
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:             Total: 0.04
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]: Version:
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:            Config: 1764317314
Nov 28 08:08:35 np0005538515.localdomain puppet-user[57386]:            Puppet: 7.10.0
Nov 28 08:08:35 np0005538515.localdomain ansible-async_wrapper.py[57367]: Module complete (57367)
Nov 28 08:08:35 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.a scrub starts
Nov 28 08:08:35 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.a scrub ok
Nov 28 08:08:36 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 56 pg[7.e( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56) [0,4,5] r=1 lpr=56 pi=[48,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:36 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 56 pg[7.6( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56) [0,4,5] r=1 lpr=56 pi=[48,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:36 np0005538515.localdomain ansible-async_wrapper.py[57366]: Done in kid B.
Nov 28 08:08:37 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Nov 28 08:08:37 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Nov 28 08:08:37 np0005538515.localdomain sudo[57498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:08:37 np0005538515.localdomain sudo[57498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:37 np0005538515.localdomain sudo[57498]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:37 np0005538515.localdomain sudo[57513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:08:37 np0005538515.localdomain sudo[57513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:38 np0005538515.localdomain sudo[57513]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:38 np0005538515.localdomain sudo[57548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:08:38 np0005538515.localdomain sudo[57548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:38 np0005538515.localdomain sudo[57548]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:08:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Cumulative writes: 4930 writes, 22K keys, 4930 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4930 writes, 382 syncs, 12.91 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1683 writes, 6347 keys, 1683 commit groups, 1.0 writes per commit group, ingest: 2.34 MB, 0.00 MB/s
                                                          Interval WAL: 1683 writes, 243 syncs, 6.93 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 08:08:38 np0005538515.localdomain sudo[57563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:08:38 np0005538515.localdomain sudo[57563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:39 np0005538515.localdomain sudo[57563]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:39 np0005538515.localdomain sudo[57609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:08:39 np0005538515.localdomain sudo[57609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:08:39 np0005538515.localdomain sudo[57609]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:40 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.1a deep-scrub starts
Nov 28 08:08:40 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Nov 28 08:08:40 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Nov 28 08:08:41 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Nov 28 08:08:41 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Nov 28 08:08:41 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Nov 28 08:08:41 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Nov 28 08:08:41 np0005538515.localdomain sudo[57637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsglhkburybvlmjvoyhiznesllfklhsu ; /usr/bin/python3
Nov 28 08:08:41 np0005538515.localdomain sudo[57637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:41 np0005538515.localdomain python3[57639]: ansible-ansible.legacy.async_status Invoked with jid=289175210079.57363 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:08:41 np0005538515.localdomain sudo[57637]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:42 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Nov 28 08:08:42 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.d scrub starts
Nov 28 08:08:42 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.d scrub ok
Nov 28 08:08:42 np0005538515.localdomain sudo[57653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftpeweqfaoyxiyaghsglqjqyndpqsnuq ; /usr/bin/python3
Nov 28 08:08:42 np0005538515.localdomain sudo[57653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:42 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Nov 28 08:08:42 np0005538515.localdomain python3[57655]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:08:42 np0005538515.localdomain sudo[57653]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:42 np0005538515.localdomain sudo[57669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaslmfbmpxfvdxlvowufgxpxfncjuccd ; /usr/bin/python3
Nov 28 08:08:42 np0005538515.localdomain sudo[57669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:42 np0005538515.localdomain python3[57671]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:08:42 np0005538515.localdomain sudo[57669]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 58 pg[7.7( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=0 lpr=58 pi=[50,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 58 pg[7.f( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=0 lpr=58 pi=[50,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 58 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.545802116s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1218.125732422s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 58 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.545705795s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1218.125732422s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 58 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.541893005s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1218.122192383s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 58 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.541290283s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1218.122192383s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Nov 28 08:08:43 np0005538515.localdomain sudo[57719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smykkctfmaefissynmrbtrqgvwbusfkp ; /usr/bin/python3
Nov 28 08:08:43 np0005538515.localdomain sudo[57719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:43 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Nov 28 08:08:43 np0005538515.localdomain python3[57721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:43 np0005538515.localdomain sudo[57719]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:43 np0005538515.localdomain sudo[57737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkzekonegzhqnsjvsyxmxxhwgqqpalwm ; /usr/bin/python3
Nov 28 08:08:43 np0005538515.localdomain sudo[57737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:43 np0005538515.localdomain python3[57739]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp_7vl7qvz recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:08:43 np0005538515.localdomain sudo[57737]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:44 np0005538515.localdomain sudo[57767]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwytqqcavoyzsptoqchdizmujgmammzu ; /usr/bin/python3
Nov 28 08:08:44 np0005538515.localdomain sudo[57767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:44 np0005538515.localdomain python3[57769]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:44 np0005538515.localdomain sudo[57767]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:44 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Nov 28 08:08:44 np0005538515.localdomain sudo[57783]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eadajkkbkxrycqcngskfzdcuojphhghr ; /usr/bin/python3
Nov 28 08:08:44 np0005538515.localdomain sudo[57783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:44 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 59 pg[7.f( v 36'39 lc 36'1 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=0 lpr=58 pi=[50,58)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(1+2)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:44 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 59 pg[7.7( v 36'39 lc 36'18 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=0 lpr=58 pi=[50,58)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:44 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Nov 28 08:08:45 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 60 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=9.056539536s) [3,4,5] r=-1 lpr=60 pi=[44,60)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1220.208984375s@ TIME_FOR_DEEP mbc={}] start_peering_interval up [1,5,3] -> [3,4,5], acting [1,5,3] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:45 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 60 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=9.056453705s) [3,4,5] r=-1 lpr=60 pi=[44,60)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1220.208984375s@ TIME_FOR_DEEP mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:45 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.9 deep-scrub starts
Nov 28 08:08:45 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.9 deep-scrub ok
Nov 28 08:08:45 np0005538515.localdomain sudo[57783]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:46 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Nov 28 08:08:46 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 60 pg[7.8( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60) [3,4,5] r=1 lpr=60 pi=[44,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:46 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Nov 28 08:08:46 np0005538515.localdomain sudo[57871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qssxxifiwkrcmoqylvsbuzsnzdodigeu ; /usr/bin/python3
Nov 28 08:08:46 np0005538515.localdomain sudo[57871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:46 np0005538515.localdomain python3[57873]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:08:46 np0005538515.localdomain sudo[57871]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:46 np0005538515.localdomain sudo[57890]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itxhvicjhsmnnjfgyfrokalfvkekvyxu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:46 np0005538515.localdomain sudo[57890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:47 np0005538515.localdomain python3[57892]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:47 np0005538515.localdomain sudo[57890]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:47 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 62 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=15.171545982s) [0,2,4] r=2 lpr=62 pi=[46,62)/1 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1223.838500977s@ mbc={}] start_peering_interval up [4,2,3] -> [0,2,4], acting [4,2,3] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:47 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 62 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=15.171466827s) [0,2,4] r=2 lpr=62 pi=[46,62)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.838500977s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:47 np0005538515.localdomain sudo[57906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftegwkpevvkthmwpfiqxkcuhcrnvdeov ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:47 np0005538515.localdomain sudo[57906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:47 np0005538515.localdomain sudo[57906]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:47 np0005538515.localdomain sudo[57922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjvbckrkallentltplbrsfetgyqbloas ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:47 np0005538515.localdomain sudo[57922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:47 np0005538515.localdomain python3[57924]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:08:47 np0005538515.localdomain sudo[57922]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:48 np0005538515.localdomain sudo[57972]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahvcufdbwgwsibujkoewbqpybtbswtfj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:48 np0005538515.localdomain sudo[57972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:48 np0005538515.localdomain python3[57974]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:48 np0005538515.localdomain sudo[57972]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:48 np0005538515.localdomain sudo[57990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmawcdffhdntjapszgplemqnvynriknl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:48 np0005538515.localdomain sudo[57990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:48 np0005538515.localdomain python3[57992]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:48 np0005538515.localdomain sudo[57990]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:49 np0005538515.localdomain sudo[58052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbupkmtfshsjvuddvznrpnrvmmrjbtlw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:49 np0005538515.localdomain sudo[58052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:49 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.11 deep-scrub starts
Nov 28 08:08:49 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.10 deep-scrub starts
Nov 28 08:08:49 np0005538515.localdomain python3[58054]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:49 np0005538515.localdomain sudo[58052]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:49 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 5.11 deep-scrub ok
Nov 28 08:08:49 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 64 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=15.202980042s) [2,0,4] r=-1 lpr=64 pi=[48,64)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1230.435668945s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:49 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 64 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=15.202869415s) [2,0,4] r=-1 lpr=64 pi=[48,64)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1230.435668945s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:49 np0005538515.localdomain sudo[58070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhgbeyhwjyenzfergzqtnmkdghgrpqio ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:49 np0005538515.localdomain sudo[58070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:49 np0005538515.localdomain python3[58072]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:49 np0005538515.localdomain sudo[58070]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:49 np0005538515.localdomain sudo[58132]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrvhraodxrgnvucosxtcvhlgfzzkcokv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:49 np0005538515.localdomain sudo[58132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:50 np0005538515.localdomain python3[58134]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:50 np0005538515.localdomain sudo[58132]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:50 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Nov 28 08:08:50 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Nov 28 08:08:50 np0005538515.localdomain sudo[58150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxspjhfrtvsqzrygafwhddmfipdqpzmv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:50 np0005538515.localdomain sudo[58150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:50 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 64 pg[7.a( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64) [2,0,4] r=2 lpr=64 pi=[48,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:50 np0005538515.localdomain python3[58152]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:50 np0005538515.localdomain sudo[58150]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:50 np0005538515.localdomain sudo[58212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmllxddtcmchihnlzxlpcvwhrfjmztrd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:50 np0005538515.localdomain sudo[58212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:50 np0005538515.localdomain python3[58214]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:50 np0005538515.localdomain sudo[58212]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:51 np0005538515.localdomain sudo[58230]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inllwxlwstsnfpyjjkeefwqpxdzuadcp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:51 np0005538515.localdomain sudo[58230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:51 np0005538515.localdomain python3[58232]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:51 np0005538515.localdomain sudo[58230]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:51 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 66 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=13.369278908s) [3,1,2] r=-1 lpr=66 pi=[50,66)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1226.125610352s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,2], acting [3,4,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:51 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 66 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=13.369094849s) [3,1,2] r=-1 lpr=66 pi=[50,66)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1226.125610352s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:51 np0005538515.localdomain sudo[58260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfjkagvszodrrcclajmzrrintddgmqic ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:51 np0005538515.localdomain sudo[58260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:51 np0005538515.localdomain python3[58262]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:08:51 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:08:51 np0005538515.localdomain systemd-rc-local-generator[58287]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:08:51 np0005538515.localdomain systemd-sysv-generator[58291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:08:51 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:08:52 np0005538515.localdomain sudo[58260]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:52 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Nov 28 08:08:52 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Nov 28 08:08:52 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 66 pg[7.b( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66) [3,1,2] r=1 lpr=66 pi=[50,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:52 np0005538515.localdomain sudo[58346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qynhtqnesyyjqumsaskngzvvhflycejd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:52 np0005538515.localdomain sudo[58346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:52 np0005538515.localdomain python3[58348]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:52 np0005538515.localdomain sudo[58346]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:52 np0005538515.localdomain sudo[58364]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iddvypbaxkohnatkoplmpsmhjzizagzz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:52 np0005538515.localdomain sudo[58364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:52 np0005538515.localdomain python3[58366]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:52 np0005538515.localdomain sudo[58364]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:53 np0005538515.localdomain sudo[58426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbmdiyowzlamxixhnszjsqrspsomyjfa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:53 np0005538515.localdomain sudo[58426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:53 np0005538515.localdomain python3[58428]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:08:53 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 68 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=52/53 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=13.951500893s) [1,3,2] r=0 lpr=68 pi=[52,68)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1233.277465820s@ mbc={}] start_peering_interval up [0,1,2] -> [1,3,2], acting [0,1,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:53 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 68 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=52/53 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=13.951500893s) [1,3,2] r=0 lpr=68 pi=[52,68)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown pruub 1233.277465820s@ mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:53 np0005538515.localdomain sudo[58426]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:53 np0005538515.localdomain sudo[58444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlubikrjhqwvxtrdqtrjdumvztadsufn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:08:53 np0005538515.localdomain sudo[58444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:53 np0005538515.localdomain systemd[1]: tmp-crun.68LuOY.mount: Deactivated successfully.
Nov 28 08:08:53 np0005538515.localdomain podman[58446]: 2025-11-28 08:08:53.613557648 +0000 UTC m=+0.101052051 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd)
Nov 28 08:08:53 np0005538515.localdomain python3[58447]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:08:53 np0005538515.localdomain sudo[58444]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:53 np0005538515.localdomain podman[58446]: 2025-11-28 08:08:53.868594021 +0000 UTC m=+0.356088474 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, version=17.1.12)
Nov 28 08:08:53 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:08:53 np0005538515.localdomain sudo[58504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmjaxcwtickytfrfedvpbbnhqczalxlw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:53 np0005538515.localdomain sudo[58504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:54 np0005538515.localdomain python3[58506]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:08:54 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:08:54 np0005538515.localdomain systemd-rc-local-generator[58533]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:08:54 np0005538515.localdomain systemd-sysv-generator[58537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:08:54 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 69 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=68/69 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68) [1,3,2] r=0 lpr=68 pi=[52,68)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:08:54 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:08:54 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:08:54 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:08:54 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:08:54 np0005538515.localdomain sudo[58504]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:54 np0005538515.localdomain sudo[58561]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdfakmisctpdvzvuoeaszqptbdjtlciq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:54 np0005538515.localdomain sudo[58561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:55 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Nov 28 08:08:55 np0005538515.localdomain python3[58563]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:08:55 np0005538515.localdomain sudo[58561]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:55 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Nov 28 08:08:55 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Nov 28 08:08:55 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 70 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70 pruub=10.772554398s) [1,3,5] r=-1 lpr=70 pi=[54,70)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1227.644409180s@ mbc={}] start_peering_interval up [2,0,4] -> [1,3,5], acting [2,0,4] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:08:55 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 70 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70 pruub=10.772466660s) [1,3,5] r=-1 lpr=70 pi=[54,70)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1227.644409180s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:08:55 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 70 pg[7.d( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70) [1,3,5] r=0 lpr=70 pi=[54,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 28 08:08:55 np0005538515.localdomain sudo[58577]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkmpkiuxwcvmmvmmmygbefsyzgpligbu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:55 np0005538515.localdomain sudo[58577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:56 np0005538515.localdomain sudo[58577]: pam_unix(sudo:session): session closed for user root
Nov 28 08:08:56 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 71 pg[7.d( v 36'39 lc 36'13 (0'0,36'39] local-lis/les=70/71 n=2 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70) [1,3,5] r=0 lpr=70 pi=[54,70)/1 crt=36'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+3)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 28 08:08:56 np0005538515.localdomain sudo[58619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofxaoxbmnbujwwsezvigomsrrrhzheme ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:08:56 np0005538515.localdomain sudo[58619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:08:57 np0005538515.localdomain python3[58621]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:08:57 np0005538515.localdomain podman[58687]: 2025-11-28 08:08:57.33739144 +0000 UTC m=+0.081405768 container create 2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.12, container_name=nova_compute_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step2, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true)
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: Started libpod-conmon-2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b.scope.
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:57 np0005538515.localdomain podman[58687]: 2025-11-28 08:08:57.290793441 +0000 UTC m=+0.034807849 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:08:57 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Nov 28 08:08:57 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25ffb2916f839ff1700adaff5cb29e97302d49ba8ff980d3124f389d659473a3/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:57 np0005538515.localdomain podman[58687]: 2025-11-28 08:08:57.404510308 +0000 UTC m=+0.148524656 container init 2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z)
Nov 28 08:08:57 np0005538515.localdomain podman[58710]: 2025-11-28 08:08:57.414477305 +0000 UTC m=+0.093908802 container create 8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: libpod-2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b.scope: Deactivated successfully.
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: Started libpod-conmon-8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496.scope.
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:57 np0005538515.localdomain podman[58710]: 2025-11-28 08:08:57.364482031 +0000 UTC m=+0.043913558 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:08:57 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0818f18800ef844a6a48c8a7ece9ae523d5aa6f809095a5eb180d408e8d636c4/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:57 np0005538515.localdomain podman[58687]: 2025-11-28 08:08:57.465885271 +0000 UTC m=+0.209899619 container start 2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=nova_compute_init_log, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:08:57 np0005538515.localdomain python3[58621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Nov 28 08:08:57 np0005538515.localdomain podman[58710]: 2025-11-28 08:08:57.47072541 +0000 UTC m=+0.150156937 container init 8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_virtqemud_init_logs, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, release=1761123044, config_id=tripleo_step2, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git)
Nov 28 08:08:57 np0005538515.localdomain podman[58710]: 2025-11-28 08:08:57.480076666 +0000 UTC m=+0.159508203 container start 8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_id=tripleo_step2, container_name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-nova-libvirt)
Nov 28 08:08:57 np0005538515.localdomain python3[58621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: libpod-8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496.scope: Deactivated successfully.
Nov 28 08:08:57 np0005538515.localdomain podman[58730]: 2025-11-28 08:08:57.510443088 +0000 UTC m=+0.072591978 container died 2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_compute_init_log, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, managed_by=tripleo_ansible)
Nov 28 08:08:57 np0005538515.localdomain podman[58730]: 2025-11-28 08:08:57.5401625 +0000 UTC m=+0.102311330 container cleanup 2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute_init_log, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, tcib_managed=true)
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: libpod-conmon-2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b.scope: Deactivated successfully.
Nov 28 08:08:57 np0005538515.localdomain podman[58753]: 2025-11-28 08:08:57.591725251 +0000 UTC m=+0.097371088 container died 8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step2, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 08:08:57 np0005538515.localdomain podman[58753]: 2025-11-28 08:08:57.61647096 +0000 UTC m=+0.122116717 container cleanup 8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step2, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:08:57 np0005538515.localdomain systemd[1]: libpod-conmon-8cd7b1e7d0a7d27d2f93a539023775fca4a5591861445449044b4e057c54b496.scope: Deactivated successfully.
Nov 28 08:08:57 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Nov 28 08:08:57 np0005538515.localdomain podman[58878]: 2025-11-28 08:08:57.980679102 +0000 UTC m=+0.071076862 container create 99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, container_name=create_virtlogd_wrapper, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 08:08:58 np0005538515.localdomain podman[58879]: 2025-11-28 08:08:58.002235912 +0000 UTC m=+0.086131702 container create b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, config_id=tripleo_step2, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:08:58 np0005538515.localdomain systemd[1]: Started libpod-conmon-99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8.scope.
Nov 28 08:08:58 np0005538515.localdomain systemd[1]: Started libpod-conmon-b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18.scope.
Nov 28 08:08:58 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:58 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16fa74df54e8af55fadaf755a0b31aeec0d57923866d5b313d9489d8d758e3e8/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:58 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:08:58 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bb70caa6e9f6f4d170c3a5868421bfc24d38542c14f6a27a1edf3bcfdc45d32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:08:58 np0005538515.localdomain podman[58878]: 2025-11-28 08:08:58.045286133 +0000 UTC m=+0.135683903 container init 99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']})
Nov 28 08:08:58 np0005538515.localdomain podman[58879]: 2025-11-28 08:08:58.048936155 +0000 UTC m=+0.132831925 container init b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=create_haproxy_wrapper, config_id=tripleo_step2, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:08:58 np0005538515.localdomain podman[58879]: 2025-11-28 08:08:57.952028223 +0000 UTC m=+0.035924003 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:08:58 np0005538515.localdomain podman[58878]: 2025-11-28 08:08:57.952342673 +0000 UTC m=+0.042740453 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:08:58 np0005538515.localdomain podman[58878]: 2025-11-28 08:08:58.054754264 +0000 UTC m=+0.145152054 container start 99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, container_name=create_virtlogd_wrapper, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:08:58 np0005538515.localdomain podman[58878]: 2025-11-28 08:08:58.05497898 +0000 UTC m=+0.145376750 container attach 99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:08:58 np0005538515.localdomain podman[58879]: 2025-11-28 08:08:58.05690819 +0000 UTC m=+0.140803940 container start b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, architecture=x86_64, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, container_name=create_haproxy_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com)
Nov 28 08:08:58 np0005538515.localdomain podman[58879]: 2025-11-28 08:08:58.057111986 +0000 UTC m=+0.141007806 container attach b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, config_id=tripleo_step2, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper)
Nov 28 08:08:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-25ffb2916f839ff1700adaff5cb29e97302d49ba8ff980d3124f389d659473a3-merged.mount: Deactivated successfully.
Nov 28 08:08:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b-userdata-shm.mount: Deactivated successfully.
Nov 28 08:08:59 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Nov 28 08:08:59 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Nov 28 08:08:59 np0005538515.localdomain ovs-vsctl[58986]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: libpod-99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8.scope: Deactivated successfully.
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: libpod-99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8.scope: Consumed 2.104s CPU time.
Nov 28 08:09:00 np0005538515.localdomain podman[59130]: 2025-11-28 08:09:00.222503626 +0000 UTC m=+0.039888295 container died 99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, com.redhat.component=openstack-nova-libvirt-container, container_name=create_virtlogd_wrapper, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, batch=17.1_20251118.1)
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: tmp-crun.8Bxigw.mount: Deactivated successfully.
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: tmp-crun.arSVSd.mount: Deactivated successfully.
Nov 28 08:09:00 np0005538515.localdomain podman[59130]: 2025-11-28 08:09:00.260949685 +0000 UTC m=+0.078334334 container cleanup 99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vcs-type=git, config_id=tripleo_step2, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: libpod-conmon-99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8.scope: Deactivated successfully.
Nov 28 08:09:00 np0005538515.localdomain python3[58621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Nov 28 08:09:00 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.c scrub starts
Nov 28 08:09:00 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.c scrub ok
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: libpod-b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18.scope: Deactivated successfully.
Nov 28 08:09:00 np0005538515.localdomain systemd[1]: libpod-b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18.scope: Consumed 2.092s CPU time.
Nov 28 08:09:00 np0005538515.localdomain podman[58879]: 2025-11-28 08:09:00.941265003 +0000 UTC m=+3.025160833 container died b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
architecture=x86_64, io.buildah.version=1.41.4, container_name=create_haproxy_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12)
Nov 28 08:09:01 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Nov 28 08:09:01 np0005538515.localdomain podman[59169]: 2025-11-28 08:09:01.043739426 +0000 UTC m=+0.090431225 container cleanup b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=create_haproxy_wrapper)
Nov 28 08:09:01 np0005538515.localdomain systemd[1]: libpod-conmon-b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18.scope: Deactivated successfully.
Nov 28 08:09:01 np0005538515.localdomain python3[58621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Nov 28 08:09:01 np0005538515.localdomain sudo[58619]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8bb70caa6e9f6f4d170c3a5868421bfc24d38542c14f6a27a1edf3bcfdc45d32-merged.mount: Deactivated successfully.
Nov 28 08:09:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7e80d2244e97078fdfa0b346326e9e74f6c05087f46da1752d168ebfe5e5e18-userdata-shm.mount: Deactivated successfully.
Nov 28 08:09:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-16fa74df54e8af55fadaf755a0b31aeec0d57923866d5b313d9489d8d758e3e8-merged.mount: Deactivated successfully.
Nov 28 08:09:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99a1ca315b8742a552d837e443e8b0abb0296bf95f2a35ef9ad3c6a89867d1c8-userdata-shm.mount: Deactivated successfully.
Nov 28 08:09:01 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Nov 28 08:09:01 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.e scrub starts
Nov 28 08:09:01 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.e scrub ok
Nov 28 08:09:01 np0005538515.localdomain sudo[59222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xngtmqgwvzpohlcqxadxfqbjyyoeugkk ; /usr/bin/python3
Nov 28 08:09:01 np0005538515.localdomain sudo[59222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:01 np0005538515.localdomain python3[59224]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:09:01 np0005538515.localdomain sudo[59222]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:02 np0005538515.localdomain sudo[59270]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfygmloyakiumirwdftarkeaeoblxain ; /usr/bin/python3
Nov 28 08:09:02 np0005538515.localdomain sudo[59270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:02 np0005538515.localdomain sudo[59270]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:02 np0005538515.localdomain sudo[59313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzehzjspcurxjjjnkylwtcscxtskowhv ; /usr/bin/python3
Nov 28 08:09:02 np0005538515.localdomain sudo[59313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:02 np0005538515.localdomain sudo[59313]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:03 np0005538515.localdomain sudo[59343]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjiqgvcyupleklwfeclgcrjrjhsnswzn ; /usr/bin/python3
Nov 28 08:09:03 np0005538515.localdomain sudo[59343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:03 np0005538515.localdomain python3[59345]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005538515 step=2 update_config_hash_only=False
Nov 28 08:09:03 np0005538515.localdomain sudo[59343]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:03 np0005538515.localdomain sudo[59359]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydqljunsenifwnaedlstmggsatyygvxm ; /usr/bin/python3
Nov 28 08:09:03 np0005538515.localdomain sudo[59359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:03 np0005538515.localdomain python3[59361]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:09:03 np0005538515.localdomain sudo[59359]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 72 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=56/57 n=1 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72 pruub=12.102886200s) [3,5,1] r=-1 lpr=72 pi=[56,72)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1237.636230469s@ mbc={}] start_peering_interval up [0,4,5] -> [3,5,1], acting [0,4,5] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:09:04 np0005538515.localdomain ceph-osd[33334]: osd.4 pg_epoch: 72 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=56/57 n=1 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72 pruub=12.102806091s) [3,5,1] r=-1 lpr=72 pi=[56,72)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1237.636230469s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:09:04 np0005538515.localdomain sudo[59375]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loymrulmxyybyyjomdfqhqplnklubnqq ; /usr/bin/python3
Nov 28 08:09:04 np0005538515.localdomain sudo[59375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:09:04 np0005538515.localdomain python3[59377]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:09:04 np0005538515.localdomain sudo[59375]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:04 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 72 pg[7.e( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72) [3,5,1] r=2 lpr=72 pi=[56,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Nov 28 08:09:05 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 74 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=58/58 les/c/f=59/59/0 sis=74 pruub=10.883349419s) [0,5,1] r=2 lpr=74 pi=[58,74)/1 crt=36'39 mlcod 0'0 active pruub 1242.462890625s@ mbc={255={}}] start_peering_interval up [1,5,3] -> [0,5,1], acting [1,5,3] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 08:09:05 np0005538515.localdomain ceph-osd[32393]: osd.1 pg_epoch: 74 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=58/58 les/c/f=59/59/0 sis=74 pruub=10.883151054s) [0,5,1] r=2 lpr=74 pi=[58,74)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1242.462890625s@ mbc={}] state<Start>: transitioning to Stray
Nov 28 08:09:08 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Nov 28 08:09:08 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Nov 28 08:09:08 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.a scrub starts
Nov 28 08:09:08 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.a scrub ok
Nov 28 08:09:11 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Nov 28 08:09:11 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Nov 28 08:09:14 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 4.a scrub starts
Nov 28 08:09:14 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 4.a scrub ok
Nov 28 08:09:15 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Nov 28 08:09:15 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Nov 28 08:09:15 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Nov 28 08:09:15 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Nov 28 08:09:16 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.c scrub starts
Nov 28 08:09:19 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Nov 28 08:09:19 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Nov 28 08:09:20 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.f scrub starts
Nov 28 08:09:20 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.f scrub ok
Nov 28 08:09:20 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Nov 28 08:09:20 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Nov 28 08:09:21 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.d scrub starts
Nov 28 08:09:21 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.d scrub ok
Nov 28 08:09:23 np0005538515.localdomain sshd[59378]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:09:23 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.1a deep-scrub starts
Nov 28 08:09:23 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.1a deep-scrub ok
Nov 28 08:09:23 np0005538515.localdomain sshd[59378]: Invalid user support from 78.128.112.74 port 53574
Nov 28 08:09:24 np0005538515.localdomain sshd[59378]: Connection closed by invalid user support 78.128.112.74 port 53574 [preauth]
Nov 28 08:09:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:09:24 np0005538515.localdomain systemd[1]: tmp-crun.rKxtLS.mount: Deactivated successfully.
Nov 28 08:09:24 np0005538515.localdomain podman[59380]: 2025-11-28 08:09:24.164275225 +0000 UTC m=+0.091620860 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:09:24 np0005538515.localdomain podman[59380]: 2025-11-28 08:09:24.383674226 +0000 UTC m=+0.311019831 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:09:24 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:09:25 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Nov 28 08:09:25 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Nov 28 08:09:25 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Nov 28 08:09:25 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Nov 28 08:09:26 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Nov 28 08:09:26 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Nov 28 08:09:28 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.19 deep-scrub starts
Nov 28 08:09:28 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 6.19 deep-scrub ok
Nov 28 08:09:28 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.f scrub starts
Nov 28 08:09:30 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.1a deep-scrub starts
Nov 28 08:09:30 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 2.1a deep-scrub ok
Nov 28 08:09:30 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Nov 28 08:09:30 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Nov 28 08:09:32 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Nov 28 08:09:32 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Nov 28 08:09:34 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Nov 28 08:09:34 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Nov 28 08:09:34 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Nov 28 08:09:34 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Nov 28 08:09:36 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.c scrub starts
Nov 28 08:09:36 np0005538515.localdomain ceph-osd[32393]: log_channel(cluster) log [DBG] : 7.c scrub ok
Nov 28 08:09:39 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Nov 28 08:09:39 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Nov 28 08:09:39 np0005538515.localdomain sudo[59408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:09:39 np0005538515.localdomain sudo[59408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:09:39 np0005538515.localdomain sudo[59408]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:39 np0005538515.localdomain sudo[59423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:09:39 np0005538515.localdomain sudo[59423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:09:40 np0005538515.localdomain sudo[59423]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:41 np0005538515.localdomain sudo[59470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:09:41 np0005538515.localdomain sudo[59470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:09:41 np0005538515.localdomain sudo[59470]: pam_unix(sudo:session): session closed for user root
Nov 28 08:09:41 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.c scrub starts
Nov 28 08:09:41 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.c scrub ok
Nov 28 08:09:42 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.d scrub starts
Nov 28 08:09:42 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.d scrub ok
Nov 28 08:09:43 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Nov 28 08:09:43 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Nov 28 08:09:46 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Nov 28 08:09:46 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Nov 28 08:09:48 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Nov 28 08:09:48 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Nov 28 08:09:50 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.e scrub starts
Nov 28 08:09:50 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.e scrub ok
Nov 28 08:09:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:09:54 np0005538515.localdomain podman[59485]: 2025-11-28 08:09:54.982870772 +0000 UTC m=+0.090365963 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible)
Nov 28 08:09:55 np0005538515.localdomain podman[59485]: 2025-11-28 08:09:55.179629537 +0000 UTC m=+0.287124728 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z)
Nov 28 08:09:55 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:09:58 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 7.1 deep-scrub starts
Nov 28 08:09:58 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 7.1 deep-scrub ok
Nov 28 08:10:02 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Nov 28 08:10:02 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Nov 28 08:10:03 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Nov 28 08:10:03 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Nov 28 08:10:12 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.1b deep-scrub starts
Nov 28 08:10:12 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 4.1b deep-scrub ok
Nov 28 08:10:13 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.f scrub starts
Nov 28 08:10:13 np0005538515.localdomain ceph-osd[33334]: log_channel(cluster) log [DBG] : 5.f scrub ok
Nov 28 08:10:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:10:25 np0005538515.localdomain podman[59515]: 2025-11-28 08:10:25.981979644 +0000 UTC m=+0.093638023 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, release=1761123044)
Nov 28 08:10:26 np0005538515.localdomain podman[59515]: 2025-11-28 08:10:26.229664912 +0000 UTC m=+0.341323241 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git, distribution-scope=public)
Nov 28 08:10:26 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:10:41 np0005538515.localdomain sudo[59545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:10:41 np0005538515.localdomain sudo[59545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:41 np0005538515.localdomain sudo[59545]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:41 np0005538515.localdomain sudo[59560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:10:41 np0005538515.localdomain sudo[59560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:42 np0005538515.localdomain systemd[1]: tmp-crun.1dyme5.mount: Deactivated successfully.
Nov 28 08:10:42 np0005538515.localdomain podman[59644]: 2025-11-28 08:10:42.218914964 +0000 UTC m=+0.110890892 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, architecture=x86_64, io.openshift.expose-services=, name=rhceph, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git)
Nov 28 08:10:42 np0005538515.localdomain podman[59644]: 2025-11-28 08:10:42.294658107 +0000 UTC m=+0.186634025 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, release=553, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 08:10:42 np0005538515.localdomain sudo[59560]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:42 np0005538515.localdomain sudo[59712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:10:42 np0005538515.localdomain sudo[59712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:42 np0005538515.localdomain sudo[59712]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:42 np0005538515.localdomain sudo[59727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:10:42 np0005538515.localdomain sudo[59727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:43 np0005538515.localdomain sudo[59727]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:43 np0005538515.localdomain sudo[59775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:10:43 np0005538515.localdomain sudo[59775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:10:43 np0005538515.localdomain sudo[59775]: pam_unix(sudo:session): session closed for user root
Nov 28 08:10:55 np0005538515.localdomain sshd[59790]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:10:56 np0005538515.localdomain sshd[59790]: Invalid user solana from 80.94.92.186 port 36880
Nov 28 08:10:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:10:56 np0005538515.localdomain systemd[1]: tmp-crun.yECbQz.mount: Deactivated successfully.
Nov 28 08:10:56 np0005538515.localdomain podman[59792]: 2025-11-28 08:10:56.866046083 +0000 UTC m=+0.124086354 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:10:56 np0005538515.localdomain sshd[59790]: Connection closed by invalid user solana 80.94.92.186 port 36880 [preauth]
Nov 28 08:10:57 np0005538515.localdomain podman[59792]: 2025-11-28 08:10:57.063378773 +0000 UTC m=+0.321419064 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 28 08:10:57 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:11:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:11:27 np0005538515.localdomain podman[59819]: 2025-11-28 08:11:27.973246488 +0000 UTC m=+0.083235769 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Nov 28 08:11:28 np0005538515.localdomain podman[59819]: 2025-11-28 08:11:28.172418747 +0000 UTC m=+0.282408028 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:11:28 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:11:44 np0005538515.localdomain sudo[59848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:11:44 np0005538515.localdomain sudo[59848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:11:44 np0005538515.localdomain sudo[59848]: pam_unix(sudo:session): session closed for user root
Nov 28 08:11:44 np0005538515.localdomain sudo[59863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:11:44 np0005538515.localdomain sudo[59863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:11:44 np0005538515.localdomain sudo[59863]: pam_unix(sudo:session): session closed for user root
Nov 28 08:11:45 np0005538515.localdomain sudo[59910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:11:45 np0005538515.localdomain sudo[59910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:11:45 np0005538515.localdomain sudo[59910]: pam_unix(sudo:session): session closed for user root
Nov 28 08:11:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:11:58 np0005538515.localdomain systemd[1]: tmp-crun.nZdOYV.mount: Deactivated successfully.
Nov 28 08:11:58 np0005538515.localdomain podman[59925]: 2025-11-28 08:11:58.996916966 +0000 UTC m=+0.095899633 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:11:59 np0005538515.localdomain podman[59925]: 2025-11-28 08:11:59.200547217 +0000 UTC m=+0.299529884 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, release=1761123044, name=rhosp17/openstack-qdrouterd)
Nov 28 08:11:59 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:12:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:12:29 np0005538515.localdomain podman[59955]: 2025-11-28 08:12:29.978429696 +0000 UTC m=+0.084568762 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr)
Nov 28 08:12:30 np0005538515.localdomain podman[59955]: 2025-11-28 08:12:30.2028306 +0000 UTC m=+0.308969626 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, container_name=metrics_qdr, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:12:30 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:12:45 np0005538515.localdomain sudo[59985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:12:45 np0005538515.localdomain sudo[59985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:12:45 np0005538515.localdomain sudo[59985]: pam_unix(sudo:session): session closed for user root
Nov 28 08:12:45 np0005538515.localdomain sudo[60000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:12:45 np0005538515.localdomain sudo[60000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:12:46 np0005538515.localdomain sudo[60000]: pam_unix(sudo:session): session closed for user root
Nov 28 08:12:46 np0005538515.localdomain sudo[60047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:12:46 np0005538515.localdomain sudo[60047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:12:46 np0005538515.localdomain sudo[60047]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:13:00 np0005538515.localdomain podman[60062]: 2025-11-28 08:13:00.973213994 +0000 UTC m=+0.085386366 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:13:01 np0005538515.localdomain podman[60062]: 2025-11-28 08:13:01.212593966 +0000 UTC m=+0.324766298 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd)
Nov 28 08:13:01 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:13:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:13:31 np0005538515.localdomain podman[60092]: 2025-11-28 08:13:31.979716769 +0000 UTC m=+0.080384298 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64)
Nov 28 08:13:32 np0005538515.localdomain podman[60092]: 2025-11-28 08:13:32.166833116 +0000 UTC m=+0.267500635 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd)
Nov 28 08:13:32 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:13:33 np0005538515.localdomain sudo[60165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-audurtbunimmhzzirxydfknkxgzsilst ; /usr/bin/python3
Nov 28 08:13:33 np0005538515.localdomain sudo[60165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:33 np0005538515.localdomain python3[60167]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:33 np0005538515.localdomain sudo[60165]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:34 np0005538515.localdomain sudo[60210]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eerdltvjpgppjyhsvpeuzeyqkelvmnsl ; /usr/bin/python3
Nov 28 08:13:34 np0005538515.localdomain sudo[60210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:34 np0005538515.localdomain python3[60212]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317613.46885-99577-195677168809152/source _original_basename=tmpyndwrb1i follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:34 np0005538515.localdomain sudo[60210]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:35 np0005538515.localdomain sudo[60240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iurptsdaiovomegzjmpzfbkjwskrpbzy ; /usr/bin/python3
Nov 28 08:13:35 np0005538515.localdomain sudo[60240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:35 np0005538515.localdomain python3[60242]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:13:35 np0005538515.localdomain sudo[60240]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:35 np0005538515.localdomain sudo[60290]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykyryabqewcwgixkerpgdijjflrykgsk ; /usr/bin/python3
Nov 28 08:13:35 np0005538515.localdomain sudo[60290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:35 np0005538515.localdomain sudo[60290]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:36 np0005538515.localdomain sudo[60308]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqqkzdcwnencgkjomcsqqrtkhskjcsij ; /usr/bin/python3
Nov 28 08:13:36 np0005538515.localdomain sudo[60308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:36 np0005538515.localdomain sudo[60308]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:36 np0005538515.localdomain sudo[60412]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbcblxjpvnhmewztvpxyvjmhuazxwnmo ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.4082067-99742-204519230252259/async_wrapper.py 939045046376 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.4082067-99742-204519230252259/AnsiballZ_command.py _
Nov 28 08:13:36 np0005538515.localdomain sudo[60412]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:13:36 np0005538515.localdomain ansible-async_wrapper.py[60414]: Invoked with 939045046376 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.4082067-99742-204519230252259/AnsiballZ_command.py _
Nov 28 08:13:36 np0005538515.localdomain ansible-async_wrapper.py[60417]: Starting module and watcher
Nov 28 08:13:36 np0005538515.localdomain ansible-async_wrapper.py[60417]: Start watching 60418 (3600)
Nov 28 08:13:36 np0005538515.localdomain ansible-async_wrapper.py[60418]: Start module (60418)
Nov 28 08:13:36 np0005538515.localdomain ansible-async_wrapper.py[60414]: Return async_wrapper task started.
Nov 28 08:13:36 np0005538515.localdomain sudo[60412]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:37 np0005538515.localdomain sudo[60436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fasffwahuxqbktohmxcgbdqnyafmpeoy ; /usr/bin/python3
Nov 28 08:13:37 np0005538515.localdomain sudo[60436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:37 np0005538515.localdomain python3[60438]: ansible-ansible.legacy.async_status Invoked with jid=939045046376.60414 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:13:37 np0005538515.localdomain sudo[60436]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    (file & line not available)
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    (file & line not available)
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.11 seconds
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Notice: Applied catalog in 0.03 seconds
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Application:
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    Initial environment: production
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    Converged environment: production
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:          Run mode: user
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Changes:
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Events:
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Resources:
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:             Total: 10
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Time:
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:          Schedule: 0.00
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:              File: 0.00
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:              Exec: 0.01
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:            Augeas: 0.01
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    Transaction evaluation: 0.02
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    Catalog application: 0.03
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:    Config retrieval: 0.14
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:          Last run: 1764317621
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:        Filebucket: 0.00
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:             Total: 0.04
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]: Version:
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:            Config: 1764317621
Nov 28 08:13:41 np0005538515.localdomain puppet-user[60433]:            Puppet: 7.10.0
Nov 28 08:13:41 np0005538515.localdomain ansible-async_wrapper.py[60418]: Module complete (60418)
Nov 28 08:13:41 np0005538515.localdomain ansible-async_wrapper.py[60417]: Done in kid B.
Nov 28 08:13:47 np0005538515.localdomain sudo[60551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:13:47 np0005538515.localdomain sudo[60551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:13:47 np0005538515.localdomain sudo[60551]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:47 np0005538515.localdomain sudo[60566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:13:47 np0005538515.localdomain sudo[60566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:13:47 np0005538515.localdomain sudo[60594]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcidcpyipephuuwjhvbdbkswxfvtrcce ; /usr/bin/python3
Nov 28 08:13:47 np0005538515.localdomain sudo[60594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:47 np0005538515.localdomain python3[60596]: ansible-ansible.legacy.async_status Invoked with jid=939045046376.60414 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:13:47 np0005538515.localdomain sudo[60594]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:47 np0005538515.localdomain sudo[60566]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:48 np0005538515.localdomain sudo[60645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivpdajxldneuwjthocpokuubbnsnzocn ; /usr/bin/python3
Nov 28 08:13:48 np0005538515.localdomain sudo[60645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:48 np0005538515.localdomain sudo[60641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:13:48 np0005538515.localdomain sudo[60641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:13:48 np0005538515.localdomain sudo[60641]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:48 np0005538515.localdomain python3[60657]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:13:48 np0005538515.localdomain sudo[60645]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:48 np0005538515.localdomain sudo[60672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxvdtaceyvnlosbcehslorhirzshtrfj ; /usr/bin/python3
Nov 28 08:13:48 np0005538515.localdomain sudo[60672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:48 np0005538515.localdomain python3[60674]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:13:48 np0005538515.localdomain sudo[60672]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:49 np0005538515.localdomain sudo[60722]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zglzewzeypqnagnqplktzbybwnhukudl ; /usr/bin/python3
Nov 28 08:13:49 np0005538515.localdomain sudo[60722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:49 np0005538515.localdomain python3[60724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:49 np0005538515.localdomain sudo[60722]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:49 np0005538515.localdomain sudo[60740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcbveblckzpeneyycpmvkzcnowemuxmx ; /usr/bin/python3
Nov 28 08:13:49 np0005538515.localdomain sudo[60740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:49 np0005538515.localdomain python3[60742]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmporglf9wi recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:13:49 np0005538515.localdomain sudo[60740]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:49 np0005538515.localdomain sudo[60770]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sejomjprekicqkrirahamghkawwrlrza ; /usr/bin/python3
Nov 28 08:13:49 np0005538515.localdomain sudo[60770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:50 np0005538515.localdomain python3[60772]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:50 np0005538515.localdomain sudo[60770]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:50 np0005538515.localdomain sudo[60786]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siltszqqyurhhnkjwhgfrqrmmjjxjwfe ; /usr/bin/python3
Nov 28 08:13:50 np0005538515.localdomain sudo[60786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:50 np0005538515.localdomain sudo[60786]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:51 np0005538515.localdomain sudo[60873]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bchixxotowvytxlvlhbwylhgjkzpraid ; /usr/bin/python3
Nov 28 08:13:51 np0005538515.localdomain sudo[60873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:51 np0005538515.localdomain python3[60875]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:13:51 np0005538515.localdomain sudo[60873]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:52 np0005538515.localdomain sudo[60892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzpnateounwjnnsqtwtogqaebgpkymlc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:52 np0005538515.localdomain sudo[60892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:52 np0005538515.localdomain python3[60894]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:52 np0005538515.localdomain sudo[60892]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:52 np0005538515.localdomain sudo[60908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvjlshsxiwmpzsiwsmlmihmthlsahtbs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:52 np0005538515.localdomain sudo[60908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:52 np0005538515.localdomain sudo[60908]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:53 np0005538515.localdomain sudo[60924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqgliwzhmvalurpfymirbwxydxhcvofw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:53 np0005538515.localdomain sudo[60924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:53 np0005538515.localdomain python3[60926]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:13:54 np0005538515.localdomain sudo[60924]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:54 np0005538515.localdomain sudo[60974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohmobarswlvhsdcbixmknwlpycfgehii ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:54 np0005538515.localdomain sudo[60974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:54 np0005538515.localdomain python3[60976]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:54 np0005538515.localdomain sudo[60974]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:54 np0005538515.localdomain sudo[60992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmykvklwijmbvwhiyjwalrzzlvsjexql ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:54 np0005538515.localdomain sudo[60992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:54 np0005538515.localdomain python3[60994]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:54 np0005538515.localdomain sudo[60992]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:55 np0005538515.localdomain sudo[61054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjqjcrnsmyasfnyogdnlnficsszbovyj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:55 np0005538515.localdomain sudo[61054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:55 np0005538515.localdomain python3[61056]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:55 np0005538515.localdomain sudo[61054]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:55 np0005538515.localdomain sudo[61072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjuizwcqvwccutsqjejncdwefscechio ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:55 np0005538515.localdomain sudo[61072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:55 np0005538515.localdomain python3[61074]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:55 np0005538515.localdomain sudo[61072]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:56 np0005538515.localdomain sudo[61134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzlncxlevnwcinukmxvkckkdekdzubnd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:56 np0005538515.localdomain sudo[61134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:56 np0005538515.localdomain python3[61136]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:56 np0005538515.localdomain sudo[61134]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:56 np0005538515.localdomain sudo[61152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyiwwhzqyrjmwkctcvktgoxdawfpeajr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:56 np0005538515.localdomain sudo[61152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:56 np0005538515.localdomain python3[61154]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:56 np0005538515.localdomain sudo[61152]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:56 np0005538515.localdomain sudo[61214]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlgovvnhkalehsxzqqbssjdiytfbshdh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:56 np0005538515.localdomain sudo[61214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:57 np0005538515.localdomain python3[61216]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:57 np0005538515.localdomain sudo[61214]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:57 np0005538515.localdomain sudo[61232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-liapbbbfwmshztqwrrvfkgfzgohjtvua ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:57 np0005538515.localdomain sudo[61232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:57 np0005538515.localdomain python3[61234]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:57 np0005538515.localdomain sudo[61232]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:57 np0005538515.localdomain sudo[61262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkqfzaiuwvastpwwqfpdwrhmclmbsfev ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:57 np0005538515.localdomain sudo[61262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:57 np0005538515.localdomain python3[61264]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:13:57 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:13:57 np0005538515.localdomain systemd-rc-local-generator[61287]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:13:57 np0005538515.localdomain systemd-sysv-generator[61292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:13:58 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:13:58 np0005538515.localdomain sudo[61262]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:58 np0005538515.localdomain sudo[61347]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vruzivyuqwmtalfzzvntpmrzkgljmpfm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:58 np0005538515.localdomain sudo[61347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:58 np0005538515.localdomain python3[61349]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:58 np0005538515.localdomain sudo[61347]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:58 np0005538515.localdomain sudo[61365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vejfvejrvysqhaefcdvbyxqszzyvrnvd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:58 np0005538515.localdomain sudo[61365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:59 np0005538515.localdomain python3[61367]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:59 np0005538515.localdomain sudo[61365]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:59 np0005538515.localdomain sudo[61427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znfmtjvytkozllbmgfadgbnjnzvfgsff ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:59 np0005538515.localdomain sudo[61427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:59 np0005538515.localdomain python3[61429]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:13:59 np0005538515.localdomain sudo[61427]: pam_unix(sudo:session): session closed for user root
Nov 28 08:13:59 np0005538515.localdomain sudo[61445]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovvbmmwqrmbqqalzaphetbtljrtpzxma ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:13:59 np0005538515.localdomain sudo[61445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:13:59 np0005538515.localdomain python3[61447]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:13:59 np0005538515.localdomain sudo[61445]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:00 np0005538515.localdomain sudo[61475]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwvhlpotjykuwrncneprdjgjdnbhuxwv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:00 np0005538515.localdomain sudo[61475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:00 np0005538515.localdomain python3[61477]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:00 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:00 np0005538515.localdomain systemd-rc-local-generator[61502]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:00 np0005538515.localdomain systemd-sysv-generator[61508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:00 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:14:00 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:14:00 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:14:00 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:14:01 np0005538515.localdomain sudo[61475]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:01 np0005538515.localdomain sudo[61535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teqmpklilozvtvanprqahlxvhytdfwkd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:01 np0005538515.localdomain sudo[61535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:01 np0005538515.localdomain python3[61537]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:14:01 np0005538515.localdomain sudo[61535]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:02 np0005538515.localdomain sudo[61551]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxaagfpmogvqlvynrpyjwynnyyqgixru ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:02 np0005538515.localdomain sudo[61551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:14:02 np0005538515.localdomain podman[61553]: 2025-11-28 08:14:02.433246278 +0000 UTC m=+0.073273833 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:14:02 np0005538515.localdomain podman[61553]: 2025-11-28 08:14:02.641625618 +0000 UTC m=+0.281653113 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 28 08:14:02 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:14:02 np0005538515.localdomain sudo[61551]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:03 np0005538515.localdomain sudo[61619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efojyehmovpmrqxlymfhwjsoziknrijm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:03 np0005538515.localdomain sudo[61619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:03 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:14:03 np0005538515.localdomain podman[61756]: 2025-11-28 08:14:03.948737581 +0000 UTC m=+0.063189213 container create 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 08:14:03 np0005538515.localdomain podman[61770]: 2025-11-28 08:14:03.983281215 +0000 UTC m=+0.088328829 container create 7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, tcib_managed=true, 
build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 08:14:03 np0005538515.localdomain systemd[1]: Started libpod-conmon-2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711.scope.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538515.localdomain podman[61756]: 2025-11-28 08:14:03.914857678 +0000 UTC m=+0.029309330 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f.scope.
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b290307da7690cf991f1186b07b34a264d1d07b861913129e99370229181e3a2/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain podman[61756]: 2025-11-28 08:14:04.027398513 +0000 UTC m=+0.141850135 container init 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b290307da7690cf991f1186b07b34a264d1d07b861913129e99370229181e3a2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b290307da7690cf991f1186b07b34a264d1d07b861913129e99370229181e3a2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain podman[61756]: 2025-11-28 08:14:04.034652292 +0000 UTC m=+0.149103924 container start 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3)
Nov 28 08:14:04 np0005538515.localdomain podman[61770]: 2025-11-28 08:14:03.94047905 +0000 UTC m=+0.045526734 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:14:04 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538515.localdomain podman[61786]: 2025-11-28 08:14:04.049546355 +0000 UTC m=+0.126331064 container create cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd)
Nov 28 08:14:04 np0005538515.localdomain sudo[61853]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:04 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:14:04 np0005538515.localdomain podman[61786]: 2025-11-28 08:14:04.011364415 +0000 UTC m=+0.088149144 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.scope.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb78a9787fbfdee8df647dff935d3e6e34a25076546a1ccbc8a68d8c48f6925c/merged/scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain podman[61822]: 2025-11-28 08:14:04.040051123 +0000 UTC m=+0.048555018 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:14:04 np0005538515.localdomain podman[61821]: 2025-11-28 08:14:04.041262972 +0000 UTC m=+0.047587568 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb78a9787fbfdee8df647dff935d3e6e34a25076546a1ccbc8a68d8c48f6925c/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538515.localdomain podman[61770]: 2025-11-28 08:14:04.155272403 +0000 UTC m=+0.260320037 container init 7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_statedir_owner, managed_by=tripleo_ansible, architecture=x86_64, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:14:04 np0005538515.localdomain podman[61770]: 2025-11-28 08:14:04.194502815 +0000 UTC m=+0.299550449 container start 7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_statedir_owner, batch=17.1_20251118.1)
Nov 28 08:14:04 np0005538515.localdomain podman[61770]: 2025-11-28 08:14:04.195127316 +0000 UTC m=+0.300174980 container attach 7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=nova_statedir_owner, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, description=Red 
Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: libpod-7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Queued start job for default target Main User Target.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Created slice User Application Slice.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Reached target Paths.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Reached target Timers.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Starting D-Bus User Message Bus Socket...
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Starting Create User's Volatile Files and Directories...
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Finished Create User's Volatile Files and Directories.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Reached target Sockets.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Reached target Basic System.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Reached target Main User Target.
Nov 28 08:14:04 np0005538515.localdomain systemd[61870]: Startup finished in 128ms.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started Session c1 of User root.
Nov 28 08:14:04 np0005538515.localdomain sudo[61853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:14:04 np0005538515.localdomain podman[61786]: 2025-11-28 08:14:04.33703471 +0000 UTC m=+0.413819589 container init cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:14:04 np0005538515.localdomain podman[61822]: 2025-11-28 08:14:04.339801038 +0000 UTC m=+0.348304893 container create 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64)
Nov 28 08:14:04 np0005538515.localdomain podman[61770]: 2025-11-28 08:14:04.348622767 +0000 UTC m=+0.453670481 container died 7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_statedir_owner, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:14:04 np0005538515.localdomain sudo[61913]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:14:04 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started Session c2 of User root.
Nov 28 08:14:04 np0005538515.localdomain sudo[61853]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope.
Nov 28 08:14:04 np0005538515.localdomain sudo[61913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538515.localdomain podman[61786]: 2025-11-28 08:14:04.407239474 +0000 UTC m=+0.484024223 container start cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d31718fcd17fdeee6489534105191c7a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain podman[61822]: 2025-11-28 08:14:04.44311381 +0000 UTC m=+0.451617675 container init 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Nov 28 08:14:04 np0005538515.localdomain sudo[61913]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain sudo[61961]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:04 np0005538515.localdomain sudo[61961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:04 np0005538515.localdomain podman[61895]: 2025-11-28 08:14:04.473344708 +0000 UTC m=+0.159927616 container cleanup 7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, url=https://www.redhat.com, container_name=nova_statedir_owner, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: libpod-conmon-7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Nov 28 08:14:04 np0005538515.localdomain podman[61821]: 2025-11-28 08:14:04.492864336 +0000 UTC m=+0.499188892 container create a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, container_name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:14:04 np0005538515.localdomain podman[61822]: 2025-11-28 08:14:04.506822339 +0000 UTC m=+0.515326204 container start 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, 
managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12)
Nov 28 08:14:04 np0005538515.localdomain sudo[61961]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:04 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f62921da3a3d0eed1be38a46b3ed6ac3 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1.scope.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: libpod-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/558adb40dc3f0c457c124ec6699b165daa74a355f52d98e7436d696b86369c63/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain podman[61999]: 2025-11-28 08:14:04.592180322 +0000 UTC m=+0.048659232 container died 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:04 np0005538515.localdomain podman[61821]: 2025-11-28 08:14:04.614386765 +0000 UTC m=+0.620711331 container init a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, container_name=ceilometer_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:14:04 np0005538515.localdomain podman[61821]: 2025-11-28 08:14:04.622465661 +0000 UTC m=+0.628790227 container start a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, vcs-type=git, container_name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:14:04 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: libpod-a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain podman[61919]: 2025-11-28 08:14:04.634906256 +0000 UTC m=+0.252257592 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 28 08:14:04 np0005538515.localdomain podman[62059]: 2025-11-28 08:14:04.688536915 +0000 UTC m=+0.045149681 container died a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 28 08:14:04 np0005538515.localdomain podman[61999]: 2025-11-28 08:14:04.722555352 +0000 UTC m=+0.179034212 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, url=https://www.redhat.com, vcs-type=git, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: libpod-conmon-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain podman[62107]: 2025-11-28 08:14:04.797744284 +0000 UTC m=+0.076202176 container create f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc.)
Nov 28 08:14:04 np0005538515.localdomain podman[61919]: 2025-11-28 08:14:04.819270205 +0000 UTC m=+0.436621541 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b.scope.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain podman[62107]: 2025-11-28 08:14:04.753380608 +0000 UTC m=+0.031838500 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:04 np0005538515.localdomain podman[62059]: 2025-11-28 08:14:04.86711077 +0000 UTC m=+0.223723566 container cleanup a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ceilometer_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: libpod-conmon-a2af07c07192c0af15d6c45b3388461a125bcda5da66447a986ee6fedd37e9e1.scope: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d65302dd3a585cea223ca3e05b9a858698ed3b54cf3bbf51971fe5feba8f16c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d65302dd3a585cea223ca3e05b9a858698ed3b54cf3bbf51971fe5feba8f16c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d65302dd3a585cea223ca3e05b9a858698ed3b54cf3bbf51971fe5feba8f16c/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d65302dd3a585cea223ca3e05b9a858698ed3b54cf3bbf51971fe5feba8f16c/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:04 np0005538515.localdomain podman[62107]: 2025-11-28 08:14:04.933508834 +0000 UTC m=+0.211966696 container init f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Nov 28 08:14:04 np0005538515.localdomain podman[62107]: 2025-11-28 08:14:04.956478351 +0000 UTC m=+0.234936233 container start f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1)
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b290307da7690cf991f1186b07b34a264d1d07b861913129e99370229181e3a2-merged.mount: Deactivated successfully.
Nov 28 08:14:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ab9dfcc9475e410a0c49d77a88cf9c95c3c1b4ce14ef438a666bad57bd45d0f-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:05 np0005538515.localdomain podman[62201]: 2025-11-28 08:14:05.331707267 +0000 UTC m=+0.092494771 container create c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12)
Nov 28 08:14:05 np0005538515.localdomain systemd[1]: Started libpod-conmon-c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808.scope.
Nov 28 08:14:05 np0005538515.localdomain podman[62201]: 2025-11-28 08:14:05.285542535 +0000 UTC m=+0.046330079 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:05 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:05 np0005538515.localdomain podman[62201]: 2025-11-28 08:14:05.42021747 +0000 UTC m=+0.181004974 container init c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, container_name=nova_virtsecretd, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:14:05 np0005538515.localdomain podman[62201]: 2025-11-28 08:14:05.432677085 +0000 UTC m=+0.193464559 container start c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., container_name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3)
Nov 28 08:14:05 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:05 np0005538515.localdomain sudo[62221]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:05 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:05 np0005538515.localdomain systemd[1]: Started Session c3 of User root.
Nov 28 08:14:05 np0005538515.localdomain sudo[62221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:05 np0005538515.localdomain sudo[62221]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:05 np0005538515.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Nov 28 08:14:05 np0005538515.localdomain podman[62334]: 2025-11-28 08:14:05.958689246 +0000 UTC m=+0.092896623 container create 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtnodedevd, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1)
Nov 28 08:14:06 np0005538515.localdomain podman[62334]: 2025-11-28 08:14:05.910038475 +0000 UTC m=+0.044245892 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started libpod-conmon-490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436.scope.
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain podman[62334]: 2025-11-28 08:14:06.056131653 +0000 UTC m=+0.190339010 container init 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtnodedevd, config_id=tripleo_step3)
Nov 28 08:14:06 np0005538515.localdomain podman[62334]: 2025-11-28 08:14:06.065787368 +0000 UTC m=+0.199994725 container start 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, release=1761123044, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, vcs-type=git, url=https://www.redhat.com)
Nov 28 08:14:06 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:06 np0005538515.localdomain sudo[62377]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:06 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:06 np0005538515.localdomain podman[62356]: 2025-11-28 08:14:06.108508622 +0000 UTC m=+0.189749822 container create 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true)
Nov 28 08:14:06 np0005538515.localdomain podman[62356]: 2025-11-28 08:14:06.013049447 +0000 UTC m=+0.094290687 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started Session c4 of User root.
Nov 28 08:14:06 np0005538515.localdomain sudo[62377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started libpod-conmon-9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.scope.
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e24aa22dcc3c3aeaf326f993725e399e8f3215a32c5fb5c28a2698bed898907/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e24aa22dcc3c3aeaf326f993725e399e8f3215a32c5fb5c28a2698bed898907/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:14:06 np0005538515.localdomain podman[62356]: 2025-11-28 08:14:06.207495386 +0000 UTC m=+0.288736576 container init 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:14:06 np0005538515.localdomain sudo[62377]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:14:06 np0005538515.localdomain sudo[62411]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:06 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started Session c5 of User root.
Nov 28 08:14:06 np0005538515.localdomain sudo[62411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:06 np0005538515.localdomain podman[62356]: 2025-11-28 08:14:06.305043026 +0000 UTC m=+0.386284216 container start 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, tcib_managed=true, architecture=x86_64, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:14:06 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=18a2751501986164e709168f53ab57c8 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 08:14:06 np0005538515.localdomain podman[62431]: 2025-11-28 08:14:06.356057162 +0000 UTC m=+0.097665505 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid)
Nov 28 08:14:06 np0005538515.localdomain sudo[62411]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Nov 28 08:14:06 np0005538515.localdomain kernel: Loading iSCSI transport class v2.0-870.
Nov 28 08:14:06 np0005538515.localdomain podman[62431]: 2025-11-28 08:14:06.441612071 +0000 UTC m=+0.183220374 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, 
container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:14:06 np0005538515.localdomain podman[62431]: unhealthy
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Failed with result 'exit-code'.
Nov 28 08:14:06 np0005538515.localdomain podman[62523]: 2025-11-28 08:14:06.802304876 +0000 UTC m=+0.083958690 container create 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started libpod-conmon-77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0.scope.
Nov 28 08:14:06 np0005538515.localdomain podman[62523]: 2025-11-28 08:14:06.753512291 +0000 UTC m=+0.035166165 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:06 np0005538515.localdomain podman[62523]: 2025-11-28 08:14:06.883199759 +0000 UTC m=+0.164853593 container init 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtstoraged, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt)
Nov 28 08:14:06 np0005538515.localdomain podman[62523]: 2025-11-28 08:14:06.893148464 +0000 UTC m=+0.174802298 container start 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt)
Nov 28 08:14:06 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:06 np0005538515.localdomain sudo[62543]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:06 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:06 np0005538515.localdomain systemd[1]: Started Session c6 of User root.
Nov 28 08:14:06 np0005538515.localdomain sudo[62543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:07 np0005538515.localdomain sudo[62543]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:07 np0005538515.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Nov 28 08:14:07 np0005538515.localdomain podman[62624]: 2025-11-28 08:14:07.398139639 +0000 UTC m=+0.085216171 container create 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, tcib_managed=true, container_name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 28 08:14:07 np0005538515.localdomain systemd[1]: Started libpod-conmon-929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432.scope.
Nov 28 08:14:07 np0005538515.localdomain podman[62624]: 2025-11-28 08:14:07.358972069 +0000 UTC m=+0.046048601 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:07 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:07 np0005538515.localdomain podman[62624]: 2025-11-28 08:14:07.506348677 +0000 UTC m=+0.193425229 container init 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:14:07 np0005538515.localdomain podman[62624]: 2025-11-28 08:14:07.51593836 +0000 UTC m=+0.203014902 container start 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:14:07 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:07 np0005538515.localdomain sudo[62643]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:07 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:07 np0005538515.localdomain systemd[1]: Started Session c7 of User root.
Nov 28 08:14:07 np0005538515.localdomain sudo[62643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:07 np0005538515.localdomain sudo[62643]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:07 np0005538515.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Nov 28 08:14:08 np0005538515.localdomain podman[62728]: 2025-11-28 08:14:08.041324222 +0000 UTC m=+0.106889297 container create 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3)
Nov 28 08:14:08 np0005538515.localdomain systemd[1]: Started libpod-conmon-7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657.scope.
Nov 28 08:14:08 np0005538515.localdomain podman[62728]: 2025-11-28 08:14:07.991746942 +0000 UTC m=+0.057312027 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:08 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:08 np0005538515.localdomain podman[62728]: 2025-11-28 08:14:08.146310867 +0000 UTC m=+0.211875932 container init 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:35:22Z, container_name=nova_virtproxyd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:14:08 np0005538515.localdomain podman[62728]: 2025-11-28 08:14:08.156344025 +0000 UTC m=+0.221909090 container start 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=nova_virtproxyd, vcs-type=git, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 
'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']})
Nov 28 08:14:08 np0005538515.localdomain python3[61621]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:14:08 np0005538515.localdomain sudo[62747]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:08 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:14:08 np0005538515.localdomain systemd[1]: Started Session c8 of User root.
Nov 28 08:14:08 np0005538515.localdomain sudo[62747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:08 np0005538515.localdomain sudo[62747]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:08 np0005538515.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Nov 28 08:14:08 np0005538515.localdomain sudo[61619]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:08 np0005538515.localdomain sudo[62806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxdzdunneshmwuncnqpfishjsnyanqkr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:08 np0005538515.localdomain sudo[62806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:08 np0005538515.localdomain python3[62808]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:08 np0005538515.localdomain sudo[62806]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:08 np0005538515.localdomain sudo[62822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcojehogrcwrjfgcqzmasivumztjujrz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:08 np0005538515.localdomain sudo[62822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538515.localdomain python3[62824]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:09 np0005538515.localdomain sudo[62822]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538515.localdomain sudo[62838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmrkwinofyszdakquelvwoucszngzlye ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538515.localdomain sudo[62838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538515.localdomain python3[62840]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:09 np0005538515.localdomain sudo[62838]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538515.localdomain sudo[62854]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oiqfdvbrfsulgdkygiuoqzeavflmvept ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538515.localdomain sudo[62854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538515.localdomain python3[62856]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:09 np0005538515.localdomain sudo[62854]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538515.localdomain sudo[62870]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvlsbqvmdllgjejzuizdeurjrbwigbpx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538515.localdomain sudo[62870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:09 np0005538515.localdomain python3[62872]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:09 np0005538515.localdomain sudo[62870]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:09 np0005538515.localdomain sudo[62886]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcqntjrmluqgwwgzrfuqdbuzjvuwhrjt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:09 np0005538515.localdomain sudo[62886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538515.localdomain python3[62888]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:10 np0005538515.localdomain sudo[62886]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538515.localdomain sudo[62902]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfzyqtamfgbwwnvxpzpwmxukclckvzld ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538515.localdomain sudo[62902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538515.localdomain python3[62904]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:10 np0005538515.localdomain sudo[62902]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538515.localdomain sudo[62918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znydjoukjwljcxcymanolumppvczlysh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538515.localdomain sudo[62918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538515.localdomain python3[62920]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:10 np0005538515.localdomain sudo[62918]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538515.localdomain sudo[62934]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmslfyfhljgqsbkvzkabytavzjvozsxy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538515.localdomain sudo[62934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:10 np0005538515.localdomain python3[62936]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:10 np0005538515.localdomain sudo[62934]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:10 np0005538515.localdomain sudo[62950]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzsmepyidpjwvkfuqvjjkzcwymffqkyj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:10 np0005538515.localdomain sudo[62950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:11 np0005538515.localdomain python3[62952]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:11 np0005538515.localdomain sudo[62950]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:11 np0005538515.localdomain sudo[62966]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djarmktdvquoixbrbedalrgyzdpqtjkc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:11 np0005538515.localdomain sudo[62966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:11 np0005538515.localdomain python3[62968]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:11 np0005538515.localdomain sudo[62966]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:11 np0005538515.localdomain sudo[62982]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geuryqehrvfxlfwqnqadwlazqjifbfsm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:11 np0005538515.localdomain sudo[62982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:11 np0005538515.localdomain python3[62984]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:11 np0005538515.localdomain sudo[62982]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:11 np0005538515.localdomain sudo[62998]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmklfuzlsrddbtuhecdjagvojrvecwpx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:11 np0005538515.localdomain sudo[62998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:11 np0005538515.localdomain python3[63000]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:11 np0005538515.localdomain sudo[62998]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:11 np0005538515.localdomain sudo[63014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzhsedtfcdjcnikzldrtrfsetwgrjmgx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:11 np0005538515.localdomain sudo[63014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:12 np0005538515.localdomain python3[63016]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:12 np0005538515.localdomain sudo[63014]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:12 np0005538515.localdomain sudo[63030]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bagwbxedkpbsqfubzvvglsgmqcvivicb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:12 np0005538515.localdomain sudo[63030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:12 np0005538515.localdomain python3[63032]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:12 np0005538515.localdomain sudo[63030]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:12 np0005538515.localdomain sudo[63046]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikwxshzixodcmetocasxjcjylyqttkst ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:12 np0005538515.localdomain sudo[63046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:12 np0005538515.localdomain python3[63048]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:12 np0005538515.localdomain sudo[63046]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:12 np0005538515.localdomain sudo[63062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yamnbczebsmuctqqcdpgvqwmhmbvfgvh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:12 np0005538515.localdomain sudo[63062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:12 np0005538515.localdomain python3[63064]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:12 np0005538515.localdomain sudo[63062]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:12 np0005538515.localdomain sudo[63078]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujnnijtxoybqrekfabgzvuobkrakfglh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:12 np0005538515.localdomain sudo[63078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:13 np0005538515.localdomain python3[63080]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:14:13 np0005538515.localdomain sudo[63078]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:13 np0005538515.localdomain sudo[63139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgtkujsfbqmgxebgchqtijtoejfpyuoe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:13 np0005538515.localdomain sudo[63139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:13 np0005538515.localdomain python3[63141]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:13 np0005538515.localdomain sudo[63139]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:14 np0005538515.localdomain sudo[63168]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byvfsyxhacaoweasyexydczrqcncuovu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:14 np0005538515.localdomain sudo[63168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:14 np0005538515.localdomain python3[63170]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:14 np0005538515.localdomain sudo[63168]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:14 np0005538515.localdomain sudo[63197]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxfoighxltefsxwmbumqwcospxhkeovb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:14 np0005538515.localdomain sudo[63197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:14 np0005538515.localdomain python3[63199]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:14 np0005538515.localdomain sudo[63197]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:15 np0005538515.localdomain sudo[63226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmffiopzbkxokptgupjehjuuaihnlhyi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:15 np0005538515.localdomain sudo[63226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:15 np0005538515.localdomain python3[63228]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:15 np0005538515.localdomain sudo[63226]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:15 np0005538515.localdomain sudo[63255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlfflvgvujntmgqgwzhukevgzfepcnnp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:15 np0005538515.localdomain sudo[63255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:15 np0005538515.localdomain python3[63257]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:15 np0005538515.localdomain sudo[63255]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:16 np0005538515.localdomain sudo[63284]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmaawipsyepjdddeisynwszpjdbaiunv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:16 np0005538515.localdomain sudo[63284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:16 np0005538515.localdomain python3[63286]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:16 np0005538515.localdomain sudo[63284]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:16 np0005538515.localdomain sudo[63313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtxrjxyduqvxjuejycixyptlrahhuihr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:16 np0005538515.localdomain sudo[63313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:16 np0005538515.localdomain python3[63315]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:16 np0005538515.localdomain sudo[63313]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:17 np0005538515.localdomain sudo[63342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjlymmowdlqxtyzbdmyrmgwlxezrqfrj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:17 np0005538515.localdomain sudo[63342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:17 np0005538515.localdomain python3[63344]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:17 np0005538515.localdomain sudo[63342]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:17 np0005538515.localdomain sudo[63371]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akbyutqwwyoomzwjwakwnrwzgdegysuc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:17 np0005538515.localdomain sudo[63371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:17 np0005538515.localdomain python3[63373]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317653.2087498-101042-145141859775249/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:17 np0005538515.localdomain sudo[63371]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:18 np0005538515.localdomain sudo[63387]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lixvdvkzxlgshlnpfoqyxrayhteyhyqx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:18 np0005538515.localdomain sudo[63387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:18 np0005538515.localdomain python3[63389]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:18 np0005538515.localdomain systemd-rc-local-generator[63410]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:18 np0005538515.localdomain systemd-sysv-generator[63414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Activating special unit Exit the Session...
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped target Main User Target.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped target Basic System.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped target Paths.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped target Sockets.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped target Timers.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Closed D-Bus User Message Bus Socket.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Removed slice User Application Slice.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Reached target Shutdown.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Finished Exit the Session.
Nov 28 08:14:18 np0005538515.localdomain systemd[61870]: Reached target Exit the Session.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 08:14:18 np0005538515.localdomain sudo[63387]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 08:14:18 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 08:14:19 np0005538515.localdomain sudo[63440]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjmrralplbqsvjkgkheggnzkwcpcarbh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:19 np0005538515.localdomain sudo[63440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:19 np0005538515.localdomain python3[63442]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:19 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:19 np0005538515.localdomain systemd-rc-local-generator[63462]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:19 np0005538515.localdomain systemd-sysv-generator[63468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:19 np0005538515.localdomain systemd[1]: Starting collectd container...
Nov 28 08:14:19 np0005538515.localdomain systemd[1]: Started collectd container.
Nov 28 08:14:19 np0005538515.localdomain sudo[63440]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:20 np0005538515.localdomain sudo[63509]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfqfpupvzxjyyobnrmesbmmwtmzdyzsk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:20 np0005538515.localdomain sudo[63509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:20 np0005538515.localdomain python3[63511]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:20 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:20 np0005538515.localdomain systemd-sysv-generator[63540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:20 np0005538515.localdomain systemd-rc-local-generator[63537]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:20 np0005538515.localdomain systemd[1]: Starting iscsid container...
Nov 28 08:14:20 np0005538515.localdomain systemd[1]: Started iscsid container.
Nov 28 08:14:20 np0005538515.localdomain sudo[63509]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:21 np0005538515.localdomain sudo[63576]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqkbivgjiyjdzmjxxfrnggkxhsrwezqo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:21 np0005538515.localdomain sudo[63576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:21 np0005538515.localdomain python3[63578]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:21 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:21 np0005538515.localdomain systemd-sysv-generator[63611]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:21 np0005538515.localdomain systemd-rc-local-generator[63607]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:21 np0005538515.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Nov 28 08:14:22 np0005538515.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Nov 28 08:14:22 np0005538515.localdomain sudo[63576]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:22 np0005538515.localdomain sudo[63644]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naoqnhrtimaslwavuhnfhkfwlxywgdib ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:22 np0005538515.localdomain sudo[63644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:22 np0005538515.localdomain python3[63646]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:22 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:22 np0005538515.localdomain systemd-rc-local-generator[63671]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:22 np0005538515.localdomain systemd-sysv-generator[63677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:23 np0005538515.localdomain systemd[1]: Starting nova_virtnodedevd container...
Nov 28 08:14:23 np0005538515.localdomain tripleo-start-podman-container[63686]: Creating additional drop-in dependency for "nova_virtnodedevd" (490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436)
Nov 28 08:14:23 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:23 np0005538515.localdomain systemd-sysv-generator[63746]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:23 np0005538515.localdomain systemd-rc-local-generator[63740]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:23 np0005538515.localdomain systemd[1]: Started nova_virtnodedevd container.
Nov 28 08:14:23 np0005538515.localdomain sudo[63644]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:23 np0005538515.localdomain sudo[63768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itzjhomiijzlfxldkimyzvtvrlmhenoy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:23 np0005538515.localdomain sudo[63768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:24 np0005538515.localdomain python3[63770]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:25 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:25 np0005538515.localdomain systemd-rc-local-generator[63800]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:25 np0005538515.localdomain systemd-sysv-generator[63804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:25 np0005538515.localdomain systemd[1]: Starting nova_virtproxyd container...
Nov 28 08:14:25 np0005538515.localdomain tripleo-start-podman-container[63811]: Creating additional drop-in dependency for "nova_virtproxyd" (7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657)
Nov 28 08:14:25 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:25 np0005538515.localdomain systemd-sysv-generator[63873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:25 np0005538515.localdomain systemd-rc-local-generator[63867]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:26 np0005538515.localdomain systemd[1]: Started nova_virtproxyd container.
Nov 28 08:14:26 np0005538515.localdomain sudo[63768]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:26 np0005538515.localdomain sudo[63892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsdhxhktkszneiseiibdeujsthcrcwjw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:26 np0005538515.localdomain sudo[63892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:26 np0005538515.localdomain python3[63894]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:26 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:26 np0005538515.localdomain systemd-sysv-generator[63923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:26 np0005538515.localdomain systemd-rc-local-generator[63920]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:27 np0005538515.localdomain systemd[1]: Starting nova_virtqemud container...
Nov 28 08:14:27 np0005538515.localdomain tripleo-start-podman-container[63934]: Creating additional drop-in dependency for "nova_virtqemud" (929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432)
Nov 28 08:14:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:27 np0005538515.localdomain systemd-sysv-generator[63995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:27 np0005538515.localdomain systemd-rc-local-generator[63992]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:27 np0005538515.localdomain systemd[1]: Started nova_virtqemud container.
Nov 28 08:14:27 np0005538515.localdomain sudo[63892]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:28 np0005538515.localdomain sudo[64017]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ashzfyalsbbnlyhyvzfnnlyrpelvjhrr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:28 np0005538515.localdomain sudo[64017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:28 np0005538515.localdomain python3[64019]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:28 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:28 np0005538515.localdomain systemd-rc-local-generator[64044]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:28 np0005538515.localdomain systemd-sysv-generator[64049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:28 np0005538515.localdomain systemd[1]: Starting nova_virtsecretd container...
Nov 28 08:14:28 np0005538515.localdomain tripleo-start-podman-container[64059]: Creating additional drop-in dependency for "nova_virtsecretd" (c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808)
Nov 28 08:14:28 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:28 np0005538515.localdomain systemd-rc-local-generator[64113]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:28 np0005538515.localdomain systemd-sysv-generator[64118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:29 np0005538515.localdomain systemd[1]: Started nova_virtsecretd container.
Nov 28 08:14:29 np0005538515.localdomain sudo[64017]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:29 np0005538515.localdomain sudo[64138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhytggzvjfpzkynxkotszylzxjcxkkae ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:29 np0005538515.localdomain sudo[64138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:29 np0005538515.localdomain python3[64140]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:29 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:29 np0005538515.localdomain systemd-sysv-generator[64168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:29 np0005538515.localdomain systemd-rc-local-generator[64164]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:29 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:30 np0005538515.localdomain systemd[1]: Starting nova_virtstoraged container...
Nov 28 08:14:30 np0005538515.localdomain tripleo-start-podman-container[64180]: Creating additional drop-in dependency for "nova_virtstoraged" (77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0)
Nov 28 08:14:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:30 np0005538515.localdomain systemd-rc-local-generator[64237]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:30 np0005538515.localdomain systemd-sysv-generator[64242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:30 np0005538515.localdomain systemd[1]: Started nova_virtstoraged container.
Nov 28 08:14:30 np0005538515.localdomain sudo[64138]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:30 np0005538515.localdomain sudo[64261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfgzmhyugxtihpnxgtydpzceludvbwas ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:14:30 np0005538515.localdomain sudo[64261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:31 np0005538515.localdomain python3[64263]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:14:31 np0005538515.localdomain systemd-sysv-generator[64292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:14:31 np0005538515.localdomain systemd-rc-local-generator[64289]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:31 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:31 np0005538515.localdomain podman[64303]: 2025-11-28 08:14:31.746609169 +0000 UTC m=+0.139035864 container init 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, io.openshift.expose-services=, release=1761123044, vcs-type=git, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:14:31 np0005538515.localdomain podman[64303]: 2025-11-28 08:14:31.757208915 +0000 UTC m=+0.149635600 container start 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:14:31 np0005538515.localdomain podman[64303]: rsyslog
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:31 np0005538515.localdomain sudo[64321]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:31 np0005538515.localdomain sudo[64321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:31 np0005538515.localdomain sudo[64261]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:31 np0005538515.localdomain sudo[64321]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: libpod-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:31 np0005538515.localdomain podman[64328]: 2025-11-28 08:14:31.871143224 +0000 UTC m=+0.032685666 container died 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_id=tripleo_step3, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1)
Nov 28 08:14:31 np0005538515.localdomain podman[64328]: 2025-11-28 08:14:31.891240101 +0000 UTC m=+0.052782543 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, distribution-scope=public, container_name=rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git)
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:31 np0005538515.localdomain podman[64351]: 2025-11-28 08:14:31.956610451 +0000 UTC m=+0.038987665 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z)
Nov 28 08:14:31 np0005538515.localdomain podman[64351]: rsyslog
Nov 28 08:14:31 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:32 np0005538515.localdomain sudo[64377]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqfgnlkiscnemznyssnjhfpmstlxwdqv ; /usr/bin/python3
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:32 np0005538515.localdomain sudo[64377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:32 np0005538515.localdomain podman[64379]: 2025-11-28 08:14:32.211797825 +0000 UTC m=+0.120061034 container init 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-rsyslog, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, 
container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12)
Nov 28 08:14:32 np0005538515.localdomain podman[64379]: 2025-11-28 08:14:32.21889716 +0000 UTC m=+0.127160359 container start 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container)
Nov 28 08:14:32 np0005538515.localdomain podman[64379]: rsyslog
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:32 np0005538515.localdomain sudo[64399]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:32 np0005538515.localdomain sudo[64399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:32 np0005538515.localdomain python3[64380]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:32 np0005538515.localdomain sudo[64377]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:32 np0005538515.localdomain sudo[64399]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: libpod-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:32 np0005538515.localdomain podman[64402]: 2025-11-28 08:14:32.386915201 +0000 UTC m=+0.053021200 container died 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, container_name=rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat 
OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.component=openstack-rsyslog-container)
Nov 28 08:14:32 np0005538515.localdomain podman[64402]: 2025-11-28 08:14:32.424492131 +0000 UTC m=+0.090598090 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 
17.1 rsyslog, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, container_name=rsyslog, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:32 np0005538515.localdomain podman[64415]: 2025-11-28 08:14:32.511061584 +0000 UTC m=+0.054192458 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:49Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Nov 28 08:14:32 np0005538515.localdomain podman[64415]: rsyslog
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38-merged.mount: Deactivated successfully.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:14:32 np0005538515.localdomain sudo[64485]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcfzndkrolquqktxsflposgknylzzkgt ; /usr/bin/python3
Nov 28 08:14:32 np0005538515.localdomain sudo[64485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:32 np0005538515.localdomain podman[64484]: 2025-11-28 08:14:32.768624802 +0000 UTC m=+0.073907362 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:32 np0005538515.localdomain podman[64460]: 2025-11-28 08:14:32.790455404 +0000 UTC m=+0.150852630 container init 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:32 np0005538515.localdomain podman[64460]: 2025-11-28 08:14:32.799980245 +0000 UTC m=+0.160377411 container start 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, 
maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:14:32 np0005538515.localdomain podman[64460]: rsyslog
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:32 np0005538515.localdomain sudo[64522]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:32 np0005538515.localdomain sudo[64522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:32 np0005538515.localdomain sudo[64485]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:32 np0005538515.localdomain sudo[64522]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: libpod-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:32 np0005538515.localdomain podman[64525]: 2025-11-28 08:14:32.937990177 +0000 UTC m=+0.037036714 container died 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Nov 28 08:14:32 np0005538515.localdomain podman[64525]: 2025-11-28 08:14:32.959709074 +0000 UTC m=+0.058755601 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, url=https://www.redhat.com, container_name=rsyslog, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:32 np0005538515.localdomain podman[64484]: 2025-11-28 08:14:32.979908225 +0000 UTC m=+0.285190835 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd)
Nov 28 08:14:32 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:14:33 np0005538515.localdomain podman[64552]: 2025-11-28 08:14:33.040643318 +0000 UTC m=+0.053478725 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, container_name=rsyslog, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3)
Nov 28 08:14:33 np0005538515.localdomain podman[64552]: rsyslog
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:33 np0005538515.localdomain sudo[64590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dexrztxwhbgussqsdcipwhjftqdmauwa ; /usr/bin/python3
Nov 28 08:14:33 np0005538515.localdomain sudo[64590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:33 np0005538515.localdomain sudo[64590]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:33 np0005538515.localdomain sudo[64634]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojvfkdvxflijyvugqjrqjezdihxfqpca ; /usr/bin/python3
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:33 np0005538515.localdomain sudo[64634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:33 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:33 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:33 np0005538515.localdomain podman[64607]: 2025-11-28 08:14:33.514920271 +0000 UTC m=+0.122536902 container init 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, release=1761123044, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:33 np0005538515.localdomain podman[64607]: 2025-11-28 08:14:33.523602346 +0000 UTC m=+0.131218977 container start 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, 
io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:14:33 np0005538515.localdomain podman[64607]: rsyslog
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:33 np0005538515.localdomain sudo[64642]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:33 np0005538515.localdomain sudo[64642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:33 np0005538515.localdomain sudo[64642]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: libpod-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:33 np0005538515.localdomain python3[64639]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005538515 step=3 update_config_hash_only=False
Nov 28 08:14:33 np0005538515.localdomain podman[64646]: 2025-11-28 08:14:33.686370822 +0000 UTC m=+0.052429792 container died 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, container_name=rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:14:33 np0005538515.localdomain sudo[64634]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38-merged.mount: Deactivated successfully.
Nov 28 08:14:33 np0005538515.localdomain podman[64646]: 2025-11-28 08:14:33.710574269 +0000 UTC m=+0.076633209 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:33 np0005538515.localdomain podman[64660]: 2025-11-28 08:14:33.766215691 +0000 UTC m=+0.036611141 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:33 np0005538515.localdomain podman[64660]: rsyslog
Nov 28 08:14:33 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Starting rsyslog container...
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:14:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 28 08:14:34 np0005538515.localdomain podman[64671]: 2025-11-28 08:14:34.260019152 +0000 UTC m=+0.119239868 container init 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.12, container_name=rsyslog, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044)
Nov 28 08:14:34 np0005538515.localdomain podman[64671]: 2025-11-28 08:14:34.268448839 +0000 UTC m=+0.127669565 container start 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, container_name=rsyslog, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog)
Nov 28 08:14:34 np0005538515.localdomain podman[64671]: rsyslog
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Started rsyslog container.
Nov 28 08:14:34 np0005538515.localdomain sudo[64691]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:14:34 np0005538515.localdomain sudo[64691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:14:34 np0005538515.localdomain sudo[64691]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: libpod-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec.scope: Deactivated successfully.
Nov 28 08:14:34 np0005538515.localdomain podman[64694]: 2025-11-28 08:14:34.430005776 +0000 UTC m=+0.050065697 container died 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 28 08:14:34 np0005538515.localdomain podman[64694]: 2025-11-28 08:14:34.455121522 +0000 UTC m=+0.075181423 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, container_name=rsyslog, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z)
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:14:34 np0005538515.localdomain sudo[64719]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzjjnyqstjmxpttgmucrxehsgiaptywe ; /usr/bin/python3
Nov 28 08:14:34 np0005538515.localdomain sudo[64719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:34 np0005538515.localdomain podman[64721]: 2025-11-28 08:14:34.538776522 +0000 UTC m=+0.053794235 container cleanup 45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f62921da3a3d0eed1be38a46b3ed6ac3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 28 08:14:34 np0005538515.localdomain podman[64721]: rsyslog
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:34 np0005538515.localdomain python3[64722]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:14:34 np0005538515.localdomain sudo[64719]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45cbc87a49d83485e7046909600fab5f788908e8e4b6bf89504ff818c1bf66ec-userdata-shm.mount: Deactivated successfully.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Stopped rsyslog container.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Failed to start rsyslog container.
Nov 28 08:14:34 np0005538515.localdomain sudo[64745]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgdqlqxhvtuybfvobfsddhtxikrbwpms ; /usr/bin/python3
Nov 28 08:14:34 np0005538515.localdomain sudo[64745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:14:34 np0005538515.localdomain python3[64747]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:14:34 np0005538515.localdomain sudo[64745]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:34 np0005538515.localdomain podman[64748]: 2025-11-28 08:14:34.972704706 +0000 UTC m=+0.078043193 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team)
Nov 28 08:14:34 np0005538515.localdomain podman[64748]: 2025-11-28 08:14:34.983394765 +0000 UTC m=+0.088733232 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:14:34 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:14:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:14:36 np0005538515.localdomain podman[64768]: 2025-11-28 08:14:36.964871689 +0000 UTC m=+0.077956860 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, io.openshift.expose-services=, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1)
Nov 28 08:14:36 np0005538515.localdomain podman[64768]: 2025-11-28 08:14:36.979649167 +0000 UTC m=+0.092734318 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Nov 28 08:14:36 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:14:48 np0005538515.localdomain sudo[64787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:14:48 np0005538515.localdomain sudo[64787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:14:48 np0005538515.localdomain sudo[64787]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:48 np0005538515.localdomain sudo[64802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:14:48 np0005538515.localdomain sudo[64802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:14:49 np0005538515.localdomain sudo[64802]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:49 np0005538515.localdomain sudo[64849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:14:49 np0005538515.localdomain sudo[64849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:14:49 np0005538515.localdomain sudo[64849]: pam_unix(sudo:session): session closed for user root
Nov 28 08:14:57 np0005538515.localdomain sshd[64864]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:14:58 np0005538515.localdomain sshd[64864]: Invalid user node from 80.94.92.186 port 40178
Nov 28 08:14:58 np0005538515.localdomain sshd[64864]: Connection closed by invalid user node 80.94.92.186 port 40178 [preauth]
Nov 28 08:15:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:15:03 np0005538515.localdomain systemd[1]: tmp-crun.HYw7Xj.mount: Deactivated successfully.
Nov 28 08:15:03 np0005538515.localdomain podman[64866]: 2025-11-28 08:15:03.986011595 +0000 UTC m=+0.096061760 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:15:04 np0005538515.localdomain podman[64866]: 2025-11-28 08:15:04.191807063 +0000 UTC m=+0.301857208 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12)
Nov 28 08:15:04 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:15:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:15:05 np0005538515.localdomain podman[64895]: 2025-11-28 08:15:05.959371999 +0000 UTC m=+0.073341079 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3)
Nov 28 08:15:05 np0005538515.localdomain podman[64895]: 2025-11-28 08:15:05.971444102 +0000 UTC m=+0.085413182 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=)
Nov 28 08:15:05 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:15:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:15:07 np0005538515.localdomain podman[64915]: 2025-11-28 08:15:07.973363226 +0000 UTC m=+0.081061058 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z)
Nov 28 08:15:07 np0005538515.localdomain podman[64915]: 2025-11-28 08:15:07.982474766 +0000 UTC m=+0.090172588 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true)
Nov 28 08:15:07 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:15:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:15:34 np0005538515.localdomain podman[64935]: 2025-11-28 08:15:34.970577134 +0000 UTC m=+0.082154841 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64)
Nov 28 08:15:35 np0005538515.localdomain podman[64935]: 2025-11-28 08:15:35.19257221 +0000 UTC m=+0.304149907 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Nov 28 08:15:35 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:15:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:15:36 np0005538515.localdomain podman[64964]: 2025-11-28 08:15:36.974875972 +0000 UTC m=+0.086156674 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 28 08:15:37 np0005538515.localdomain podman[64964]: 2025-11-28 08:15:37.007892378 +0000 UTC m=+0.119173030 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git)
Nov 28 08:15:37 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:15:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:15:38 np0005538515.localdomain podman[64984]: 2025-11-28 08:15:38.967845881 +0000 UTC m=+0.077547870 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
maintainer=OpenStack TripleO Team, container_name=iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true)
Nov 28 08:15:39 np0005538515.localdomain podman[64984]: 2025-11-28 08:15:39.002457336 +0000 UTC m=+0.112159275 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Nov 28 08:15:39 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:15:50 np0005538515.localdomain sudo[65003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:15:50 np0005538515.localdomain sudo[65003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:15:50 np0005538515.localdomain sudo[65003]: pam_unix(sudo:session): session closed for user root
Nov 28 08:15:50 np0005538515.localdomain sudo[65018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:15:50 np0005538515.localdomain sudo[65018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:15:50 np0005538515.localdomain sudo[65018]: pam_unix(sudo:session): session closed for user root
Nov 28 08:15:51 np0005538515.localdomain sudo[65065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:15:51 np0005538515.localdomain sudo[65065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:15:51 np0005538515.localdomain sudo[65065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:16:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:16:05 np0005538515.localdomain podman[65080]: 2025-11-28 08:16:05.98314379 +0000 UTC m=+0.086043991 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Nov 28 08:16:06 np0005538515.localdomain podman[65080]: 2025-11-28 08:16:06.187520365 +0000 UTC m=+0.290420576 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:16:06 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:16:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:16:07 np0005538515.localdomain podman[65109]: 2025-11-28 08:16:07.978014408 +0000 UTC m=+0.081712118 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Nov 28 08:16:08 np0005538515.localdomain podman[65109]: 2025-11-28 08:16:08.019543947 +0000 UTC m=+0.123241697 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:16:08 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:16:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:16:09 np0005538515.localdomain podman[65128]: 2025-11-28 08:16:09.964821357 +0000 UTC m=+0.076388404 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, name=rhosp17/openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible)
Nov 28 08:16:10 np0005538515.localdomain podman[65128]: 2025-11-28 08:16:10.003453127 +0000 UTC m=+0.115020144 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 
17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step3)
Nov 28 08:16:10 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:16:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:16:36 np0005538515.localdomain podman[65149]: 2025-11-28 08:16:36.958317343 +0000 UTC m=+0.064937251 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true)
Nov 28 08:16:37 np0005538515.localdomain podman[65149]: 2025-11-28 08:16:37.148630665 +0000 UTC m=+0.255250553 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:16:37 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:16:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:16:38 np0005538515.localdomain systemd[1]: tmp-crun.j8kpdO.mount: Deactivated successfully.
Nov 28 08:16:38 np0005538515.localdomain podman[65178]: 2025-11-28 08:16:38.989821119 +0000 UTC m=+0.100033742 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12)
Nov 28 08:16:39 np0005538515.localdomain podman[65178]: 2025-11-28 08:16:39.027499779 +0000 UTC m=+0.137712362 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:16:39 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:16:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:16:40 np0005538515.localdomain systemd[1]: tmp-crun.BLZT6R.mount: Deactivated successfully.
Nov 28 08:16:40 np0005538515.localdomain podman[65198]: 2025-11-28 08:16:40.977855365 +0000 UTC m=+0.085245286 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64)
Nov 28 08:16:41 np0005538515.localdomain podman[65198]: 2025-11-28 08:16:41.015482995 +0000 UTC m=+0.122872976 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, 
container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 28 08:16:41 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:16:51 np0005538515.localdomain sudo[65219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:16:51 np0005538515.localdomain sudo[65219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:16:51 np0005538515.localdomain sudo[65219]: pam_unix(sudo:session): session closed for user root
Nov 28 08:16:51 np0005538515.localdomain sudo[65234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:16:51 np0005538515.localdomain sudo[65234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:16:52 np0005538515.localdomain sudo[65234]: pam_unix(sudo:session): session closed for user root
Nov 28 08:16:55 np0005538515.localdomain sudo[65280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:16:55 np0005538515.localdomain sudo[65280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:16:55 np0005538515.localdomain sudo[65280]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:17:07 np0005538515.localdomain systemd[1]: tmp-crun.VpkTnJ.mount: Deactivated successfully.
Nov 28 08:17:07 np0005538515.localdomain podman[65295]: 2025-11-28 08:17:07.986381786 +0000 UTC m=+0.091478838 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, name=rhosp17/openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 28 08:17:08 np0005538515.localdomain podman[65295]: 2025-11-28 08:17:08.208507047 +0000 UTC m=+0.313604069 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:17:08 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:17:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:17:09 np0005538515.localdomain podman[65324]: 2025-11-28 08:17:09.975982081 +0000 UTC m=+0.085183645 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, architecture=x86_64, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:17:09 np0005538515.localdomain podman[65324]: 2025-11-28 08:17:09.991494398 +0000 UTC m=+0.100695992 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:17:10 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:17:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:17:11 np0005538515.localdomain systemd[1]: tmp-crun.qeXvAz.mount: Deactivated successfully.
Nov 28 08:17:11 np0005538515.localdomain podman[65344]: 2025-11-28 08:17:11.994429144 +0000 UTC m=+0.091274302 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044)
Nov 28 08:17:12 np0005538515.localdomain podman[65344]: 2025-11-28 08:17:12.007349152 +0000 UTC m=+0.104194320 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, 
com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 28 08:17:12 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:17:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:17:38 np0005538515.localdomain podman[65363]: 2025-11-28 08:17:38.980778042 +0000 UTC m=+0.085233105 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:17:39 np0005538515.localdomain podman[65363]: 2025-11-28 08:17:39.208715233 +0000 UTC m=+0.313170266 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:17:39 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:17:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:17:40 np0005538515.localdomain podman[65392]: 2025-11-28 08:17:40.979950712 +0000 UTC m=+0.086031710 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, release=1761123044, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3)
Nov 28 08:17:41 np0005538515.localdomain podman[65392]: 2025-11-28 08:17:41.01849405 +0000 UTC m=+0.124575038 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 28 08:17:41 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:17:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:17:42 np0005538515.localdomain systemd[1]: tmp-crun.7xvp1M.mount: Deactivated successfully.
Nov 28 08:17:42 np0005538515.localdomain podman[65414]: 2025-11-28 08:17:42.978841464 +0000 UTC m=+0.087864648 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044)
Nov 28 08:17:43 np0005538515.localdomain podman[65414]: 2025-11-28 08:17:43.017620037 +0000 UTC m=+0.126643211 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Nov 28 08:17:43 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:17:56 np0005538515.localdomain sudo[65432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:17:56 np0005538515.localdomain sudo[65432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:56 np0005538515.localdomain sudo[65432]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:56 np0005538515.localdomain sudo[65447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:17:56 np0005538515.localdomain sudo[65447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:56 np0005538515.localdomain sudo[65447]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:56 np0005538515.localdomain sudo[65494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:17:56 np0005538515.localdomain sudo[65494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:57 np0005538515.localdomain sudo[65494]: pam_unix(sudo:session): session closed for user root
Nov 28 08:17:57 np0005538515.localdomain sudo[65509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 08:17:57 np0005538515.localdomain sudo[65509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:17:57 np0005538515.localdomain sudo[65509]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:02 np0005538515.localdomain sudo[65542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:18:02 np0005538515.localdomain sudo[65542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:18:02 np0005538515.localdomain sudo[65542]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:18:09 np0005538515.localdomain podman[65558]: 2025-11-28 08:18:09.973628451 +0000 UTC m=+0.081899373 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:18:10 np0005538515.localdomain podman[65558]: 2025-11-28 08:18:10.18954236 +0000 UTC m=+0.297813232 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, 
com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:18:10 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:18:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:18:11 np0005538515.localdomain podman[65588]: 2025-11-28 08:18:11.9854435 +0000 UTC m=+0.089480657 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 28 08:18:12 np0005538515.localdomain podman[65588]: 2025-11-28 08:18:12.025454001 +0000 UTC m=+0.129491148 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, 
distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd)
Nov 28 08:18:12 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:18:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:18:13 np0005538515.localdomain podman[65608]: 2025-11-28 08:18:13.970062701 +0000 UTC m=+0.080542051 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z)
Nov 28 08:18:13 np0005538515.localdomain podman[65608]: 2025-11-28 08:18:13.98527649 +0000 UTC m=+0.095755860 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:18:13 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:18:28 np0005538515.localdomain sudo[65672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxrtpmqeghkvjcyrwsqzjyxniafihktb ; /usr/bin/python3
Nov 28 08:18:28 np0005538515.localdomain sudo[65672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:28 np0005538515.localdomain python3[65674]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:28 np0005538515.localdomain sudo[65672]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:28 np0005538515.localdomain sudo[65717]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdgfeuqtdbokvzzacuhcrkxztwxsocxf ; /usr/bin/python3
Nov 28 08:18:28 np0005538515.localdomain sudo[65717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:28 np0005538515.localdomain python3[65719]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317908.1914458-108171-18257053407017/source _original_basename=tmpbdnwfiej follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:28 np0005538515.localdomain sudo[65717]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:29 np0005538515.localdomain sudo[65779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqrbcfmsvzssialydvnfrgvdarhnfhwv ; /usr/bin/python3
Nov 28 08:18:29 np0005538515.localdomain sudo[65779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:30 np0005538515.localdomain python3[65781]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:30 np0005538515.localdomain sudo[65779]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:30 np0005538515.localdomain sudo[65822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsqhvyuroavxnpskehmfevuxdbwhqrdn ; /usr/bin/python3
Nov 28 08:18:30 np0005538515.localdomain sudo[65822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:30 np0005538515.localdomain python3[65824]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317909.825541-108369-96830662867845/source _original_basename=tmpuzj_x7mm follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:30 np0005538515.localdomain sudo[65822]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:30 np0005538515.localdomain sudo[65884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgvsrwbermuxwoqwwnsdsguezsgnhqfe ; /usr/bin/python3
Nov 28 08:18:30 np0005538515.localdomain sudo[65884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:31 np0005538515.localdomain python3[65886]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:31 np0005538515.localdomain sudo[65884]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:31 np0005538515.localdomain sudo[65927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nruwsaxfbyznuswkdkqoaofhyecbumxa ; /usr/bin/python3
Nov 28 08:18:31 np0005538515.localdomain sudo[65927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:31 np0005538515.localdomain python3[65929]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317910.7844563-108426-40749297093871/source _original_basename=tmpeypq69hq follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:31 np0005538515.localdomain sudo[65927]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:31 np0005538515.localdomain sudo[65989]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdpwebcgwhwpoybkksduwagrjdolpnpv ; /usr/bin/python3
Nov 28 08:18:31 np0005538515.localdomain sudo[65989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:32 np0005538515.localdomain python3[65991]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:32 np0005538515.localdomain sudo[65989]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:32 np0005538515.localdomain sudo[66032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezlxizdlemdexvarcixwomzqaobgsclb ; /usr/bin/python3
Nov 28 08:18:32 np0005538515.localdomain sudo[66032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:32 np0005538515.localdomain python3[66034]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317911.738824-108485-115760241104859/source _original_basename=tmplnaudq30 follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:32 np0005538515.localdomain sudo[66032]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:32 np0005538515.localdomain sudo[66062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvuckxhzicntpshgtulehkszgsovayvr ; /usr/bin/python3
Nov 28 08:18:32 np0005538515.localdomain sudo[66062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:32 np0005538515.localdomain python3[66064]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 08:18:32 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:33 np0005538515.localdomain systemd-sysv-generator[66088]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:33 np0005538515.localdomain systemd-rc-local-generator[66081]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:33 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:33 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:33 np0005538515.localdomain systemd-rc-local-generator[66125]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:33 np0005538515.localdomain systemd-sysv-generator[66128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:33 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:33 np0005538515.localdomain sudo[66062]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:33 np0005538515.localdomain sudo[66150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbjajsvfofnaaltovaatbawlvmgerezd ; /usr/bin/python3
Nov 28 08:18:33 np0005538515.localdomain sudo[66150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:18:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4386 writes, 20K keys, 4386 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4386 writes, 493 syncs, 8.90 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 339 writes, 797 keys, 339 commit groups, 1.0 writes per commit group, ingest: 0.60 MB, 0.00 MB/s
                                                          Interval WAL: 339 writes, 168 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:18:34 np0005538515.localdomain python3[66152]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:18:34 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:34 np0005538515.localdomain systemd-sysv-generator[66177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:34 np0005538515.localdomain systemd-rc-local-generator[66173]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:34 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:34 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:34 np0005538515.localdomain systemd-sysv-generator[66220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:34 np0005538515.localdomain systemd-rc-local-generator[66215]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:34 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:34 np0005538515.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Nov 28 08:18:34 np0005538515.localdomain sudo[66150]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:34 np0005538515.localdomain sudo[66241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdjfezqfhimrkdhzsngpjxrolanobzzj ; /usr/bin/python3
Nov 28 08:18:34 np0005538515.localdomain sudo[66241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:35 np0005538515.localdomain python3[66243]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 08:18:35 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:35 np0005538515.localdomain systemd-rc-local-generator[66268]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:35 np0005538515.localdomain systemd-sysv-generator[66273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:35 np0005538515.localdomain sudo[66241]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:35 np0005538515.localdomain sudo[66325]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmewwbabzjfksslmmcboedzkfscpgoak ; /usr/bin/python3
Nov 28 08:18:35 np0005538515.localdomain sudo[66325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:35 np0005538515.localdomain python3[66327]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:18:35 np0005538515.localdomain sudo[66325]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:36 np0005538515.localdomain sudo[66368]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cquuonojtlkgthcbfwcolmavrnigtknc ; /usr/bin/python3
Nov 28 08:18:36 np0005538515.localdomain sudo[66368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:36 np0005538515.localdomain python3[66370]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317915.5509162-108604-164968456273423/source _original_basename=tmpwh3cy1n5 follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:18:36 np0005538515.localdomain sudo[66368]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:36 np0005538515.localdomain sudo[66398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jupemetggnaybmxrachdjtzmyqfnrrfj ; /usr/bin/python3
Nov 28 08:18:36 np0005538515.localdomain sudo[66398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:36 np0005538515.localdomain python3[66400]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:18:36 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:36 np0005538515.localdomain systemd-rc-local-generator[66425]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:36 np0005538515.localdomain systemd-sysv-generator[66432]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:37 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:37 np0005538515.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Nov 28 08:18:37 np0005538515.localdomain sudo[66398]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:37 np0005538515.localdomain sudo[66454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erohuzprnvlgglhzaqjsecbdeahmmhtu ; /usr/bin/python3
Nov 28 08:18:37 np0005538515.localdomain sudo[66454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:37 np0005538515.localdomain python3[66456]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:18:37 np0005538515.localdomain sudo[66454]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:38 np0005538515.localdomain sudo[66504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsxkxlvzumqofjlneirbqsmvfefmjgtu ; /usr/bin/python3
Nov 28 08:18:38 np0005538515.localdomain sudo[66504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:38 np0005538515.localdomain sudo[66504]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:38 np0005538515.localdomain sudo[66522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzeckrkhtnkwwyrjpwumkxgcilzekzrz ; /usr/bin/python3
Nov 28 08:18:38 np0005538515.localdomain sudo[66522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:38 np0005538515.localdomain sudo[66522]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:18:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.2 total, 600.0 interval
                                                          Cumulative writes: 5246 writes, 23K keys, 5246 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5246 writes, 540 syncs, 9.71 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 316 writes, 658 keys, 316 commit groups, 1.0 writes per commit group, ingest: 0.49 MB, 0.00 MB/s
                                                          Interval WAL: 316 writes, 158 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:18:39 np0005538515.localdomain sudo[66626]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gukxzlsthweezpnhzfgkvxtvihiwtjdc ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317918.7687347-108730-106179205292456/async_wrapper.py 399744111602 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317918.7687347-108730-106179205292456/AnsiballZ_command.py _
Nov 28 08:18:39 np0005538515.localdomain sudo[66626]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:18:39 np0005538515.localdomain ansible-async_wrapper.py[66628]: Invoked with 399744111602 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317918.7687347-108730-106179205292456/AnsiballZ_command.py _
Nov 28 08:18:39 np0005538515.localdomain ansible-async_wrapper.py[66631]: Starting module and watcher
Nov 28 08:18:39 np0005538515.localdomain ansible-async_wrapper.py[66631]: Start watching 66632 (3600)
Nov 28 08:18:39 np0005538515.localdomain ansible-async_wrapper.py[66632]: Start module (66632)
Nov 28 08:18:39 np0005538515.localdomain ansible-async_wrapper.py[66628]: Return async_wrapper task started.
Nov 28 08:18:39 np0005538515.localdomain sudo[66626]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:39 np0005538515.localdomain sudo[66647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkervlszvittjbduxcpskdpqqwxteufr ; /usr/bin/python3
Nov 28 08:18:39 np0005538515.localdomain sudo[66647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:39 np0005538515.localdomain python3[66649]: ansible-ansible.legacy.async_status Invoked with jid=399744111602.66628 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:18:39 np0005538515.localdomain sudo[66647]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:18:41 np0005538515.localdomain systemd[1]: tmp-crun.p7cccb.mount: Deactivated successfully.
Nov 28 08:18:41 np0005538515.localdomain podman[66668]: 2025-11-28 08:18:41.013957983 +0000 UTC m=+0.114316342 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:18:41 np0005538515.localdomain podman[66668]: 2025-11-28 08:18:41.224247329 +0000 UTC m=+0.324605708 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, batch=17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4)
Nov 28 08:18:41 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:18:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:18:42 np0005538515.localdomain podman[66735]: 2025-11-28 08:18:42.271315447 +0000 UTC m=+0.087491776 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044)
Nov 28 08:18:42 np0005538515.localdomain podman[66735]: 2025-11-28 08:18:42.287946609 +0000 UTC m=+0.104122938 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 28 08:18:42 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (file & line not available)
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (file & line not available)
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:18:43 np0005538515.localdomain puppet-user[66652]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.23 seconds
Nov 28 08:18:44 np0005538515.localdomain ansible-async_wrapper.py[66631]: 66632 still running (3600)
Nov 28 08:18:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:18:44 np0005538515.localdomain systemd[1]: tmp-crun.eKgWoM.mount: Deactivated successfully.
Nov 28 08:18:45 np0005538515.localdomain podman[66823]: 2025-11-28 08:18:45.001750308 +0000 UTC m=+0.109813374 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-type=git)
Nov 28 08:18:45 np0005538515.localdomain podman[66823]: 2025-11-28 08:18:45.009856228 +0000 UTC m=+0.117919294 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, vcs-type=git, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:18:45 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:18:49 np0005538515.localdomain ansible-async_wrapper.py[66631]: 66632 still running (3595)
Nov 28 08:18:49 np0005538515.localdomain sudo[66921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zixasjouyckdodpqjjystlfrwrngaamb ; /usr/bin/python3
Nov 28 08:18:49 np0005538515.localdomain sudo[66921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:18:49 np0005538515.localdomain python3[66923]: ansible-ansible.legacy.async_status Invoked with jid=399744111602.66628 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:18:49 np0005538515.localdomain sudo[66921]: pam_unix(sudo:session): session closed for user root
Nov 28 08:18:51 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 08:18:51 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 08:18:51 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:51 np0005538515.localdomain systemd-sysv-generator[67013]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:51 np0005538515.localdomain systemd-rc-local-generator[67008]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:51 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:51 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 08:18:52 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 08:18:52 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 08:18:52 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.417s CPU time.
Nov 28 08:18:52 np0005538515.localdomain systemd[1]: run-rde87af1a5567449f9804dd1d860a13de.service: Deactivated successfully.
Nov 28 08:18:53 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Nov 28 08:18:53 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}c5de38a09c7b13f562bf9286cb62fdd8d525e0c48ddd85d239ce86e29d135367'
Nov 28 08:18:53 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Nov 28 08:18:53 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Nov 28 08:18:53 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Nov 28 08:18:53 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Nov 28 08:18:54 np0005538515.localdomain ansible-async_wrapper.py[66631]: 66632 still running (3590)
Nov 28 08:18:58 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Nov 28 08:18:58 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:58 np0005538515.localdomain systemd-rc-local-generator[68054]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:58 np0005538515.localdomain systemd-sysv-generator[68060]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:58 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:58 np0005538515.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Nov 28 08:18:58 np0005538515.localdomain snmpd[68067]: Can't find directory of RPM packages
Nov 28 08:18:59 np0005538515.localdomain snmpd[68067]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Nov 28 08:18:59 np0005538515.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Nov 28 08:18:59 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:59 np0005538515.localdomain systemd-sysv-generator[68097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:59 np0005538515.localdomain systemd-rc-local-generator[68090]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:59 np0005538515.localdomain ansible-async_wrapper.py[66631]: 66632 still running (3585)
Nov 28 08:18:59 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:18:59 np0005538515.localdomain systemd-rc-local-generator[68131]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:18:59 np0005538515.localdomain systemd-sysv-generator[68134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:18:59 np0005538515.localdomain sshd[68140]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:18:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Notice: Applied catalog in 15.94 seconds
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Application:
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:    Initial environment: production
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:    Converged environment: production
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:          Run mode: user
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Changes:
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:             Total: 8
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Events:
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:           Success: 8
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:             Total: 8
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Resources:
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:         Restarted: 1
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:           Changed: 8
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:       Out of sync: 8
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:             Total: 19
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Time:
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:        Filebucket: 0.00
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:          Schedule: 0.00
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:            Augeas: 0.01
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:              File: 0.11
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:    Config retrieval: 0.30
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:           Service: 1.27
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:    Transaction evaluation: 15.92
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:    Catalog application: 15.94
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:          Last run: 1764317939
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:              Exec: 5.10
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:           Package: 9.23
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:             Total: 15.94
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]: Version:
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:            Config: 1764317923
Nov 28 08:18:59 np0005538515.localdomain puppet-user[66652]:            Puppet: 7.10.0
Nov 28 08:18:59 np0005538515.localdomain ansible-async_wrapper.py[66632]: Module complete (66632)
Nov 28 08:19:00 np0005538515.localdomain sudo[68155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wghzsqurwkrmggnlwtprqbhiafcrpbom ; /usr/bin/python3
Nov 28 08:19:00 np0005538515.localdomain sudo[68155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:00 np0005538515.localdomain python3[68157]: ansible-ansible.legacy.async_status Invoked with jid=399744111602.66628 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:19:00 np0005538515.localdomain sudo[68155]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:00 np0005538515.localdomain sudo[68171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njrmojfvbgnyhafnkatbpfqiuxzkelfq ; /usr/bin/python3
Nov 28 08:19:00 np0005538515.localdomain sudo[68171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:00 np0005538515.localdomain python3[68173]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:19:00 np0005538515.localdomain sudo[68171]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:01 np0005538515.localdomain sshd[68140]: Invalid user ubuntu from 80.94.92.186 port 43484
Nov 28 08:19:01 np0005538515.localdomain sudo[68187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkjxjoyitzytcqkegwtrywkgnalyuqfm ; /usr/bin/python3
Nov 28 08:19:01 np0005538515.localdomain sudo[68187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:01 np0005538515.localdomain python3[68189]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:01 np0005538515.localdomain sudo[68187]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:01 np0005538515.localdomain sshd[68140]: Connection closed by invalid user ubuntu 80.94.92.186 port 43484 [preauth]
Nov 28 08:19:01 np0005538515.localdomain sudo[68237]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybivuavooavaokyfkcisaqjxfqpxhbcp ; /usr/bin/python3
Nov 28 08:19:01 np0005538515.localdomain sudo[68237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:01 np0005538515.localdomain python3[68239]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:01 np0005538515.localdomain sudo[68237]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538515.localdomain sudo[68255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckgqlfhrqijjrphjsqptmddjarhbodgt ; /usr/bin/python3
Nov 28 08:19:02 np0005538515.localdomain sudo[68255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:02 np0005538515.localdomain python3[68257]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpzfewyovu recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:19:02 np0005538515.localdomain sudo[68255]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538515.localdomain sudo[68285]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zornslyotfkvmxtnaagwkyktetebwejl ; /usr/bin/python3
Nov 28 08:19:02 np0005538515.localdomain sudo[68285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:02 np0005538515.localdomain python3[68287]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:02 np0005538515.localdomain sudo[68285]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538515.localdomain sudo[68288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:19:02 np0005538515.localdomain sudo[68288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:02 np0005538515.localdomain sudo[68288]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:02 np0005538515.localdomain sudo[68303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:19:02 np0005538515.localdomain sudo[68303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:02 np0005538515.localdomain sudo[68329]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjkkhnfciivfiuhadmxeqnexgutycqqa ; /usr/bin/python3
Nov 28 08:19:02 np0005538515.localdomain sudo[68329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:03 np0005538515.localdomain sudo[68303]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538515.localdomain sudo[68329]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538515.localdomain sudo[68425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:19:03 np0005538515.localdomain sudo[68425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:03 np0005538515.localdomain sudo[68425]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:03 np0005538515.localdomain sudo[68440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:19:03 np0005538515.localdomain sudo[68440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:03 np0005538515.localdomain sudo[68468]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-putvpxdozwklwcedfijdzaaqqujvcirf ; /usr/bin/python3
Nov 28 08:19:03 np0005538515.localdomain sudo[68468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:03 np0005538515.localdomain python3[68470]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:19:03 np0005538515.localdomain sudo[68468]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538515.localdomain sudo[68440]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538515.localdomain sudo[68508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:19:04 np0005538515.localdomain sudo[68508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:04 np0005538515.localdomain sudo[68508]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538515.localdomain ansible-async_wrapper.py[66631]: Done in kid B.
Nov 28 08:19:04 np0005538515.localdomain sudo[68523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 08:19:04 np0005538515.localdomain sudo[68523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:04 np0005538515.localdomain sudo[68551]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqwkvpewjfwvqotmzcblacuswcrceaxy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:04 np0005538515.localdomain sudo[68551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:04 np0005538515.localdomain python3[68553]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:04 np0005538515.localdomain sudo[68551]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:04 np0005538515.localdomain sudo[68591]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtwthdtbbfcjgnlcobuluoorenhekedn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:04 np0005538515.localdomain sudo[68591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 
Nov 28 08:19:05 np0005538515.localdomain sudo[68591]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 2025-11-28 08:19:05.073318687 +0000 UTC m=+0.082926365 container create 7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_pasteur, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph)
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: Started libpod-conmon-7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b.scope.
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 2025-11-28 08:19:05.041975651 +0000 UTC m=+0.051583379 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 2025-11-28 08:19:05.15751051 +0000 UTC m=+0.167118238 container init 7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_pasteur, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True)
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: tmp-crun.1RurvL.mount: Deactivated successfully.
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 2025-11-28 08:19:05.17243292 +0000 UTC m=+0.182040608 container start 7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_pasteur, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, version=7)
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 2025-11-28 08:19:05.172732059 +0000 UTC m=+0.182339837 container attach 7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_pasteur, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph)
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: libpod-7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b.scope: Deactivated successfully.
Nov 28 08:19:05 np0005538515.localdomain strange_pasteur[68624]: 167 167
Nov 28 08:19:05 np0005538515.localdomain podman[68609]: 2025-11-28 08:19:05.178562808 +0000 UTC m=+0.188170516 container died 7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_pasteur, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, ceph=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 08:19:05 np0005538515.localdomain podman[68630]: 2025-11-28 08:19:05.285015207 +0000 UTC m=+0.095053649 container remove 7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_pasteur, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, architecture=x86_64, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main)
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: libpod-conmon-7344aa13f74d173a9771e50a99178889ef349c030015ef0bbe7acd133d46c48b.scope: Deactivated successfully.
Nov 28 08:19:05 np0005538515.localdomain sudo[68657]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxgveltndkiuifzoysgxywkxradlqsrb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:05 np0005538515.localdomain sudo[68657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:05 np0005538515.localdomain podman[68664]: 
Nov 28 08:19:05 np0005538515.localdomain podman[68664]: 2025-11-28 08:19:05.515561717 +0000 UTC m=+0.080092798 container create 47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ride, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph)
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: Started libpod-conmon-47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f.scope.
Nov 28 08:19:05 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e52f4f1aa5f31845c4962ceebcd671f3caafeb2f2d76f906720e6f31fba9991c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e52f4f1aa5f31845c4962ceebcd671f3caafeb2f2d76f906720e6f31fba9991c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e52f4f1aa5f31845c4962ceebcd671f3caafeb2f2d76f906720e6f31fba9991c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:05 np0005538515.localdomain podman[68664]: 2025-11-28 08:19:05.482724756 +0000 UTC m=+0.047255847 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:19:05 np0005538515.localdomain podman[68664]: 2025-11-28 08:19:05.582017044 +0000 UTC m=+0.146548125 container init 47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ride, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, release=553)
Nov 28 08:19:05 np0005538515.localdomain python3[68666]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:05 np0005538515.localdomain podman[68664]: 2025-11-28 08:19:05.594984773 +0000 UTC m=+0.159515824 container start 47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ride, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55)
Nov 28 08:19:05 np0005538515.localdomain podman[68664]: 2025-11-28 08:19:05.603600438 +0000 UTC m=+0.168131479 container attach 47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ride, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=553)
Nov 28 08:19:05 np0005538515.localdomain sudo[68657]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:06 np0005538515.localdomain sudo[68741]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjlmwwfgzsbfiirxmzypodmmdybphgnh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:06 np0005538515.localdomain sudo[68741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-33a6bdafd4609c0f505a3833915d396dc809123f2eb6bfcea64c83bbe3b2e0f7-merged.mount: Deactivated successfully.
Nov 28 08:19:06 np0005538515.localdomain python3[68746]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:06 np0005538515.localdomain sudo[68741]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:06 np0005538515.localdomain sudo[69201]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcfahvqzktktiwsplekrcivhlbwyckij ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:06 np0005538515.localdomain sudo[69201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:06 np0005538515.localdomain python3[69298]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:06 np0005538515.localdomain sudo[69201]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]: [
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:     {
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "available": false,
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "ceph_device": false,
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "lsm_data": {},
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "lvs": [],
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "path": "/dev/sr0",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "rejected_reasons": [
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "Insufficient space (<5GB)",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "Has a FileSystem"
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         ],
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         "sys_api": {
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "actuators": null,
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "device_nodes": "sr0",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "human_readable_size": "482.00 KB",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "id_bus": "ata",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "model": "QEMU DVD-ROM",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "nr_requests": "2",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "partitions": {},
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "path": "/dev/sr0",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "removable": "1",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "rev": "2.5+",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "ro": "0",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "rotational": "1",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "sas_address": "",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "sas_device_handle": "",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "scheduler_mode": "mq-deadline",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "sectors": 0,
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "sectorsize": "2048",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "size": 493568.0,
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "support_discard": "0",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "type": "disk",
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:             "vendor": "QEMU"
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:         }
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]:     }
Nov 28 08:19:06 np0005538515.localdomain admiring_ride[68680]: ]
Nov 28 08:19:06 np0005538515.localdomain systemd[1]: libpod-47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f.scope: Deactivated successfully.
Nov 28 08:19:06 np0005538515.localdomain systemd[1]: libpod-47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f.scope: Consumed 1.017s CPU time.
Nov 28 08:19:06 np0005538515.localdomain podman[68664]: 2025-11-28 08:19:06.610467807 +0000 UTC m=+1.174998938 container died 47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ride, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, version=7, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=)
Nov 28 08:19:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e52f4f1aa5f31845c4962ceebcd671f3caafeb2f2d76f906720e6f31fba9991c-merged.mount: Deactivated successfully.
Nov 28 08:19:06 np0005538515.localdomain podman[70331]: 2025-11-28 08:19:06.711395736 +0000 UTC m=+0.090301522 container remove 47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_ride, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, release=553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 08:19:06 np0005538515.localdomain systemd[1]: libpod-conmon-47e37f8bdaafc8a45e0942c4c591c464471070f5960643a9b2caf38c7ebbc11f.scope: Deactivated successfully.
Nov 28 08:19:06 np0005538515.localdomain sudo[68523]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:06 np0005538515.localdomain sudo[70391]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryvixgwxtidntgcldukpujorlerpvvtl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:06 np0005538515.localdomain sudo[70391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:07 np0005538515.localdomain python3[70393]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:07 np0005538515.localdomain sudo[70391]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538515.localdomain sudo[70409]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eefgpihvzlvlionevkzpxmxukjitrwbw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:07 np0005538515.localdomain sudo[70409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:07 np0005538515.localdomain sudo[70412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:19:07 np0005538515.localdomain sudo[70412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:19:07 np0005538515.localdomain sudo[70412]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538515.localdomain python3[70411]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:07 np0005538515.localdomain sudo[70409]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:07 np0005538515.localdomain sudo[70486]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgakqlgpztuuctxvrghqmwftdzhlxiwu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:07 np0005538515.localdomain sudo[70486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:07 np0005538515.localdomain python3[70488]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:07 np0005538515.localdomain sudo[70486]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:08 np0005538515.localdomain sudo[70504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sopzlpsikcqwewxuqyjxfbzskrsljfbl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:08 np0005538515.localdomain sudo[70504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:08 np0005538515.localdomain python3[70506]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:08 np0005538515.localdomain sudo[70504]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:08 np0005538515.localdomain sudo[70567]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prytdhycminfpjjuyoxopclsluccoalh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:08 np0005538515.localdomain sudo[70567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:08 np0005538515.localdomain python3[70569]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:08 np0005538515.localdomain sudo[70567]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:08 np0005538515.localdomain sudo[70585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcgoedcawlwohvxymowuzbersnmexbel ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:08 np0005538515.localdomain sudo[70585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:09 np0005538515.localdomain python3[70587]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:09 np0005538515.localdomain sudo[70585]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:09 np0005538515.localdomain sudo[70615]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdnrhxqxlvthrunovynbbsuaswyciggl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:09 np0005538515.localdomain sudo[70615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:09 np0005538515.localdomain python3[70617]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:09 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:09 np0005538515.localdomain systemd-rc-local-generator[70642]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:09 np0005538515.localdomain systemd-sysv-generator[70647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:10 np0005538515.localdomain sudo[70615]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:10 np0005538515.localdomain sudo[70700]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdmzbkpynsenemzxqxvjbypmammfraav ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:10 np0005538515.localdomain sudo[70700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:10 np0005538515.localdomain python3[70702]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:10 np0005538515.localdomain sudo[70700]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:10 np0005538515.localdomain sudo[70718]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klqjamcrlolkhaucgyratazrfnwsexlk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:10 np0005538515.localdomain sudo[70718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:10 np0005538515.localdomain python3[70720]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:10 np0005538515.localdomain sudo[70718]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:11 np0005538515.localdomain sudo[70780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuryftbfahuxtpevtchfkkwzxeoqiaov ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:11 np0005538515.localdomain sudo[70780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:11 np0005538515.localdomain python3[70782]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:19:11 np0005538515.localdomain sudo[70780]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:11 np0005538515.localdomain sudo[70798]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgqkmumytzgapreugfbjlxpddjgopilk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:11 np0005538515.localdomain sudo[70798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:19:11 np0005538515.localdomain podman[70801]: 2025-11-28 08:19:11.637593232 +0000 UTC m=+0.106251943 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, managed_by=tripleo_ansible)
Nov 28 08:19:11 np0005538515.localdomain python3[70800]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:11 np0005538515.localdomain sudo[70798]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:11 np0005538515.localdomain podman[70801]: 2025-11-28 08:19:11.857536235 +0000 UTC m=+0.326194916 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Nov 28 08:19:11 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:19:11 np0005538515.localdomain sudo[70858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbhebawsuhobpucmsozzreafvgcyewjq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:11 np0005538515.localdomain sudo[70858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:12 np0005538515.localdomain python3[70862]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:12 np0005538515.localdomain systemd-sysv-generator[70886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:12 np0005538515.localdomain systemd-rc-local-generator[70882]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: tmp-crun.uXNoWY.mount: Deactivated successfully.
Nov 28 08:19:12 np0005538515.localdomain podman[70899]: 2025-11-28 08:19:12.606713569 +0000 UTC m=+0.098361930 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, version=17.1.12, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd)
Nov 28 08:19:12 np0005538515.localdomain sudo[70858]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:12 np0005538515.localdomain podman[70899]: 2025-11-28 08:19:12.645582036 +0000 UTC m=+0.137230347 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:12 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:19:13 np0005538515.localdomain sudo[70937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvspapfxdltffptbshwqeafolzuqbuyq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:13 np0005538515.localdomain sudo[70937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:13 np0005538515.localdomain python3[70939]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:19:13 np0005538515.localdomain sudo[70937]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:13 np0005538515.localdomain sudo[70953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcuqkrcpekpgfegacyffsvycohsenunu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:13 np0005538515.localdomain sudo[70953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:14 np0005538515.localdomain sudo[70953]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538515.localdomain sudo[70996]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fejugbbwsdtqqlyxlieppkvvarjjonjg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:15 np0005538515.localdomain sudo[70996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:19:15 np0005538515.localdomain podman[70999]: 2025-11-28 08:19:15.214254785 +0000 UTC m=+0.081963976 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, 
io.openshift.expose-services=, container_name=iscsid, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container)
Nov 28 08:19:15 np0005538515.localdomain podman[70999]: 2025-11-28 08:19:15.230424403 +0000 UTC m=+0.098133594 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:19:15 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:19:15 np0005538515.localdomain podman[71174]: 2025-11-28 08:19:15.57027093 +0000 UTC m=+0.058107071 container create 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:19:15 np0005538515.localdomain podman[71179]: 2025-11-28 08:19:15.597800967 +0000 UTC m=+0.082636156 container create d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:19:15 np0005538515.localdomain podman[71202]: 2025-11-28 08:19:15.619095633 +0000 UTC m=+0.076940530 container create 54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=configure_cms_options, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libpod-conmon-d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.scope.
Nov 28 08:19:15 np0005538515.localdomain podman[71174]: 2025-11-28 08:19:15.539834482 +0000 UTC m=+0.027670633 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:19:15 np0005538515.localdomain podman[71179]: 2025-11-28 08:19:15.543558767 +0000 UTC m=+0.028393956 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 08:19:15 np0005538515.localdomain podman[71226]: 2025-11-28 08:19:15.654576066 +0000 UTC m=+0.096583706 container create f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_libvirt_init_secret, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:35:22Z, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']})
Nov 28 08:19:15 np0005538515.localdomain podman[71212]: 2025-11-28 08:19:15.666244085 +0000 UTC m=+0.112358401 container create 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Nov 28 08:19:15 np0005538515.localdomain podman[71202]: 2025-11-28 08:19:15.568925758 +0000 UTC m=+0.026770685 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libpod-conmon-54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c.scope.
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libpod-conmon-719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.scope.
Nov 28 08:19:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d613e9ce43651b4a22ba11f5bcafcb4dcc9b302834037925dc9c415fac8e707f/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libpod-conmon-f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d.scope.
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538515.localdomain podman[71226]: 2025-11-28 08:19:15.589004046 +0000 UTC m=+0.031011676 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93bf0314bbd4063198be021c760bb47b8172c6cfa3163da2b90a6f202605824f/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/740c106cec230c8e88a51062b1e59b5e7fe0e9195732430d85360787d0335118/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/740c106cec230c8e88a51062b1e59b5e7fe0e9195732430d85360787d0335118/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/740c106cec230c8e88a51062b1e59b5e7fe0e9195732430d85360787d0335118/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libpod-conmon-7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.scope.
Nov 28 08:19:15 np0005538515.localdomain podman[71202]: 2025-11-28 08:19:15.702975256 +0000 UTC m=+0.160820163 container init 54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=configure_cms_options)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:19:15 np0005538515.localdomain podman[71179]: 2025-11-28 08:19:15.707732333 +0000 UTC m=+0.192567512 container init d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:15 np0005538515.localdomain podman[71202]: 2025-11-28 08:19:15.711629553 +0000 UTC m=+0.169474470 container start 54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=configure_cms_options, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:19:15 np0005538515.localdomain podman[71202]: 2025-11-28 08:19:15.711825759 +0000 UTC m=+0.169670656 container attach 54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=configure_cms_options, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, 
name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:19:15 np0005538515.localdomain podman[71212]: 2025-11-28 08:19:15.618088793 +0000 UTC m=+0.064203129 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:15 np0005538515.localdomain sudo[71286]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:15 np0005538515.localdomain sudo[71286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2363e42c8cc93f560c242c278a1b76f810df60301763e880790aefc5b17b52f/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:19:15 np0005538515.localdomain podman[71179]: 2025-11-28 08:19:15.738699967 +0000 UTC m=+0.223535186 container start d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Nov 28 08:19:15 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=185ba876a5902dbf87b8591344afd39d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:19:15 np0005538515.localdomain podman[71174]: 2025-11-28 08:19:15.767852754 +0000 UTC m=+0.255688895 container init 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:19:15 np0005538515.localdomain sudo[71286]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538515.localdomain ovs-vsctl[71310]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: libpod-54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c.scope: Deactivated successfully.
Nov 28 08:19:15 np0005538515.localdomain podman[71202]: 2025-11-28 08:19:15.813194131 +0000 UTC m=+0.271039048 container died 54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, 
build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:19:15 np0005538515.localdomain sudo[71312]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:15 np0005538515.localdomain sudo[71312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:19:15 np0005538515.localdomain podman[71174]: 2025-11-28 08:19:15.837473609 +0000 UTC m=+0.325309750 container start 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container)
Nov 28 08:19:15 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 08:19:15 np0005538515.localdomain podman[71226]: 2025-11-28 08:19:15.858136605 +0000 UTC m=+0.300144235 container init f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=nova_libvirt_init_secret, config_id=tripleo_step4, 
tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:19:15 np0005538515.localdomain podman[71212]: 2025-11-28 08:19:15.867615467 +0000 UTC m=+0.313729883 container init 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 08:19:15 np0005538515.localdomain podman[71226]: 2025-11-28 08:19:15.8680266 +0000 UTC m=+0.310034220 container start f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 28 08:19:15 np0005538515.localdomain podman[71226]: 2025-11-28 08:19:15.868215746 +0000 UTC m=+0.310223396 container attach f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 28 08:19:15 np0005538515.localdomain sudo[71312]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538515.localdomain crond[71311]: (CRON) STARTUP (1.5.7)
Nov 28 08:19:15 np0005538515.localdomain crond[71311]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 45% if used.)
Nov 28 08:19:15 np0005538515.localdomain crond[71311]: (CRON) INFO (running with inotify support)
Nov 28 08:19:15 np0005538515.localdomain sudo[71371]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:15 np0005538515.localdomain sudo[71371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:19:15 np0005538515.localdomain podman[71288]: 2025-11-28 08:19:15.938959964 +0000 UTC m=+0.197253375 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12)
Nov 28 08:19:15 np0005538515.localdomain podman[71288]: 2025-11-28 08:19:15.955301638 +0000 UTC m=+0.213595049 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:19:15 np0005538515.localdomain sudo[71371]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:15 np0005538515.localdomain podman[71288]: unhealthy
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Failed with result 'exit-code'.
Nov 28 08:19:15 np0005538515.localdomain systemd[1]: libpod-f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d.scope: Deactivated successfully.
Nov 28 08:19:15 np0005538515.localdomain podman[71226]: 2025-11-28 08:19:15.978317466 +0000 UTC m=+0.420325096 container died f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_libvirt_init_secret, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:35:22Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64)
Nov 28 08:19:16 np0005538515.localdomain podman[71315]: 2025-11-28 08:19:16.00470754 +0000 UTC m=+0.185318830 container cleanup 54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=configure_cms_options, build-date=2025-11-18T23:34:05Z, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']})
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: libpod-conmon-54ecd038a8454111a00491ca2068a3bc5f289d100154e1428ef79ea1bdd09f9c.scope: Deactivated successfully.
Nov 28 08:19:16 np0005538515.localdomain podman[71326]: 2025-11-28 08:19:15.908471115 +0000 UTC m=+0.067636604 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:19:16 np0005538515.localdomain podman[71416]: 2025-11-28 08:19:16.025172149 +0000 UTC m=+0.044427409 container cleanup f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, 
io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, container_name=nova_libvirt_init_secret, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: libpod-conmon-f55e5efc588f54c78f3bd9849c02f7371bb35fe8b0ebb7a16930501c2e5d968d.scope: Deactivated successfully.
Nov 28 08:19:16 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Nov 28 08:19:16 np0005538515.localdomain podman[71326]: 2025-11-28 08:19:16.040293075 +0000 UTC m=+0.199458574 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:19:16 np0005538515.localdomain podman[71212]: 2025-11-28 08:19:16.062965063 +0000 UTC m=+0.509079369 container start 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Nov 28 08:19:16 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Nov 28 08:19:16 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=185ba876a5902dbf87b8591344afd39d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 08:19:16 np0005538515.localdomain podman[71379]: 2025-11-28 08:19:16.079177042 +0000 UTC m=+0.167515960 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:19:16 np0005538515.localdomain podman[71379]: 2025-11-28 08:19:16.088311003 +0000 UTC m=+0.176649961 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red 
Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:19:16 np0005538515.localdomain podman[71379]: unhealthy
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 08:19:16 np0005538515.localdomain podman[71537]: 2025-11-28 08:19:16.266904634 +0000 UTC m=+0.067943143 container create e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: Started libpod-conmon-e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.scope.
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:16 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3850d276a9594c52a78e85d7b58db016dc835caf89f3a263b0f9d37a3754a60d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:16 np0005538515.localdomain podman[71537]: 2025-11-28 08:19:16.22391345 +0000 UTC m=+0.024951959 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:19:16 np0005538515.localdomain podman[71537]: 2025-11-28 08:19:16.335022443 +0000 UTC m=+0.136060982 container init e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:19:16 np0005538515.localdomain sudo[71587]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:16 np0005538515.localdomain sudo[71587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:19:16 np0005538515.localdomain podman[71537]: 2025-11-28 08:19:16.360200348 +0000 UTC m=+0.161238857 container start e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z)
Nov 28 08:19:16 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:19:16 np0005538515.localdomain sudo[71587]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:16 np0005538515.localdomain podman[71593]: 2025-11-28 08:19:16.431438692 +0000 UTC m=+0.068315345 container create dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=setup_ovs_manager, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:19:16 np0005538515.localdomain sshd[71626]: Server listening on 0.0.0.0 port 2022.
Nov 28 08:19:16 np0005538515.localdomain sshd[71626]: Server listening on :: port 2022.
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: Started libpod-conmon-dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c.scope.
Nov 28 08:19:16 np0005538515.localdomain podman[71592]: 2025-11-28 08:19:16.471850286 +0000 UTC m=+0.105866341 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:16 np0005538515.localdomain podman[71593]: 2025-11-28 08:19:16.491256273 +0000 UTC m=+0.128132916 container init dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=setup_ovs_manager, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64)
Nov 28 08:19:16 np0005538515.localdomain podman[71593]: 2025-11-28 08:19:16.392608906 +0000 UTC m=+0.029485579 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:19:16 np0005538515.localdomain podman[71593]: 2025-11-28 08:19:16.505454341 +0000 UTC m=+0.142330984 container start dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=setup_ovs_manager, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:19:16 np0005538515.localdomain podman[71593]: 2025-11-28 08:19:16.506474963 +0000 UTC m=+0.143351616 container attach dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.openshift.expose-services=, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']})
Nov 28 08:19:16 np0005538515.localdomain podman[71592]: 2025-11-28 08:19:16.82622323 +0000 UTC m=+0.460239375 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git)
Nov 28 08:19:16 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:19:16 np0005538515.localdomain sudo[71667]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpaod0w0rr/privsep.sock
Nov 28 08:19:16 np0005538515.localdomain sudo[71667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 08:19:17 np0005538515.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 28 08:19:17 np0005538515.localdomain sudo[71667]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:19 np0005538515.localdomain ovs-vsctl[71787]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 28 08:19:19 np0005538515.localdomain systemd[1]: libpod-dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c.scope: Deactivated successfully.
Nov 28 08:19:19 np0005538515.localdomain systemd[1]: libpod-dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c.scope: Consumed 2.938s CPU time.
Nov 28 08:19:19 np0005538515.localdomain podman[71788]: 2025-11-28 08:19:19.556753444 +0000 UTC m=+0.054531331 container died dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=setup_ovs_manager, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public)
Nov 28 08:19:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c-userdata-shm.mount: Deactivated successfully.
Nov 28 08:19:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a820d2d170f2b0bbf4a680f8c0da82218646a321cb82318df9e6b8161dc1d2c6-merged.mount: Deactivated successfully.
Nov 28 08:19:19 np0005538515.localdomain podman[71788]: 2025-11-28 08:19:19.594482236 +0000 UTC m=+0.092260053 container cleanup dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, container_name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:19:19 np0005538515.localdomain systemd[1]: libpod-conmon-dffa592367a31755f9497a289202797714e3584bcc35fa55351b1c65c9a8d73c.scope: Deactivated successfully.
Nov 28 08:19:19 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Nov 28 08:19:20 np0005538515.localdomain podman[71895]: 2025-11-28 08:19:20.041444862 +0000 UTC m=+0.087215098 container create e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:19:20 np0005538515.localdomain podman[71896]: 2025-11-28 08:19:20.074984685 +0000 UTC m=+0.115037074 container create 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started libpod-conmon-e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.scope.
Nov 28 08:19:20 np0005538515.localdomain podman[71895]: 2025-11-28 08:19:19.992623088 +0000 UTC m=+0.038393354 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started libpod-conmon-9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.scope.
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:20 np0005538515.localdomain podman[71896]: 2025-11-28 08:19:20.008151626 +0000 UTC m=+0.048204065 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:19:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6bc8e2b5666799e64c84f093eb3569ddc3bccd8602a09788ea75d9b81e61916/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22314ee7dcc5723035b6772f98d17adedfb1f7b03c71f0801082e550913dd450/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22314ee7dcc5723035b6772f98d17adedfb1f7b03c71f0801082e550913dd450/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22314ee7dcc5723035b6772f98d17adedfb1f7b03c71f0801082e550913dd450/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6bc8e2b5666799e64c84f093eb3569ddc3bccd8602a09788ea75d9b81e61916/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6bc8e2b5666799e64c84f093eb3569ddc3bccd8602a09788ea75d9b81e61916/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:19:20 np0005538515.localdomain podman[71896]: 2025-11-28 08:19:20.135472437 +0000 UTC m=+0.175524816 container init 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:19:20 np0005538515.localdomain podman[71895]: 2025-11-28 08:19:20.138800429 +0000 UTC m=+0.184570675 container init e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12)
Nov 28 08:19:20 np0005538515.localdomain sudo[71939]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:19:20 np0005538515.localdomain sudo[71939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:19:20 np0005538515.localdomain podman[71896]: 2025-11-28 08:19:20.163979665 +0000 UTC m=+0.204032024 container start 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, 
com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, tcib_managed=true)
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:19:20 np0005538515.localdomain podman[71895]: 2025-11-28 08:19:20.166579435 +0000 UTC m=+0.212349671 container start e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:19:20 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 08:19:20 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:19:20 np0005538515.localdomain python3[70998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=08c21dad54d1ba598c6e2fae6b853aba --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:19:20 np0005538515.localdomain sudo[71939]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:19:20 np0005538515.localdomain podman[71941]: 2025-11-28 08:19:20.239696467 +0000 UTC m=+0.069621275 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:19:20 np0005538515.localdomain podman[71942]: 2025-11-28 08:19:20.299664594 +0000 UTC m=+0.125677281 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, distribution-scope=public, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Queued start job for default target Main User Target.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Created slice User Application Slice.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Reached target Paths.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Reached target Timers.
Nov 28 08:19:20 np0005538515.localdomain podman[71941]: 2025-11-28 08:19:20.330907526 +0000 UTC m=+0.160832354 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Starting D-Bus User Message Bus Socket...
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Starting Create User's Volatile Files and Directories...
Nov 28 08:19:20 np0005538515.localdomain podman[71941]: unhealthy
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:19:20 np0005538515.localdomain podman[71942]: 2025-11-28 08:19:20.344441617 +0000 UTC m=+0.170454304 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Reached target Sockets.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Finished Create User's Volatile Files and Directories.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Reached target Basic System.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Reached target Main User Target.
Nov 28 08:19:20 np0005538515.localdomain systemd[71972]: Startup finished in 112ms.
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:19:20 np0005538515.localdomain podman[71942]: unhealthy
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: Started Session c9 of User root.
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:19:20 np0005538515.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Nov 28 08:19:20 np0005538515.localdomain kernel: device br-int entered promiscuous mode
Nov 28 08:19:20 np0005538515.localdomain NetworkManager[5965]: <info>  [1764317960.4412] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Nov 28 08:19:20 np0005538515.localdomain systemd-udevd[72045]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 08:19:20 np0005538515.localdomain sudo[70996]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:20 np0005538515.localdomain systemd-udevd[72047]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 08:19:20 np0005538515.localdomain NetworkManager[5965]: <info>  [1764317960.5141] device (genev_sys_6081): carrier: link connected
Nov 28 08:19:20 np0005538515.localdomain NetworkManager[5965]: <info>  [1764317960.5145] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Nov 28 08:19:20 np0005538515.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Nov 28 08:19:20 np0005538515.localdomain sudo[72065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmdsjwrzehruiecdhbnxxewqiwfzlzco ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:20 np0005538515.localdomain sudo[72065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:20 np0005538515.localdomain python3[72067]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:20 np0005538515.localdomain sudo[72065]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538515.localdomain sudo[72081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvhckqrvexpbaeqwemgwpcjdmhaknwqf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538515.localdomain sudo[72081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538515.localdomain python3[72083]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538515.localdomain sudo[72081]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538515.localdomain sudo[72097]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lardwxykxvgllwajpktbnsogmaqijxem ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538515.localdomain sudo[72097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538515.localdomain python3[72099]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538515.localdomain sudo[72097]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538515.localdomain sudo[72113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqxbzikmtsxpmwbvzmzazcafruwpjpam ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538515.localdomain sudo[72113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538515.localdomain python3[72115]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538515.localdomain sudo[72113]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538515.localdomain sudo[72129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzcjtkhhhuebqmlrodzpdpizenvmskeg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:21 np0005538515.localdomain sudo[72129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:21 np0005538515.localdomain sudo[72133]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmp_0dh6j_y/privsep.sock
Nov 28 08:19:21 np0005538515.localdomain sudo[72133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Nov 28 08:19:21 np0005538515.localdomain python3[72131]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:21 np0005538515.localdomain sudo[72129]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:21 np0005538515.localdomain sudo[72148]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chktxmojzwnfsnhfwpyggjivrmuhfdxo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538515.localdomain sudo[72148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538515.localdomain python3[72150]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:22 np0005538515.localdomain sudo[72148]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538515.localdomain sudo[72165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfhcajwlpqkjnpcaysaqbxiqjrmakmzg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538515.localdomain sudo[72165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538515.localdomain python3[72167]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:22 np0005538515.localdomain sudo[72165]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538515.localdomain sudo[72182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fipqhsamkwemkclvcrwuxrrijmtfajdk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538515.localdomain sudo[72133]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538515.localdomain sudo[72182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538515.localdomain python3[72184]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:22 np0005538515.localdomain sudo[72182]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:22 np0005538515.localdomain sudo[72199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqzigvjybbvounyfbbxgwnltcsndvchj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:22 np0005538515.localdomain sudo[72199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:22 np0005538515.localdomain python3[72201]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:22 np0005538515.localdomain sudo[72199]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:23 np0005538515.localdomain sudo[72217]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfefzmdrqrremyuzqnkevvxskjkclerq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:23 np0005538515.localdomain sudo[72217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:23 np0005538515.localdomain python3[72219]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:23 np0005538515.localdomain sudo[72217]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:23 np0005538515.localdomain sudo[72233]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhuxasdjisjyfelpxgquwdizvxnuaksb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:23 np0005538515.localdomain sudo[72233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:23 np0005538515.localdomain python3[72235]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:23 np0005538515.localdomain sudo[72233]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:23 np0005538515.localdomain sudo[72249]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfmhnysjfvhkycbkiaygocnuvectfbsz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:23 np0005538515.localdomain sudo[72249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:23 np0005538515.localdomain python3[72251]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:19:23 np0005538515.localdomain sudo[72249]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:24 np0005538515.localdomain sudo[72310]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyxamlbkfaathlgzeynonxoloprklpii ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:24 np0005538515.localdomain sudo[72310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:24 np0005538515.localdomain python3[72312]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.7044015-110016-123672381347547/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:24 np0005538515.localdomain sudo[72310]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:24 np0005538515.localdomain sudo[72339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fylgcndiapfjcuhjbcbsmaqmvqmhgnku ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:24 np0005538515.localdomain sudo[72339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:24 np0005538515.localdomain python3[72341]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.7044015-110016-123672381347547/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:24 np0005538515.localdomain sudo[72339]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:25 np0005538515.localdomain sudo[72368]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abuopgjseitfyxurwhdwiveddwypgeft ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:25 np0005538515.localdomain sudo[72368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:25 np0005538515.localdomain python3[72370]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.7044015-110016-123672381347547/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:25 np0005538515.localdomain sudo[72368]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:25 np0005538515.localdomain sudo[72397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxekffhdsosehtxphuircxeroerxatjz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:25 np0005538515.localdomain sudo[72397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:25 np0005538515.localdomain python3[72399]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.7044015-110016-123672381347547/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:25 np0005538515.localdomain sudo[72397]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:26 np0005538515.localdomain sudo[72426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iisqkhibhlfnjsprconmfsbpndonqyvi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:26 np0005538515.localdomain sudo[72426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:26 np0005538515.localdomain python3[72428]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.7044015-110016-123672381347547/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:26 np0005538515.localdomain sudo[72426]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:26 np0005538515.localdomain sudo[72455]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dybyfmouuwxlirmytfwdiussfudidoah ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:26 np0005538515.localdomain sudo[72455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:26 np0005538515.localdomain python3[72457]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.7044015-110016-123672381347547/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:26 np0005538515.localdomain sudo[72455]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:26 np0005538515.localdomain sudo[72471]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gylgewpceufrvopqwohwanctuqtnulkv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:26 np0005538515.localdomain sudo[72471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:27 np0005538515.localdomain python3[72473]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:19:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:27 np0005538515.localdomain systemd-sysv-generator[72504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:27 np0005538515.localdomain systemd-rc-local-generator[72498]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:27 np0005538515.localdomain sudo[72471]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:27 np0005538515.localdomain sudo[72523]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjvchujvajhovwhemujlznrhjjkdobnp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:27 np0005538515.localdomain sudo[72523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:28 np0005538515.localdomain python3[72525]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:28 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:28 np0005538515.localdomain systemd-rc-local-generator[72550]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:28 np0005538515.localdomain systemd-sysv-generator[72556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:28 np0005538515.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 08:19:28 np0005538515.localdomain tripleo-start-podman-container[72565]: Creating additional drop-in dependency for "ceilometer_agent_compute" (d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff)
Nov 28 08:19:28 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:28 np0005538515.localdomain systemd-rc-local-generator[72621]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:28 np0005538515.localdomain systemd-sysv-generator[72626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:29 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:29 np0005538515.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 28 08:19:29 np0005538515.localdomain sudo[72523]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:29 np0005538515.localdomain sudo[72648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ympkxvfxjmobzugromemxzqlpucylakd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:29 np0005538515.localdomain sudo[72648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:29 np0005538515.localdomain python3[72650]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Activating special unit Exit the Session...
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped target Main User Target.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped target Basic System.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped target Paths.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped target Sockets.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped target Timers.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Closed D-Bus User Message Bus Socket.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Removed slice User Application Slice.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Reached target Shutdown.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Finished Exit the Session.
Nov 28 08:19:30 np0005538515.localdomain systemd[71972]: Reached target Exit the Session.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 08:19:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:30 np0005538515.localdomain systemd-rc-local-generator[72676]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:30 np0005538515.localdomain systemd-sysv-generator[72681]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:31 np0005538515.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Nov 28 08:19:31 np0005538515.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Nov 28 08:19:31 np0005538515.localdomain sudo[72648]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:31 np0005538515.localdomain sudo[72716]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkyxzvoyhphiafimoohlychmmrjlctpo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:31 np0005538515.localdomain sudo[72716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:31 np0005538515.localdomain python3[72718]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:32 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:33 np0005538515.localdomain systemd-sysv-generator[72747]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:33 np0005538515.localdomain systemd-rc-local-generator[72743]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:33 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:33 np0005538515.localdomain systemd[1]: Starting logrotate_crond container...
Nov 28 08:19:33 np0005538515.localdomain systemd[1]: Started logrotate_crond container.
Nov 28 08:19:33 np0005538515.localdomain sudo[72716]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:33 np0005538515.localdomain sudo[72784]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyiubfwgdopoimcshxrfnrmonidpiunf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:33 np0005538515.localdomain sudo[72784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:34 np0005538515.localdomain python3[72786]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:34 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:34 np0005538515.localdomain systemd-rc-local-generator[72813]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:34 np0005538515.localdomain systemd-sysv-generator[72818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:34 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:34 np0005538515.localdomain systemd[1]: Starting nova_migration_target container...
Nov 28 08:19:34 np0005538515.localdomain systemd[1]: Started nova_migration_target container.
Nov 28 08:19:34 np0005538515.localdomain sudo[72784]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:35 np0005538515.localdomain sudo[72851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpitrgalopbbjecrqnrwtlubxcsuyioz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:35 np0005538515.localdomain sudo[72851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:35 np0005538515.localdomain python3[72853]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:35 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:35 np0005538515.localdomain systemd-sysv-generator[72884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:35 np0005538515.localdomain systemd-rc-local-generator[72881]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:35 np0005538515.localdomain systemd[1]: Starting ovn_controller container...
Nov 28 08:19:35 np0005538515.localdomain tripleo-start-podman-container[72893]: Creating additional drop-in dependency for "ovn_controller" (9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164)
Nov 28 08:19:35 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:36 np0005538515.localdomain systemd-rc-local-generator[72948]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:36 np0005538515.localdomain systemd-sysv-generator[72953]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:36 np0005538515.localdomain systemd[1]: Started ovn_controller container.
Nov 28 08:19:36 np0005538515.localdomain sudo[72851]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:36 np0005538515.localdomain sudo[72974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhubmldpqdcsetlcgvtqskxwpenqhwen ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:19:36 np0005538515.localdomain sudo[72974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:36 np0005538515.localdomain python3[72976]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:19:37 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:19:37 np0005538515.localdomain systemd-rc-local-generator[73004]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:19:37 np0005538515.localdomain systemd-sysv-generator[73008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:19:37 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:19:37 np0005538515.localdomain systemd[1]: Starting ovn_metadata_agent container...
Nov 28 08:19:37 np0005538515.localdomain systemd[1]: Started ovn_metadata_agent container.
Nov 28 08:19:37 np0005538515.localdomain sudo[72974]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:37 np0005538515.localdomain sudo[73054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dugllpbfdqqwcxvlufhezweyqwkemgyc ; /usr/bin/python3
Nov 28 08:19:37 np0005538515.localdomain sudo[73054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:37 np0005538515.localdomain python3[73056]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:37 np0005538515.localdomain sudo[73054]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:38 np0005538515.localdomain sudo[73102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pttdparwksokmfzgqyjoagbfxufwsjyj ; /usr/bin/python3
Nov 28 08:19:38 np0005538515.localdomain sudo[73102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:38 np0005538515.localdomain sudo[73102]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:38 np0005538515.localdomain sudo[73146]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmtaopjhfzrzzsqpmqacvydsuqvyconb ; /usr/bin/python3
Nov 28 08:19:38 np0005538515.localdomain sudo[73146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:38 np0005538515.localdomain sudo[73146]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:39 np0005538515.localdomain sudo[73176]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpdqkwxbhcygiuufeppmerafzaypzxij ; /usr/bin/python3
Nov 28 08:19:39 np0005538515.localdomain sudo[73176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:39 np0005538515.localdomain python3[73178]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005538515 step=4 update_config_hash_only=False
Nov 28 08:19:39 np0005538515.localdomain sudo[73176]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:39 np0005538515.localdomain sudo[73192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aekekupenvixwojrupngljwvrderugru ; /usr/bin/python3
Nov 28 08:19:39 np0005538515.localdomain sudo[73192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:39 np0005538515.localdomain python3[73194]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:19:40 np0005538515.localdomain sudo[73192]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:40 np0005538515.localdomain sudo[73208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqlyuvlfvkepnoxbudwzjokdxmthorut ; /usr/bin/python3
Nov 28 08:19:40 np0005538515.localdomain sudo[73208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:19:40 np0005538515.localdomain python3[73210]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:19:40 np0005538515.localdomain sudo[73208]: pam_unix(sudo:session): session closed for user root
Nov 28 08:19:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:19:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:19:42 np0005538515.localdomain podman[73212]: 2025-11-28 08:19:42.996561718 +0000 UTC m=+0.100726218 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vcs-type=git)
Nov 28 08:19:43 np0005538515.localdomain podman[73213]: 2025-11-28 08:19:43.047695861 +0000 UTC m=+0.149212209 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Nov 28 08:19:43 np0005538515.localdomain podman[73213]: 2025-11-28 08:19:43.065742363 +0000 UTC m=+0.167258741 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, container_name=collectd)
Nov 28 08:19:43 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:19:43 np0005538515.localdomain podman[73212]: 2025-11-28 08:19:43.190937313 +0000 UTC m=+0.295101773 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:19:43 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:19:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:19:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:19:46 np0005538515.localdomain podman[73266]: 2025-11-28 08:19:46.000850702 +0000 UTC m=+0.100440710 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:19:46 np0005538515.localdomain podman[73266]: 2025-11-28 08:19:46.040565009 +0000 UTC m=+0.140154987 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, architecture=x86_64, container_name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:19:46 np0005538515.localdomain podman[73284]: 2025-11-28 08:19:46.098260077 +0000 UTC m=+0.086235868 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:19:46 np0005538515.localdomain podman[73284]: 2025-11-28 08:19:46.139473321 +0000 UTC m=+0.127449102 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:11:48Z, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:19:46 np0005538515.localdomain podman[73297]: 2025-11-28 08:19:46.186846987 +0000 UTC m=+0.095876379 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4)
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:19:46 np0005538515.localdomain podman[73297]: 2025-11-28 08:19:46.246737692 +0000 UTC m=+0.155767084 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:19:46 np0005538515.localdomain podman[73324]: 2025-11-28 08:19:46.261418209 +0000 UTC m=+0.092752950 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:19:46 np0005538515.localdomain podman[73324]: 2025-11-28 08:19:46.29450791 +0000 UTC m=+0.125842681 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:19:46 np0005538515.localdomain podman[73358]: 2025-11-28 08:19:46.974605166 +0000 UTC m=+0.084944788 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:19:46 np0005538515.localdomain systemd[1]: tmp-crun.hyy27G.mount: Deactivated successfully.
Nov 28 08:19:47 np0005538515.localdomain podman[73358]: 2025-11-28 08:19:47.335308452 +0000 UTC m=+0.445648044 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, 
release=1761123044, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:19:47 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:19:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:19:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:19:50 np0005538515.localdomain systemd[1]: tmp-crun.92I7Ag.mount: Deactivated successfully.
Nov 28 08:19:50 np0005538515.localdomain podman[73379]: 2025-11-28 08:19:50.980310526 +0000 UTC m=+0.089026255 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64)
Nov 28 08:19:51 np0005538515.localdomain podman[73380]: 2025-11-28 08:19:51.026088561 +0000 UTC m=+0.131086344 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:19:51 np0005538515.localdomain podman[73379]: 2025-11-28 08:19:51.051986728 +0000 UTC m=+0.160702517 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Nov 28 08:19:51 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:19:51 np0005538515.localdomain podman[73380]: 2025-11-28 08:19:51.097588699 +0000 UTC m=+0.202586502 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:19:51 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:19:59 np0005538515.localdomain snmpd[68067]: empty variable list in _query
Nov 28 08:19:59 np0005538515.localdomain snmpd[68067]: empty variable list in _query
Nov 28 08:20:07 np0005538515.localdomain sudo[73427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:20:07 np0005538515.localdomain sudo[73427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:20:07 np0005538515.localdomain sudo[73427]: pam_unix(sudo:session): session closed for user root
Nov 28 08:20:07 np0005538515.localdomain sudo[73442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:20:07 np0005538515.localdomain sudo[73442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:20:08 np0005538515.localdomain sudo[73442]: pam_unix(sudo:session): session closed for user root
Nov 28 08:20:08 np0005538515.localdomain sudo[73489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:20:08 np0005538515.localdomain sudo[73489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:20:08 np0005538515.localdomain sudo[73489]: pam_unix(sudo:session): session closed for user root
Nov 28 08:20:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:20:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:20:13 np0005538515.localdomain systemd[1]: tmp-crun.3kG97C.mount: Deactivated successfully.
Nov 28 08:20:13 np0005538515.localdomain podman[73505]: 2025-11-28 08:20:13.995231378 +0000 UTC m=+0.097983613 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd, release=1761123044, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container)
Nov 28 08:20:14 np0005538515.localdomain podman[73504]: 2025-11-28 08:20:14.037704041 +0000 UTC m=+0.141655553 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1)
Nov 28 08:20:14 np0005538515.localdomain podman[73505]: 2025-11-28 08:20:14.082592209 +0000 UTC m=+0.185344404 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Nov 28 08:20:14 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:20:14 np0005538515.localdomain podman[73504]: 2025-11-28 08:20:14.230715424 +0000 UTC m=+0.334666986 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=)
Nov 28 08:20:14 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:20:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:20:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:20:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:20:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:20:16 np0005538515.localdomain podman[73554]: 2025-11-28 08:20:16.982646148 +0000 UTC m=+0.089808449 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:20:17 np0005538515.localdomain podman[73554]: 2025-11-28 08:20:17.012749065 +0000 UTC m=+0.119911336 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, 
batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: tmp-crun.R4pZWB.mount: Deactivated successfully.
Nov 28 08:20:17 np0005538515.localdomain podman[73553]: 2025-11-28 08:20:17.088493534 +0000 UTC m=+0.196675137 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true)
Nov 28 08:20:17 np0005538515.localdomain podman[73553]: 2025-11-28 08:20:17.098509596 +0000 UTC m=+0.206691189 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: tmp-crun.DGAVgo.mount: Deactivated successfully.
Nov 28 08:20:17 np0005538515.localdomain podman[73556]: 2025-11-28 08:20:17.142945761 +0000 UTC m=+0.245241401 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:11:48Z)
Nov 28 08:20:17 np0005538515.localdomain podman[73555]: 2025-11-28 08:20:17.198676147 +0000 UTC m=+0.300821862 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3)
Nov 28 08:20:17 np0005538515.localdomain podman[73556]: 2025-11-28 08:20:17.222854419 +0000 UTC m=+0.325150099 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:20:17 np0005538515.localdomain podman[73555]: 2025-11-28 08:20:17.236859856 +0000 UTC m=+0.339005621 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, version=17.1.12, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:20:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:20:17 np0005538515.localdomain podman[73642]: 2025-11-28 08:20:17.965633418 +0000 UTC m=+0.076509645 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target)
Nov 28 08:20:18 np0005538515.localdomain podman[73642]: 2025-11-28 08:20:18.327630264 +0000 UTC m=+0.438506501 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:20:18 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:20:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:20:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:20:21 np0005538515.localdomain systemd[1]: tmp-crun.9aKQG3.mount: Deactivated successfully.
Nov 28 08:20:21 np0005538515.localdomain podman[73667]: 2025-11-28 08:20:21.95701728 +0000 UTC m=+0.065172950 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.openshift.expose-services=)
Nov 28 08:20:22 np0005538515.localdomain podman[73666]: 2025-11-28 08:20:22.018177016 +0000 UTC m=+0.126117860 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, release=1761123044, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, architecture=x86_64, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z)
Nov 28 08:20:22 np0005538515.localdomain podman[73666]: 2025-11-28 08:20:22.041331447 +0000 UTC m=+0.149272261 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:20:22 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:20:22 np0005538515.localdomain podman[73667]: 2025-11-28 08:20:22.071195587 +0000 UTC m=+0.179351327 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Nov 28 08:20:22 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:20:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:20:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:20:44 np0005538515.localdomain podman[73715]: 2025-11-28 08:20:44.974780843 +0000 UTC m=+0.085721722 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, 
com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:20:45 np0005538515.localdomain podman[73716]: 2025-11-28 08:20:45.038812418 +0000 UTC m=+0.141392066 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, architecture=x86_64, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:20:45 np0005538515.localdomain podman[73716]: 2025-11-28 08:20:45.047869609 +0000 UTC m=+0.150449247 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12)
Nov 28 08:20:45 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:20:45 np0005538515.localdomain podman[73715]: 2025-11-28 08:20:45.1785279 +0000 UTC m=+0.289468759 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:20:45 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:20:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:20:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:20:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:20:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:20:47 np0005538515.localdomain systemd[1]: tmp-crun.p5QKc5.mount: Deactivated successfully.
Nov 28 08:20:47 np0005538515.localdomain podman[73765]: 2025-11-28 08:20:47.981587956 +0000 UTC m=+0.089009684 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:20:48 np0005538515.localdomain podman[73765]: 2025-11-28 08:20:48.031679896 +0000 UTC m=+0.139101644 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Nov 28 08:20:48 np0005538515.localdomain podman[73766]: 2025-11-28 08:20:48.040876903 +0000 UTC m=+0.142943984 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, vcs-type=git, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:20:48 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:20:48 np0005538515.localdomain podman[73766]: 2025-11-28 08:20:48.078358331 +0000 UTC m=+0.180425392 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=)
Nov 28 08:20:48 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:20:48 np0005538515.localdomain podman[73767]: 2025-11-28 08:20:48.091767458 +0000 UTC m=+0.192867528 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, distribution-scope=public, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Nov 28 08:20:48 np0005538515.localdomain podman[73764]: 2025-11-28 08:20:48.143892842 +0000 UTC m=+0.252645621 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.12, io.openshift.expose-services=)
Nov 28 08:20:48 np0005538515.localdomain podman[73764]: 2025-11-28 08:20:48.154323527 +0000 UTC m=+0.263076296 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, version=17.1.12, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, architecture=x86_64)
Nov 28 08:20:48 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:20:48 np0005538515.localdomain podman[73767]: 2025-11-28 08:20:48.199104302 +0000 UTC m=+0.300204322 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:20:48 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:20:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:20:48 np0005538515.localdomain podman[73853]: 2025-11-28 08:20:48.960694605 +0000 UTC m=+0.072825089 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z)
Nov 28 08:20:49 np0005538515.localdomain podman[73853]: 2025-11-28 08:20:49.349533317 +0000 UTC m=+0.461663831 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, container_name=nova_migration_target, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:20:49 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:20:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:20:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:20:52 np0005538515.localdomain podman[73878]: 2025-11-28 08:20:52.971428121 +0000 UTC m=+0.075092029 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:20:53 np0005538515.localdomain podman[73877]: 2025-11-28 08:20:53.032926227 +0000 UTC m=+0.138269268 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:20:53 np0005538515.localdomain podman[73878]: 2025-11-28 08:20:53.042529877 +0000 UTC m=+0.146193755 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 28 08:20:53 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:20:53 np0005538515.localdomain podman[73877]: 2025-11-28 08:20:53.060472995 +0000 UTC m=+0.165816026 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1)
Nov 28 08:20:53 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:21:08 np0005538515.localdomain sudo[73925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:21:08 np0005538515.localdomain sudo[73925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:08 np0005538515.localdomain sudo[73925]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:08 np0005538515.localdomain sudo[73940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:21:08 np0005538515.localdomain sudo[73940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:09 np0005538515.localdomain podman[74024]: 2025-11-28 08:21:09.796673922 +0000 UTC m=+0.088926201 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 08:21:09 np0005538515.localdomain podman[74024]: 2025-11-28 08:21:09.88200234 +0000 UTC m=+0.174254669 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12)
Nov 28 08:21:10 np0005538515.localdomain sudo[73940]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:10 np0005538515.localdomain sudo[74092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:21:10 np0005538515.localdomain sudo[74092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:10 np0005538515.localdomain sudo[74092]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:10 np0005538515.localdomain sudo[74107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:21:10 np0005538515.localdomain sudo[74107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:10 np0005538515.localdomain sudo[74107]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:11 np0005538515.localdomain sudo[74155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:21:11 np0005538515.localdomain sudo[74155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:21:11 np0005538515.localdomain sudo[74155]: pam_unix(sudo:session): session closed for user root
Nov 28 08:21:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:21:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:21:15 np0005538515.localdomain podman[74171]: 2025-11-28 08:21:15.980966067 +0000 UTC m=+0.085065631 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:51:28Z)
Nov 28 08:21:16 np0005538515.localdomain podman[74170]: 2025-11-28 08:21:16.042459822 +0000 UTC m=+0.148348692 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64)
Nov 28 08:21:16 np0005538515.localdomain podman[74171]: 2025-11-28 08:21:16.065867051 +0000 UTC m=+0.169966605 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Nov 28 08:21:16 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:21:16 np0005538515.localdomain podman[74170]: 2025-11-28 08:21:16.243510875 +0000 UTC m=+0.349399765 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 28 08:21:16 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:21:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:21:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:21:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:21:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:21:18 np0005538515.localdomain systemd[1]: tmp-crun.QyUHX7.mount: Deactivated successfully.
Nov 28 08:21:18 np0005538515.localdomain podman[74221]: 2025-11-28 08:21:18.983334311 +0000 UTC m=+0.089932123 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:21:19 np0005538515.localdomain podman[74223]: 2025-11-28 08:21:19.04077745 +0000 UTC m=+0.141258511 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044)
Nov 28 08:21:19 np0005538515.localdomain podman[74221]: 2025-11-28 08:21:19.064712846 +0000 UTC m=+0.171310618 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git)
Nov 28 08:21:19 np0005538515.localdomain podman[74223]: 2025-11-28 08:21:19.075240973 +0000 UTC m=+0.175722024 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team)
Nov 28 08:21:19 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:21:19 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:21:19 np0005538515.localdomain podman[74222]: 2025-11-28 08:21:19.147007779 +0000 UTC m=+0.249584346 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team)
Nov 28 08:21:19 np0005538515.localdomain podman[74222]: 2025-11-28 08:21:19.185597581 +0000 UTC m=+0.288174098 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, vcs-type=git, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:21:19 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:21:19 np0005538515.localdomain podman[74220]: 2025-11-28 08:21:19.190791583 +0000 UTC m=+0.300545474 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64)
Nov 28 08:21:19 np0005538515.localdomain podman[74220]: 2025-11-28 08:21:19.274702897 +0000 UTC m=+0.384456798 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:21:19 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:21:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:21:19 np0005538515.localdomain podman[74308]: 2025-11-28 08:21:19.985097516 +0000 UTC m=+0.089046375 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:21:20 np0005538515.localdomain podman[74308]: 2025-11-28 08:21:20.347935319 +0000 UTC m=+0.451884188 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:21:20 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:21:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:21:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:21:23 np0005538515.localdomain podman[74333]: 2025-11-28 08:21:23.976633803 +0000 UTC m=+0.082718647 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64)
Nov 28 08:21:24 np0005538515.localdomain podman[74333]: 2025-11-28 08:21:24.031308557 +0000 UTC m=+0.137393401 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Nov 28 08:21:24 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:21:24 np0005538515.localdomain podman[74332]: 2025-11-28 08:21:24.035357932 +0000 UTC m=+0.143531131 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, release=1761123044, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:21:24 np0005538515.localdomain podman[74332]: 2025-11-28 08:21:24.116562042 +0000 UTC m=+0.224735181 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
name=rhosp17/openstack-ovn-controller, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4)
Nov 28 08:21:24 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:21:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:21:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:21:46 np0005538515.localdomain systemd[1]: tmp-crun.v6Entt.mount: Deactivated successfully.
Nov 28 08:21:46 np0005538515.localdomain podman[74380]: 2025-11-28 08:21:46.970306245 +0000 UTC m=+0.079262450 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:21:46 np0005538515.localdomain systemd[1]: tmp-crun.AjbVmg.mount: Deactivated successfully.
Nov 28 08:21:46 np0005538515.localdomain podman[74381]: 2025-11-28 08:21:46.987426008 +0000 UTC m=+0.088811828 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:21:47 np0005538515.localdomain podman[74381]: 2025-11-28 08:21:46.999561035 +0000 UTC m=+0.100946925 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:21:47 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:21:47 np0005538515.localdomain podman[74380]: 2025-11-28 08:21:47.161791859 +0000 UTC m=+0.270748094 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_id=tripleo_step1, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=)
Nov 28 08:21:47 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:21:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:21:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:21:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:21:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:21:49 np0005538515.localdomain podman[74430]: 2025-11-28 08:21:49.99610523 +0000 UTC m=+0.093836883 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Nov 28 08:21:50 np0005538515.localdomain podman[74427]: 2025-11-28 08:21:50.027741705 +0000 UTC m=+0.136806171 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:21:50 np0005538515.localdomain podman[74427]: 2025-11-28 08:21:50.039438281 +0000 UTC m=+0.148502757 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, vcs-type=git)
Nov 28 08:21:50 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:21:50 np0005538515.localdomain podman[74428]: 2025-11-28 08:21:50.090384367 +0000 UTC m=+0.193963732 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team)
Nov 28 08:21:50 np0005538515.localdomain podman[74428]: 2025-11-28 08:21:50.121509847 +0000 UTC m=+0.225089252 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:21:50 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:21:50 np0005538515.localdomain podman[74429]: 2025-11-28 08:21:50.140721955 +0000 UTC m=+0.242210046 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-iscsid)
Nov 28 08:21:50 np0005538515.localdomain podman[74429]: 2025-11-28 08:21:50.152389399 +0000 UTC m=+0.253877530 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 28 08:21:50 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:21:50 np0005538515.localdomain podman[74430]: 2025-11-28 08:21:50.20861505 +0000 UTC m=+0.306346633 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:21:50 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:21:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:21:50 np0005538515.localdomain podman[74520]: 2025-11-28 08:21:50.958882341 +0000 UTC m=+0.074256624 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:21:51 np0005538515.localdomain podman[74520]: 2025-11-28 08:21:51.324568952 +0000 UTC m=+0.439943235 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, 
url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:21:51 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:21:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:21:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:21:54 np0005538515.localdomain systemd[1]: tmp-crun.IIkpyX.mount: Deactivated successfully.
Nov 28 08:21:54 np0005538515.localdomain podman[74544]: 2025-11-28 08:21:54.97486268 +0000 UTC m=+0.083652236 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:21:55 np0005538515.localdomain systemd[1]: tmp-crun.Ub2HqM.mount: Deactivated successfully.
Nov 28 08:21:55 np0005538515.localdomain podman[74543]: 2025-11-28 08:21:55.03134418 +0000 UTC m=+0.140956052 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-18T23:34:05Z, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:21:55 np0005538515.localdomain podman[74544]: 2025-11-28 08:21:55.040540236 +0000 UTC m=+0.149329762 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:21:55 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:21:55 np0005538515.localdomain podman[74543]: 2025-11-28 08:21:55.057229686 +0000 UTC m=+0.166841568 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:21:55 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:22:11 np0005538515.localdomain sudo[74589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:22:11 np0005538515.localdomain sudo[74589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:22:11 np0005538515.localdomain sudo[74589]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:11 np0005538515.localdomain sudo[74604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:22:11 np0005538515.localdomain sudo[74604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:22:12 np0005538515.localdomain sudo[74604]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:13 np0005538515.localdomain sudo[74650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:22:13 np0005538515.localdomain sudo[74650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:22:13 np0005538515.localdomain sudo[74650]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:22:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:22:17 np0005538515.localdomain podman[74665]: 2025-11-28 08:22:17.983730155 +0000 UTC m=+0.085152404 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64)
Nov 28 08:22:18 np0005538515.localdomain podman[74666]: 2025-11-28 08:22:18.043458796 +0000 UTC m=+0.141794019 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:22:18 np0005538515.localdomain podman[74666]: 2025-11-28 08:22:18.056570654 +0000 UTC m=+0.154905917 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Nov 28 08:22:18 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:22:18 np0005538515.localdomain podman[74665]: 2025-11-28 08:22:18.241638569 +0000 UTC m=+0.343060838 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:22:18 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:22:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:22:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:22:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:22:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:22:20 np0005538515.localdomain podman[74715]: 2025-11-28 08:22:20.983351774 +0000 UTC m=+0.087463536 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:22:21 np0005538515.localdomain podman[74716]: 2025-11-28 08:22:21.02912544 +0000 UTC m=+0.131755375 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, version=17.1.12, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, distribution-scope=public)
Nov 28 08:22:21 np0005538515.localdomain podman[74716]: 2025-11-28 08:22:21.0403525 +0000 UTC m=+0.142982395 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=iscsid, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:22:21 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:22:21 np0005538515.localdomain podman[74717]: 2025-11-28 08:22:21.083912096 +0000 UTC m=+0.182028251 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:21 np0005538515.localdomain podman[74714]: 2025-11-28 08:22:21.131132137 +0000 UTC m=+0.234600868 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.buildah.version=1.41.4)
Nov 28 08:22:21 np0005538515.localdomain podman[74715]: 2025-11-28 08:22:21.160129091 +0000 UTC m=+0.264240873 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 28 08:22:21 np0005538515.localdomain podman[74714]: 2025-11-28 08:22:21.167572272 +0000 UTC m=+0.271041023 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container)
Nov 28 08:22:21 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:22:21 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:22:21 np0005538515.localdomain podman[74717]: 2025-11-28 08:22:21.214419822 +0000 UTC m=+0.312535957 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Nov 28 08:22:21 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:22:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:22:21 np0005538515.localdomain podman[74803]: 2025-11-28 08:22:21.969995419 +0000 UTC m=+0.078981471 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:22:22 np0005538515.localdomain podman[74803]: 2025-11-28 08:22:22.340326915 +0000 UTC m=+0.449312917 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:22 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:22:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:22:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:22:25 np0005538515.localdomain systemd[1]: tmp-crun.e5H93g.mount: Deactivated successfully.
Nov 28 08:22:25 np0005538515.localdomain podman[74829]: 2025-11-28 08:22:25.985763971 +0000 UTC m=+0.088911731 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:22:26 np0005538515.localdomain podman[74829]: 2025-11-28 08:22:26.031234147 +0000 UTC m=+0.134381907 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:22:26 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:22:26 np0005538515.localdomain podman[74828]: 2025-11-28 08:22:26.034152038 +0000 UTC m=+0.138381971 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller)
Nov 28 08:22:26 np0005538515.localdomain podman[74828]: 2025-11-28 08:22:26.116353379 +0000 UTC m=+0.220583242 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ovn_controller, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:22:26 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:22:26 np0005538515.localdomain sudo[74922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oucceggtefjdcfeqzxokdodqgwzbjwlu ; /usr/bin/python3
Nov 28 08:22:26 np0005538515.localdomain sudo[74922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:26 np0005538515.localdomain python3[74924]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:26 np0005538515.localdomain sudo[74922]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:27 np0005538515.localdomain sudo[74967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifdfpnokztvitndlotdmvjanxrzsokpa ; /usr/bin/python3
Nov 28 08:22:27 np0005538515.localdomain sudo[74967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:27 np0005538515.localdomain python3[74969]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318146.5621905-114299-3191084048076/source _original_basename=tmpy_4008sd follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:27 np0005538515.localdomain sudo[74967]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:28 np0005538515.localdomain sudo[74997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlrrcpmhmgmivhkfdaqwfhuegxduejps ; /usr/bin/python3
Nov 28 08:22:28 np0005538515.localdomain sudo[74997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:28 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:22:28 np0005538515.localdomain recover_tripleo_nova_virtqemud[75001]: 62642
Nov 28 08:22:28 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:22:28 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:22:28 np0005538515.localdomain python3[74999]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:22:28 np0005538515.localdomain sudo[74997]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:28 np0005538515.localdomain sudo[75049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxnknnuymroimbetzlgguegcspjfffbg ; /usr/bin/python3
Nov 28 08:22:28 np0005538515.localdomain sudo[75049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:28 np0005538515.localdomain sudo[75049]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:28 np0005538515.localdomain sudo[75067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxkqqzoazimlzzxjsgdqafascjrzpjal ; /usr/bin/python3
Nov 28 08:22:28 np0005538515.localdomain sudo[75067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:29 np0005538515.localdomain sudo[75067]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:29 np0005538515.localdomain sudo[75171]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myrfaopprksmfglmyiegfqptfaxwtpum ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3304214-114685-167087183875994/async_wrapper.py 413905860756 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3304214-114685-167087183875994/AnsiballZ_command.py _
Nov 28 08:22:29 np0005538515.localdomain sudo[75171]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 08:22:29 np0005538515.localdomain ansible-async_wrapper.py[75173]: Invoked with 413905860756 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3304214-114685-167087183875994/AnsiballZ_command.py _
Nov 28 08:22:29 np0005538515.localdomain ansible-async_wrapper.py[75176]: Starting module and watcher
Nov 28 08:22:29 np0005538515.localdomain ansible-async_wrapper.py[75176]: Start watching 75177 (3600)
Nov 28 08:22:29 np0005538515.localdomain ansible-async_wrapper.py[75177]: Start module (75177)
Nov 28 08:22:29 np0005538515.localdomain ansible-async_wrapper.py[75173]: Return async_wrapper task started.
Nov 28 08:22:29 np0005538515.localdomain sudo[75171]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:30 np0005538515.localdomain sudo[75192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqmdugoecezufanthwbahorwvfrptbey ; /usr/bin/python3
Nov 28 08:22:30 np0005538515.localdomain sudo[75192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:30 np0005538515.localdomain python3[75197]: ansible-ansible.legacy.async_status Invoked with jid=413905860756.75173 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:22:30 np0005538515.localdomain sudo[75192]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (file: /etc/puppet/hiera.yaml)
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: Undefined variable '::deploy_config_name';
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (file & line not available)
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (file & line not available)
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 08:22:33 np0005538515.localdomain puppet-user[75196]: Notice: Compiled catalog for np0005538515.localdomain in environment production in 0.20 seconds
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Notice: Applied catalog in 0.25 seconds
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Application:
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:    Initial environment: production
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:    Converged environment: production
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:          Run mode: user
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Changes:
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Events:
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Resources:
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:             Total: 19
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Time:
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:           Package: 0.00
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:          Schedule: 0.00
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:              Exec: 0.01
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:            Augeas: 0.01
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:              File: 0.02
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:           Service: 0.07
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:    Transaction evaluation: 0.24
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:    Catalog application: 0.25
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:    Config retrieval: 0.26
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:          Last run: 1764318154
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:        Filebucket: 0.00
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:             Total: 0.25
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]: Version:
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:            Config: 1764318153
Nov 28 08:22:34 np0005538515.localdomain puppet-user[75196]:            Puppet: 7.10.0
Nov 28 08:22:34 np0005538515.localdomain ansible-async_wrapper.py[75177]: Module complete (75177)
Nov 28 08:22:34 np0005538515.localdomain ansible-async_wrapper.py[75176]: Done in kid B.
Nov 28 08:22:40 np0005538515.localdomain sudo[75333]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atyxssxcapejftrtfimvkylbktlbhkmz ; /usr/bin/python3
Nov 28 08:22:40 np0005538515.localdomain sudo[75333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:40 np0005538515.localdomain python3[75335]: ansible-ansible.legacy.async_status Invoked with jid=413905860756.75173 mode=status _async_dir=/tmp/.ansible_async
Nov 28 08:22:40 np0005538515.localdomain sudo[75333]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:41 np0005538515.localdomain sudo[75349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yytzeuufheuevvyfvpxadokcjonlnsjn ; /usr/bin/python3
Nov 28 08:22:41 np0005538515.localdomain sudo[75349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:41 np0005538515.localdomain python3[75351]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:22:41 np0005538515.localdomain sudo[75349]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:41 np0005538515.localdomain sudo[75365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjkktntgazrsuuvxartqpudylansxqgc ; /usr/bin/python3
Nov 28 08:22:41 np0005538515.localdomain sudo[75365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:41 np0005538515.localdomain python3[75367]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:22:41 np0005538515.localdomain sudo[75365]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:42 np0005538515.localdomain sudo[75415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfoocffdtjxfztokskuwoxfuehppnppi ; /usr/bin/python3
Nov 28 08:22:42 np0005538515.localdomain sudo[75415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:42 np0005538515.localdomain python3[75417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:42 np0005538515.localdomain sudo[75415]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:42 np0005538515.localdomain sudo[75433]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lktvycnyezctuuxpsuwvvbtixgznlbfa ; /usr/bin/python3
Nov 28 08:22:42 np0005538515.localdomain sudo[75433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:42 np0005538515.localdomain python3[75435]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpjcsc8r6m recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 08:22:42 np0005538515.localdomain sudo[75433]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:42 np0005538515.localdomain sudo[75463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfrqxrserblrexqyckjkeyskezlrohoc ; /usr/bin/python3
Nov 28 08:22:42 np0005538515.localdomain sudo[75463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:42 np0005538515.localdomain python3[75465]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:42 np0005538515.localdomain sudo[75463]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:43 np0005538515.localdomain sudo[75479]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uiruyiktrtznmtfiynviykttihcmfrdg ; /usr/bin/python3
Nov 28 08:22:43 np0005538515.localdomain sudo[75479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:43 np0005538515.localdomain sudo[75479]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:43 np0005538515.localdomain sudo[75568]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhvufkecbbwqqhakfrjjxihortharhun ; /usr/bin/python3
Nov 28 08:22:43 np0005538515.localdomain sudo[75568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:44 np0005538515.localdomain python3[75570]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 08:22:44 np0005538515.localdomain sudo[75568]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:44 np0005538515.localdomain sudo[75587]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjcnleylzwblkentkiqcdqqnrianqhoa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:44 np0005538515.localdomain sudo[75587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:44 np0005538515.localdomain python3[75589]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:44 np0005538515.localdomain sudo[75587]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:45 np0005538515.localdomain sudo[75603]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljkmzgdsnszmvsenuhpkrwcbalhmumgg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:45 np0005538515.localdomain sudo[75603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:45 np0005538515.localdomain sudo[75603]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:45 np0005538515.localdomain sudo[75619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocxwohtpuwprrvkqpokgyapcwmkbeiyc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:45 np0005538515.localdomain sudo[75619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:45 np0005538515.localdomain python3[75621]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:22:45 np0005538515.localdomain sudo[75619]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:46 np0005538515.localdomain sudo[75669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssjhljdxbrfzcadibsafxpiklzjyiucj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:46 np0005538515.localdomain sudo[75669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:46 np0005538515.localdomain python3[75671]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:46 np0005538515.localdomain sudo[75669]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:46 np0005538515.localdomain sudo[75687]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbqjxwyjaecpytekztdpnyhqysagolhj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:46 np0005538515.localdomain sudo[75687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:46 np0005538515.localdomain python3[75689]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:46 np0005538515.localdomain sudo[75687]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:46 np0005538515.localdomain sudo[75749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njnafbtapgwwkaqhvvgnmsxfqikvpqdd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:46 np0005538515.localdomain sudo[75749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:47 np0005538515.localdomain python3[75751]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:47 np0005538515.localdomain sudo[75749]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:47 np0005538515.localdomain sudo[75767]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbjmfjkhqhieiuyotzxlritojpepffwo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:47 np0005538515.localdomain sudo[75767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:47 np0005538515.localdomain python3[75769]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:47 np0005538515.localdomain sudo[75767]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:47 np0005538515.localdomain sudo[75829]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugsjgosdvvygrykaowrcjqpqjxuovamv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:47 np0005538515.localdomain sudo[75829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:47 np0005538515.localdomain python3[75831]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:47 np0005538515.localdomain sudo[75829]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538515.localdomain sudo[75847]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amjeuxjzrtpvacwfgdictlpkgoysixbh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:48 np0005538515.localdomain sudo[75847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:48 np0005538515.localdomain python3[75849]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:48 np0005538515.localdomain sudo[75847]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538515.localdomain sudo[75909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgmfwyaddihuekptingclmnysrndcvqu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:48 np0005538515.localdomain sudo[75909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:22:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:22:48 np0005538515.localdomain sshd[75933]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:22:48 np0005538515.localdomain podman[75913]: 2025-11-28 08:22:48.662684027 +0000 UTC m=+0.098804609 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044)
Nov 28 08:22:48 np0005538515.localdomain podman[75913]: 2025-11-28 08:22:48.684350402 +0000 UTC m=+0.120471034 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:22:48 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:22:48 np0005538515.localdomain python3[75912]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:48 np0005538515.localdomain sudo[75909]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538515.localdomain podman[75911]: 2025-11-28 08:22:48.760137082 +0000 UTC m=+0.196463950 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, config_id=tripleo_step1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, 
maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044)
Nov 28 08:22:48 np0005538515.localdomain sudo[75977]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gumhywyzeejnqzovkgzmukwnyvssmioy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:48 np0005538515.localdomain sudo[75977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:48 np0005538515.localdomain python3[75979]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:48 np0005538515.localdomain sudo[75977]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:48 np0005538515.localdomain podman[75911]: 2025-11-28 08:22:48.965471358 +0000 UTC m=+0.401798236 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true)
Nov 28 08:22:48 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:22:49 np0005538515.localdomain sudo[76008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtrxxnhqrjremawythhmqwawaflqnkfg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:49 np0005538515.localdomain sudo[76008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:49 np0005538515.localdomain python3[76010]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:22:49 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:22:49 np0005538515.localdomain systemd-rc-local-generator[76033]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:22:49 np0005538515.localdomain systemd-sysv-generator[76038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:22:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:22:49 np0005538515.localdomain sshd[75933]: Invalid user minima from 80.94.92.186 port 46792
Nov 28 08:22:49 np0005538515.localdomain sudo[76008]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:50 np0005538515.localdomain sshd[75933]: Connection closed by invalid user minima 80.94.92.186 port 46792 [preauth]
Nov 28 08:22:50 np0005538515.localdomain sudo[76094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuiteaedqnwxruryzmjjobzyrxcglgxo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:50 np0005538515.localdomain sudo[76094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:50 np0005538515.localdomain python3[76096]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:50 np0005538515.localdomain sudo[76094]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:50 np0005538515.localdomain sudo[76112]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovrdbicvdrafvserdabihujuqjhgxnpd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:50 np0005538515.localdomain sudo[76112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:50 np0005538515.localdomain python3[76114]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:50 np0005538515.localdomain sudo[76112]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:50 np0005538515.localdomain sudo[76174]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnmpegmcxpozqrdcdbacobygwscpiwig ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:50 np0005538515.localdomain sudo[76174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:51 np0005538515.localdomain python3[76176]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 08:22:51 np0005538515.localdomain sudo[76174]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:51 np0005538515.localdomain sudo[76192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzkfcugsrlyfkrwhuetszvceexicdvds ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:51 np0005538515.localdomain sudo[76192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: tmp-crun.RXpWM4.mount: Deactivated successfully.
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: tmp-crun.Wck0jB.mount: Deactivated successfully.
Nov 28 08:22:51 np0005538515.localdomain podman[76197]: 2025-11-28 08:22:51.369242146 +0000 UTC m=+0.112633169 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, container_name=iscsid)
Nov 28 08:22:51 np0005538515.localdomain podman[76196]: 2025-11-28 08:22:51.324210704 +0000 UTC m=+0.067681750 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:22:51 np0005538515.localdomain podman[76198]: 2025-11-28 08:22:51.347425296 +0000 UTC m=+0.081934213 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, 
maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:22:51 np0005538515.localdomain podman[76196]: 2025-11-28 08:22:51.405517626 +0000 UTC m=+0.148988742 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:22:51 np0005538515.localdomain python3[76194]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:22:51 np0005538515.localdomain sudo[76192]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:51 np0005538515.localdomain podman[76198]: 2025-11-28 08:22:51.427949115 +0000 UTC m=+0.162458022 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, container_name=ceilometer_agent_compute, config_id=tripleo_step4)
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:22:51 np0005538515.localdomain podman[76195]: 2025-11-28 08:22:51.479186041 +0000 UTC m=+0.223292216 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Nov 28 08:22:51 np0005538515.localdomain podman[76195]: 2025-11-28 08:22:51.514350057 +0000 UTC m=+0.258456272 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:22:51 np0005538515.localdomain podman[76197]: 2025-11-28 08:22:51.532105489 +0000 UTC m=+0.275496502 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:22:51 np0005538515.localdomain sudo[76310]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvtknddchoeofqcbovlngiolnksxkkgp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:51 np0005538515.localdomain sudo[76310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:51 np0005538515.localdomain python3[76312]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:22:51 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:22:52 np0005538515.localdomain systemd-sysv-generator[76340]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:22:52 np0005538515.localdomain systemd-rc-local-generator[76336]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:22:52 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:22:52 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 08:22:52 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 08:22:52 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 08:22:52 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 08:22:52 np0005538515.localdomain sudo[76310]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:52 np0005538515.localdomain sudo[76367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yieoosqgqrwihbkvoqtspktdfnvjowox ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:52 np0005538515.localdomain sudo[76367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:22:52 np0005538515.localdomain podman[76370]: 2025-11-28 08:22:52.773028125 +0000 UTC m=+0.102775892 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com)
Nov 28 08:22:52 np0005538515.localdomain python3[76369]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 08:22:52 np0005538515.localdomain sudo[76367]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:53 np0005538515.localdomain sudo[76406]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfgtdxfsrejkqjorqyafmonrqynnndxw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:53 np0005538515.localdomain sudo[76406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:53 np0005538515.localdomain podman[76370]: 2025-11-28 08:22:53.106485122 +0000 UTC m=+0.436232919 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:22:53 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:22:53 np0005538515.localdomain systemd[1]: tmp-crun.vqtUNR.mount: Deactivated successfully.
Nov 28 08:22:53 np0005538515.localdomain sudo[76406]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:54 np0005538515.localdomain sudo[76449]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voeuwznbdethakorskdziimzjrkurrot ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:22:54 np0005538515.localdomain sudo[76449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:22:54 np0005538515.localdomain python3[76451]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 08:22:55 np0005538515.localdomain podman[76491]: 2025-11-28 08:22:55.19115603 +0000 UTC m=+0.068663319 container create ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started libpod-conmon-ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.scope.
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5442f5016f7f6fcccf64f4788496955298bf5e3ac09b77bd37d30b08717aca4a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5442f5016f7f6fcccf64f4788496955298bf5e3ac09b77bd37d30b08717aca4a/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5442f5016f7f6fcccf64f4788496955298bf5e3ac09b77bd37d30b08717aca4a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5442f5016f7f6fcccf64f4788496955298bf5e3ac09b77bd37d30b08717aca4a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5442f5016f7f6fcccf64f4788496955298bf5e3ac09b77bd37d30b08717aca4a/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:22:55 np0005538515.localdomain podman[76491]: 2025-11-28 08:22:55.26173944 +0000 UTC m=+0.139246739 container init ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:55 np0005538515.localdomain podman[76491]: 2025-11-28 08:22:55.163789218 +0000 UTC m=+0.041296517 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: tmp-crun.QAMzgk.mount: Deactivated successfully.
Nov 28 08:22:55 np0005538515.localdomain sudo[76512]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:22:55 np0005538515.localdomain podman[76491]: 2025-11-28 08:22:55.29834613 +0000 UTC m=+0.175853449 container start ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:22:55 np0005538515.localdomain systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring.
Nov 28 08:22:55 np0005538515.localdomain python3[76451]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 08:22:55 np0005538515.localdomain podman[76513]: 2025-11-28 08:22:55.422199118 +0000 UTC m=+0.113692343 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:22:55 np0005538515.localdomain podman[76513]: 2025-11-28 08:22:55.485428498 +0000 UTC m=+0.176921683 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044)
Nov 28 08:22:55 np0005538515.localdomain podman[76513]: unhealthy
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Queued start job for default target Main User Target.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Created slice User Application Slice.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Reached target Paths.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Reached target Timers.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Starting D-Bus User Message Bus Socket...
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Starting Create User's Volatile Files and Directories...
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Listening on D-Bus User Message Bus Socket.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Reached target Sockets.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Finished Create User's Volatile Files and Directories.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Reached target Basic System.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Reached target Main User Target.
Nov 28 08:22:55 np0005538515.localdomain systemd[76533]: Startup finished in 127ms.
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started Session c10 of User root.
Nov 28 08:22:55 np0005538515.localdomain sudo[76512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 28 08:22:55 np0005538515.localdomain sudo[76512]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Nov 28 08:22:55 np0005538515.localdomain podman[76616]: 2025-11-28 08:22:55.819839044 +0000 UTC m=+0.074138461 container create 2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, 
vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team)
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started libpod-conmon-2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b.scope.
Nov 28 08:22:55 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3584b97526356ef5b6642175730f52565e80737460bbd74a6e31729b79699070/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3584b97526356ef5b6642175730f52565e80737460bbd74a6e31729b79699070/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 08:22:55 np0005538515.localdomain podman[76616]: 2025-11-28 08:22:55.782007255 +0000 UTC m=+0.036306712 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:22:55 np0005538515.localdomain podman[76616]: 2025-11-28 08:22:55.885614083 +0000 UTC m=+0.139913490 container init 2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_wait_for_compute_service, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:22:55 np0005538515.localdomain podman[76616]: 2025-11-28 08:22:55.895012936 +0000 UTC m=+0.149312373 container start 2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:22:55 np0005538515.localdomain podman[76616]: 2025-11-28 08:22:55.895319426 +0000 UTC m=+0.149618873 container attach 2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=nova_wait_for_compute_service, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, release=1761123044, version=17.1.12)
Nov 28 08:22:55 np0005538515.localdomain sudo[76636]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 08:22:55 np0005538515.localdomain sudo[76636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Nov 28 08:22:55 np0005538515.localdomain sudo[76636]: pam_unix(sudo:session): session closed for user root
Nov 28 08:22:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:22:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:22:56 np0005538515.localdomain podman[76640]: 2025-11-28 08:22:56.229881877 +0000 UTC m=+0.088053414 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044)
Nov 28 08:22:56 np0005538515.localdomain podman[76641]: 2025-11-28 08:22:56.277705457 +0000 UTC m=+0.132088876 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 28 08:22:56 np0005538515.localdomain podman[76640]: 2025-11-28 08:22:56.283347022 +0000 UTC m=+0.141518599 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com)
Nov 28 08:22:56 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:22:56 np0005538515.localdomain podman[76641]: 2025-11-28 08:22:56.302395716 +0000 UTC m=+0.156779115 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4)
Nov 28 08:22:56 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Activating special unit Exit the Session...
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped target Main User Target.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped target Basic System.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped target Paths.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped target Sockets.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped target Timers.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Closed D-Bus User Message Bus Socket.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Removed slice User Application Slice.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Reached target Shutdown.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Finished Exit the Session.
Nov 28 08:23:05 np0005538515.localdomain systemd[76533]: Reached target Exit the Session.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 08:23:05 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 08:23:13 np0005538515.localdomain sudo[76687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:23:13 np0005538515.localdomain sudo[76687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:23:13 np0005538515.localdomain sudo[76687]: pam_unix(sudo:session): session closed for user root
Nov 28 08:23:13 np0005538515.localdomain sudo[76702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:23:13 np0005538515.localdomain sudo[76702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:23:13 np0005538515.localdomain sudo[76702]: pam_unix(sudo:session): session closed for user root
Nov 28 08:23:14 np0005538515.localdomain sudo[76749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:23:14 np0005538515.localdomain sudo[76749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:23:14 np0005538515.localdomain sudo[76749]: pam_unix(sudo:session): session closed for user root
Nov 28 08:23:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:23:18 np0005538515.localdomain podman[76764]: 2025-11-28 08:23:18.972347044 +0000 UTC m=+0.078560359 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044)
Nov 28 08:23:18 np0005538515.localdomain podman[76764]: 2025-11-28 08:23:18.987370442 +0000 UTC m=+0.093583757 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:23:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:23:19 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:23:19 np0005538515.localdomain podman[76785]: 2025-11-28 08:23:19.094890591 +0000 UTC m=+0.082442779 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:23:19 np0005538515.localdomain podman[76785]: 2025-11-28 08:23:19.317516726 +0000 UTC m=+0.305068884 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:23:19 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:23:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:23:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:23:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:23:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:23:22 np0005538515.localdomain podman[76813]: 2025-11-28 08:23:22.011696821 +0000 UTC m=+0.123064184 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1761123044, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:23:22 np0005538515.localdomain podman[76813]: 2025-11-28 08:23:22.023931572 +0000 UTC m=+0.135298885 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 08:23:22 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:23:22 np0005538515.localdomain podman[76815]: 2025-11-28 08:23:22.076136698 +0000 UTC m=+0.175709234 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, vendor=Red Hat, Inc.)
Nov 28 08:23:22 np0005538515.localdomain podman[76815]: 2025-11-28 08:23:22.111293824 +0000 UTC m=+0.210866390 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, tcib_managed=true, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, release=1761123044)
Nov 28 08:23:22 np0005538515.localdomain podman[76821]: 2025-11-28 08:23:22.127241121 +0000 UTC m=+0.223980698 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute)
Nov 28 08:23:22 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:23:22 np0005538515.localdomain podman[76814]: 2025-11-28 08:23:22.180148369 +0000 UTC m=+0.286154326 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:23:22 np0005538515.localdomain podman[76821]: 2025-11-28 08:23:22.207960575 +0000 UTC m=+0.304700132 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4)
Nov 28 08:23:22 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:23:22 np0005538515.localdomain podman[76814]: 2025-11-28 08:23:22.262307618 +0000 UTC m=+0.368313605 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:23:22 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:23:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:23:23 np0005538515.localdomain podman[76905]: 2025-11-28 08:23:23.979331304 +0000 UTC m=+0.085004888 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Nov 28 08:23:24 np0005538515.localdomain podman[76905]: 2025-11-28 08:23:24.355982406 +0000 UTC m=+0.461655960 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_migration_target, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:23:24 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:23:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:23:25 np0005538515.localdomain podman[76929]: 2025-11-28 08:23:25.971315905 +0000 UTC m=+0.079126275 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 28 08:23:26 np0005538515.localdomain podman[76929]: 2025-11-28 08:23:26.02539848 +0000 UTC m=+0.133208790 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:23:26 np0005538515.localdomain podman[76929]: unhealthy
Nov 28 08:23:26 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:23:26 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 08:23:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:23:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:23:26 np0005538515.localdomain systemd[1]: tmp-crun.RgtATW.mount: Deactivated successfully.
Nov 28 08:23:26 np0005538515.localdomain podman[76951]: 2025-11-28 08:23:26.982249267 +0000 UTC m=+0.094014600 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public)
Nov 28 08:23:27 np0005538515.localdomain podman[76951]: 2025-11-28 08:23:27.012452238 +0000 UTC m=+0.124217551 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Nov 28 08:23:27 np0005538515.localdomain systemd[1]: tmp-crun.05gSII.mount: Deactivated successfully.
Nov 28 08:23:27 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:23:27 np0005538515.localdomain podman[76952]: 2025-11-28 08:23:27.028957411 +0000 UTC m=+0.136724839 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Nov 28 08:23:27 np0005538515.localdomain podman[76952]: 2025-11-28 08:23:27.069625528 +0000 UTC m=+0.177392986 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Nov 28 08:23:27 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:23:46 np0005538515.localdomain sshd[36458]: Received disconnect from 192.168.122.100 port 42070:11: disconnected by user
Nov 28 08:23:46 np0005538515.localdomain sshd[36458]: Disconnected from user zuul 192.168.122.100 port 42070
Nov 28 08:23:46 np0005538515.localdomain sshd[36455]: pam_unix(sshd:session): session closed for user zuul
Nov 28 08:23:46 np0005538515.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Nov 28 08:23:46 np0005538515.localdomain systemd[1]: session-27.scope: Consumed 3.058s CPU time.
Nov 28 08:23:46 np0005538515.localdomain systemd-logind[763]: Session 27 logged out. Waiting for processes to exit.
Nov 28 08:23:46 np0005538515.localdomain systemd-logind[763]: Removed session 27.
Nov 28 08:23:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:23:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:23:49 np0005538515.localdomain systemd[1]: tmp-crun.lgX6Hs.mount: Deactivated successfully.
Nov 28 08:23:49 np0005538515.localdomain podman[77002]: 2025-11-28 08:23:49.989445495 +0000 UTC m=+0.093257151 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, version=17.1.12, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z)
Nov 28 08:23:50 np0005538515.localdomain podman[77001]: 2025-11-28 08:23:50.044156018 +0000 UTC m=+0.149709777 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:23:50 np0005538515.localdomain podman[77002]: 2025-11-28 08:23:50.054189257 +0000 UTC m=+0.158000953 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd)
Nov 28 08:23:50 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:23:50 np0005538515.localdomain podman[77001]: 2025-11-28 08:23:50.246535964 +0000 UTC m=+0.352089773 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:23:50 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:23:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:23:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:23:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:23:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:23:52 np0005538515.localdomain systemd[1]: tmp-crun.evrH4H.mount: Deactivated successfully.
Nov 28 08:23:52 np0005538515.localdomain podman[77049]: 2025-11-28 08:23:52.980257889 +0000 UTC m=+0.090368321 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044)
Nov 28 08:23:53 np0005538515.localdomain podman[77049]: 2025-11-28 08:23:53.008652452 +0000 UTC m=+0.118762894 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:23:53 np0005538515.localdomain systemd[1]: tmp-crun.OX92NG.mount: Deactivated successfully.
Nov 28 08:23:53 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:23:53 np0005538515.localdomain podman[77048]: 2025-11-28 08:23:53.026794291 +0000 UTC m=+0.136417378 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, release=1761123044, io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:23:53 np0005538515.localdomain podman[77048]: 2025-11-28 08:23:53.038594704 +0000 UTC m=+0.148217751 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:23:53 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:23:53 np0005538515.localdomain podman[77050]: 2025-11-28 08:23:53.133466612 +0000 UTC m=+0.237336553 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Nov 28 08:23:53 np0005538515.localdomain podman[77050]: 2025-11-28 08:23:53.146385679 +0000 UTC m=+0.250255600 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true)
Nov 28 08:23:53 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:23:53 np0005538515.localdomain podman[77051]: 2025-11-28 08:23:53.099430595 +0000 UTC m=+0.197792696 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:23:53 np0005538515.localdomain podman[77051]: 2025-11-28 08:23:53.230180648 +0000 UTC m=+0.328542749 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:23:53 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:23:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:23:54 np0005538515.localdomain podman[77140]: 2025-11-28 08:23:54.963733721 +0000 UTC m=+0.076195535 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:23:55 np0005538515.localdomain podman[77140]: 2025-11-28 08:23:55.320339922 +0000 UTC m=+0.432801756 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:23:55 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:23:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:23:56 np0005538515.localdomain podman[77164]: 2025-11-28 08:23:56.962300249 +0000 UTC m=+0.074544045 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:23:57 np0005538515.localdomain podman[77164]: 2025-11-28 08:23:57.045479967 +0000 UTC m=+0.157723743 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:23:57 np0005538515.localdomain podman[77164]: unhealthy
Nov 28 08:23:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:23:57 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:23:57 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 08:23:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:23:57 np0005538515.localdomain podman[77186]: 2025-11-28 08:23:57.162590281 +0000 UTC m=+0.091471376 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com)
Nov 28 08:23:57 np0005538515.localdomain podman[77186]: 2025-11-28 08:23:57.216121067 +0000 UTC m=+0.145002152 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:23:57 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:23:57 np0005538515.localdomain podman[77203]: 2025-11-28 08:23:57.232694137 +0000 UTC m=+0.065496635 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Nov 28 08:23:57 np0005538515.localdomain podman[77203]: 2025-11-28 08:23:57.296514141 +0000 UTC m=+0.129316599 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_metadata_agent, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:23:57 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:24:14 np0005538515.localdomain sudo[77232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:24:14 np0005538515.localdomain sudo[77232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:24:14 np0005538515.localdomain sudo[77232]: pam_unix(sudo:session): session closed for user root
Nov 28 08:24:14 np0005538515.localdomain sudo[77247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:24:14 np0005538515.localdomain sudo[77247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:24:15 np0005538515.localdomain sudo[77247]: pam_unix(sudo:session): session closed for user root
Nov 28 08:24:16 np0005538515.localdomain sudo[77293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:24:16 np0005538515.localdomain sudo[77293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:24:16 np0005538515.localdomain sudo[77293]: pam_unix(sudo:session): session closed for user root
Nov 28 08:24:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:24:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:24:20 np0005538515.localdomain systemd[1]: tmp-crun.Lcdxsb.mount: Deactivated successfully.
Nov 28 08:24:20 np0005538515.localdomain podman[77309]: 2025-11-28 08:24:20.975292501 +0000 UTC m=+0.083023305 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 08:24:20 np0005538515.localdomain podman[77309]: 2025-11-28 08:24:20.988639951 +0000 UTC m=+0.096370815 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step3)
Nov 28 08:24:21 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:24:21 np0005538515.localdomain systemd[1]: tmp-crun.m9QT4F.mount: Deactivated successfully.
Nov 28 08:24:21 np0005538515.localdomain podman[77308]: 2025-11-28 08:24:21.087562875 +0000 UTC m=+0.195222577 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Nov 28 08:24:21 np0005538515.localdomain podman[77308]: 2025-11-28 08:24:21.282540514 +0000 UTC m=+0.390200246 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:24:21 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:24:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:24:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:24:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:24:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:24:23 np0005538515.localdomain podman[77358]: 2025-11-28 08:24:23.979105966 +0000 UTC m=+0.088874486 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:24:24 np0005538515.localdomain podman[77358]: 2025-11-28 08:24:24.014135213 +0000 UTC m=+0.123903703 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:24:24 np0005538515.localdomain podman[77360]: 2025-11-28 08:24:24.033336244 +0000 UTC m=+0.136378027 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3)
Nov 28 08:24:24 np0005538515.localdomain podman[77360]: 2025-11-28 08:24:24.044537408 +0000 UTC m=+0.147579181 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1)
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:24:24 np0005538515.localdomain podman[77359]: 2025-11-28 08:24:24.13658276 +0000 UTC m=+0.243282436 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Nov 28 08:24:24 np0005538515.localdomain podman[77359]: 2025-11-28 08:24:24.163834529 +0000 UTC m=+0.270534215 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:24:24 np0005538515.localdomain podman[77361]: 2025-11-28 08:24:24.185890357 +0000 UTC m=+0.286932519 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 28 08:24:24 np0005538515.localdomain podman[77361]: 2025-11-28 08:24:24.237144674 +0000 UTC m=+0.338186836 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:24:24 np0005538515.localdomain recover_tripleo_nova_virtqemud[77450]: 62642
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:24:24 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:24:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:24:25 np0005538515.localdomain podman[77451]: 2025-11-28 08:24:25.977252969 +0000 UTC m=+0.083603013 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:24:26 np0005538515.localdomain podman[77451]: 2025-11-28 08:24:26.344987733 +0000 UTC m=+0.451337727 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 28 08:24:26 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:24:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:24:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:24:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:24:27 np0005538515.localdomain systemd[1]: tmp-crun.uMCz58.mount: Deactivated successfully.
Nov 28 08:24:27 np0005538515.localdomain podman[77474]: 2025-11-28 08:24:27.989124025 +0000 UTC m=+0.094689114 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044)
Nov 28 08:24:28 np0005538515.localdomain systemd[1]: tmp-crun.PtFeyc.mount: Deactivated successfully.
Nov 28 08:24:28 np0005538515.localdomain podman[77475]: 2025-11-28 08:24:28.033382937 +0000 UTC m=+0.135414136 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute)
Nov 28 08:24:28 np0005538515.localdomain podman[77474]: 2025-11-28 08:24:28.040413583 +0000 UTC m=+0.145978612 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, version=17.1.12)
Nov 28 08:24:28 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:24:28 np0005538515.localdomain podman[77475]: 2025-11-28 08:24:28.083493349 +0000 UTC m=+0.185524488 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Nov 28 08:24:28 np0005538515.localdomain podman[77475]: unhealthy
Nov 28 08:24:28 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:24:28 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 08:24:28 np0005538515.localdomain podman[77476]: 2025-11-28 08:24:28.084901492 +0000 UTC m=+0.183916369 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git)
Nov 28 08:24:28 np0005538515.localdomain podman[77476]: 2025-11-28 08:24:28.169525175 +0000 UTC m=+0.268540002 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:24:28 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:24:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:24:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:24:51 np0005538515.localdomain systemd[1]: tmp-crun.dSkVk8.mount: Deactivated successfully.
Nov 28 08:24:51 np0005538515.localdomain podman[77543]: 2025-11-28 08:24:51.982786645 +0000 UTC m=+0.090887706 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:24:51 np0005538515.localdomain podman[77543]: 2025-11-28 08:24:51.995540388 +0000 UTC m=+0.103641399 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 28 08:24:52 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:24:52 np0005538515.localdomain podman[77542]: 2025-11-28 08:24:52.094550094 +0000 UTC m=+0.203586345 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64)
Nov 28 08:24:52 np0005538515.localdomain podman[77542]: 2025-11-28 08:24:52.31546103 +0000 UTC m=+0.424497251 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z)
Nov 28 08:24:52 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:24:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:24:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:24:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:24:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:24:54 np0005538515.localdomain podman[77597]: 2025-11-28 08:24:54.977345424 +0000 UTC m=+0.073291446 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:24:55 np0005538515.localdomain podman[77590]: 2025-11-28 08:24:55.035345749 +0000 UTC m=+0.140391651 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, container_name=logrotate_crond, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1)
Nov 28 08:24:55 np0005538515.localdomain podman[77590]: 2025-11-28 08:24:55.0493755 +0000 UTC m=+0.154421352 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z)
Nov 28 08:24:55 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:24:55 np0005538515.localdomain podman[77597]: 2025-11-28 08:24:55.088262107 +0000 UTC m=+0.184208159 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:24:55 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:24:55 np0005538515.localdomain podman[77591]: 2025-11-28 08:24:55.140032059 +0000 UTC m=+0.241383067 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:24:55 np0005538515.localdomain podman[77592]: 2025-11-28 08:24:55.196554798 +0000 UTC m=+0.292161300 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Nov 28 08:24:55 np0005538515.localdomain podman[77591]: 2025-11-28 08:24:55.22133049 +0000 UTC m=+0.322681518 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi)
Nov 28 08:24:55 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:24:55 np0005538515.localdomain podman[77592]: 2025-11-28 08:24:55.234565868 +0000 UTC m=+0.330172360 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3)
Nov 28 08:24:55 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:24:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:24:56 np0005538515.localdomain podman[77680]: 2025-11-28 08:24:56.973579829 +0000 UTC m=+0.078458655 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:24:57 np0005538515.localdomain podman[77680]: 2025-11-28 08:24:57.370572362 +0000 UTC m=+0.475451148 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:24:57 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:24:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:24:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:24:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:24:58 np0005538515.localdomain podman[77703]: 2025-11-28 08:24:58.971715682 +0000 UTC m=+0.082408107 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, 
container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:24:59 np0005538515.localdomain podman[77705]: 2025-11-28 08:24:59.029108008 +0000 UTC m=+0.130242848 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:24:59 np0005538515.localdomain podman[77704]: 2025-11-28 08:24:59.086204225 +0000 UTC m=+0.190047168 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=)
Nov 28 08:24:59 np0005538515.localdomain podman[77705]: 2025-11-28 08:24:59.096427389 +0000 UTC m=+0.197562259 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 28 08:24:59 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:24:59 np0005538515.localdomain podman[77704]: 2025-11-28 08:24:59.138550175 +0000 UTC m=+0.242393118 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, release=1761123044, container_name=nova_compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:24:59 np0005538515.localdomain podman[77704]: unhealthy
Nov 28 08:24:59 np0005538515.localdomain podman[77703]: 2025-11-28 08:24:59.150129681 +0000 UTC m=+0.260822126 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:24:59 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:24:59 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 08:24:59 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:25:16 np0005538515.localdomain sudo[77771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:25:16 np0005538515.localdomain sudo[77771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:25:16 np0005538515.localdomain sudo[77771]: pam_unix(sudo:session): session closed for user root
Nov 28 08:25:16 np0005538515.localdomain sudo[77786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:25:16 np0005538515.localdomain sudo[77786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:25:17 np0005538515.localdomain sudo[77786]: pam_unix(sudo:session): session closed for user root
Nov 28 08:25:17 np0005538515.localdomain sudo[77833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:25:17 np0005538515.localdomain sudo[77833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:25:17 np0005538515.localdomain sudo[77833]: pam_unix(sudo:session): session closed for user root
Nov 28 08:25:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:25:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:25:22 np0005538515.localdomain podman[77849]: 2025-11-28 08:25:22.97896612 +0000 UTC m=+0.083182410 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:25:22 np0005538515.localdomain podman[77849]: 2025-11-28 08:25:22.98872446 +0000 UTC m=+0.092940820 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=collectd, release=1761123044, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, config_id=tripleo_step3)
Nov 28 08:25:23 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:25:23 np0005538515.localdomain podman[77848]: 2025-11-28 08:25:23.082442893 +0000 UTC m=+0.188948144 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 08:25:23 np0005538515.localdomain podman[77848]: 2025-11-28 08:25:23.275707168 +0000 UTC m=+0.382212389 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com)
Nov 28 08:25:23 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:25:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:25:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:25:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:25:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:25:25 np0005538515.localdomain systemd[1]: tmp-crun.ANJNZj.mount: Deactivated successfully.
Nov 28 08:25:25 np0005538515.localdomain podman[77897]: 2025-11-28 08:25:25.989999554 +0000 UTC m=+0.093052583 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Nov 28 08:25:25 np0005538515.localdomain podman[77897]: 2025-11-28 08:25:25.998581158 +0000 UTC m=+0.101634197 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, version=17.1.12, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, distribution-scope=public)
Nov 28 08:25:26 np0005538515.localdomain systemd[1]: tmp-crun.VdHkHM.mount: Deactivated successfully.
Nov 28 08:25:26 np0005538515.localdomain podman[77904]: 2025-11-28 08:25:26.039931901 +0000 UTC m=+0.132480567 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:25:26 np0005538515.localdomain podman[77898]: 2025-11-28 08:25:26.083482331 +0000 UTC m=+0.183081324 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, release=1761123044, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO 
Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:25:26 np0005538515.localdomain podman[77904]: 2025-11-28 08:25:26.094648405 +0000 UTC m=+0.187197101 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute)
Nov 28 08:25:26 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:25:26 np0005538515.localdomain podman[77899]: 2025-11-28 08:25:26.136461381 +0000 UTC m=+0.231861035 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, architecture=x86_64, release=1761123044, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Nov 28 08:25:26 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:25:26 np0005538515.localdomain podman[77898]: 2025-11-28 08:25:26.166787603 +0000 UTC m=+0.266386566 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:25:26 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:25:26 np0005538515.localdomain podman[77899]: 2025-11-28 08:25:26.217974439 +0000 UTC m=+0.313374103 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack 
Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Nov 28 08:25:26 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:25:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:25:27 np0005538515.localdomain podman[77988]: 2025-11-28 08:25:27.981148173 +0000 UTC m=+0.084188630 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:25:28 np0005538515.localdomain podman[77988]: 2025-11-28 08:25:28.321494985 +0000 UTC m=+0.424535452 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:25:28 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:25:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:25:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:25:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:25:29 np0005538515.localdomain podman[78014]: 2025-11-28 08:25:29.982828576 +0000 UTC m=+0.077244908 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:25:30 np0005538515.localdomain podman[78014]: 2025-11-28 08:25:30.038588162 +0000 UTC m=+0.133004494 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, version=17.1.12, release=1761123044, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Nov 28 08:25:30 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:25:30 np0005538515.localdomain podman[78013]: 2025-11-28 08:25:30.039397816 +0000 UTC m=+0.133747536 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:25:30 np0005538515.localdomain podman[78012]: 2025-11-28 08:25:30.089192409 +0000 UTC m=+0.186866631 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller)
Nov 28 08:25:30 np0005538515.localdomain podman[78013]: 2025-11-28 08:25:30.123483623 +0000 UTC m=+0.217833373 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:25:30 np0005538515.localdomain podman[78013]: unhealthy
Nov 28 08:25:30 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:25:30 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 08:25:30 np0005538515.localdomain podman[78012]: 2025-11-28 08:25:30.170363826 +0000 UTC m=+0.268038128 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-type=git, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:25:30 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:25:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:25:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:25:53 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:25:53 np0005538515.localdomain recover_tripleo_nova_virtqemud[78084]: 62642
Nov 28 08:25:53 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:25:53 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:25:53 np0005538515.localdomain podman[78078]: 2025-11-28 08:25:53.993975611 +0000 UTC m=+0.099553275 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:25:54 np0005538515.localdomain systemd[1]: tmp-crun.DWdj9E.mount: Deactivated successfully.
Nov 28 08:25:54 np0005538515.localdomain podman[78079]: 2025-11-28 08:25:54.055969128 +0000 UTC m=+0.158807887 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, tcib_managed=true)
Nov 28 08:25:54 np0005538515.localdomain podman[78079]: 2025-11-28 08:25:54.06710914 +0000 UTC m=+0.169947919 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, release=1761123044, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, container_name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 08:25:54 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:25:54 np0005538515.localdomain podman[78078]: 2025-11-28 08:25:54.229727533 +0000 UTC m=+0.335305147 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:25:54 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:25:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:25:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:25:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:25:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:25:56 np0005538515.localdomain podman[78128]: 2025-11-28 08:25:56.98367687 +0000 UTC m=+0.096494489 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Nov 28 08:25:57 np0005538515.localdomain podman[78128]: 2025-11-28 08:25:57.015648063 +0000 UTC m=+0.128465682 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:25:57 np0005538515.localdomain systemd[1]: tmp-crun.QQu721.mount: Deactivated successfully.
Nov 28 08:25:57 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:25:57 np0005538515.localdomain podman[78129]: 2025-11-28 08:25:57.087981289 +0000 UTC m=+0.198111585 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64)
Nov 28 08:25:57 np0005538515.localdomain podman[78131]: 2025-11-28 08:25:57.139313509 +0000 UTC m=+0.242530473 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git)
Nov 28 08:25:57 np0005538515.localdomain podman[78129]: 2025-11-28 08:25:57.140204586 +0000 UTC m=+0.250334772 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Nov 28 08:25:57 np0005538515.localdomain podman[78130]: 2025-11-28 08:25:57.048492534 +0000 UTC m=+0.154708880 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:25:57 np0005538515.localdomain podman[78130]: 2025-11-28 08:25:57.178720701 +0000 UTC m=+0.284937037 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:25:57 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:25:57 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:25:57 np0005538515.localdomain podman[78131]: 2025-11-28 08:25:57.24241067 +0000 UTC m=+0.345627654 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z)
Nov 28 08:25:57 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:25:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:25:58 np0005538515.localdomain podman[78237]: 2025-11-28 08:25:58.581843089 +0000 UTC m=+0.065742314 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Nov 28 08:25:58 np0005538515.localdomain podman[78237]: 2025-11-28 08:25:58.931421943 +0000 UTC m=+0.415321138 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:25:58 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:26:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:26:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:26:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:26:00 np0005538515.localdomain podman[78324]: 2025-11-28 08:26:00.983317641 +0000 UTC m=+0.084390107 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 28 08:26:01 np0005538515.localdomain podman[78324]: 2025-11-28 08:26:01.021784174 +0000 UTC m=+0.122856610 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 08:26:01 np0005538515.localdomain podman[78325]: 2025-11-28 08:26:01.039759707 +0000 UTC m=+0.138857743 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public)
Nov 28 08:26:01 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:26:01 np0005538515.localdomain podman[78326]: 2025-11-28 08:26:01.084511224 +0000 UTC m=+0.180005229 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:26:01 np0005538515.localdomain podman[78325]: 2025-11-28 08:26:01.113331791 +0000 UTC m=+0.212429837 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:26:01 np0005538515.localdomain podman[78326]: 2025-11-28 08:26:01.114093384 +0000 UTC m=+0.209587379 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:26:01 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:26:01 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:26:08 np0005538515.localdomain systemd[1]: libpod-2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b.scope: Deactivated successfully.
Nov 28 08:26:08 np0005538515.localdomain podman[78401]: 2025-11-28 08:26:08.51154434 +0000 UTC m=+0.054709484 container died 2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, batch=17.1_20251118.1)
Nov 28 08:26:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b-userdata-shm.mount: Deactivated successfully.
Nov 28 08:26:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3584b97526356ef5b6642175730f52565e80737460bbd74a6e31729b79699070-merged.mount: Deactivated successfully.
Nov 28 08:26:08 np0005538515.localdomain podman[78401]: 2025-11-28 08:26:08.542015488 +0000 UTC m=+0.085180621 container cleanup 2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:26:08 np0005538515.localdomain systemd[1]: libpod-conmon-2659a032c980f8a55db7d088ff7cf0c88c211e46e06e052e716c33cd12909e1b.scope: Deactivated successfully.
Nov 28 08:26:08 np0005538515.localdomain python3[76451]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=bbb5ea37891e3118676a78b59837de90 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 08:26:08 np0005538515.localdomain sudo[76449]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:08 np0005538515.localdomain sudo[78454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvlmlvvswpppzjjxplfgtrlrvijotfyf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:08 np0005538515.localdomain sudo[78454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:09 np0005538515.localdomain python3[78456]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:09 np0005538515.localdomain sudo[78454]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:09 np0005538515.localdomain sudo[78470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muizfnljwzjmmbimmlmptamhrhrcwtjo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:09 np0005538515.localdomain sudo[78470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:09 np0005538515.localdomain python3[78472]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 08:26:09 np0005538515.localdomain sudo[78470]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:09 np0005538515.localdomain sudo[78531]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bukmznkavyfkuzamfbxkvhsebeqwxswp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:09 np0005538515.localdomain sudo[78531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:10 np0005538515.localdomain python3[78533]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318369.4489896-119293-261036023283361/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:10 np0005538515.localdomain sudo[78531]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:10 np0005538515.localdomain sudo[78547]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srmyoplfjhuiqugpxcbdnuxomkllkidf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:10 np0005538515.localdomain sudo[78547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:10 np0005538515.localdomain python3[78549]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 08:26:10 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:26:10 np0005538515.localdomain systemd-rc-local-generator[78577]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:26:10 np0005538515.localdomain systemd-sysv-generator[78580]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:26:10 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:26:10 np0005538515.localdomain sudo[78547]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:11 np0005538515.localdomain sudo[78599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aumpljitwhrdtsjbauxrhftqvysqmtwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Nov 28 08:26:11 np0005538515.localdomain sudo[78599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:11 np0005538515.localdomain python3[78601]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 08:26:11 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:26:11 np0005538515.localdomain systemd-rc-local-generator[78625]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:26:11 np0005538515.localdomain systemd-sysv-generator[78630]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:26:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:26:11 np0005538515.localdomain systemd[1]: Starting nova_compute container...
Nov 28 08:26:11 np0005538515.localdomain tripleo-start-podman-container[78641]: Creating additional drop-in dependency for "nova_compute" (ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0)
Nov 28 08:26:11 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 08:26:12 np0005538515.localdomain systemd-sysv-generator[78706]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 08:26:12 np0005538515.localdomain systemd-rc-local-generator[78702]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 08:26:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 08:26:12 np0005538515.localdomain systemd[1]: Started nova_compute container.
Nov 28 08:26:12 np0005538515.localdomain sudo[78599]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:12 np0005538515.localdomain sudo[78739]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztlnnhpjrjubiqyhnxdlrlyfmvecvres ; /usr/bin/python3
Nov 28 08:26:12 np0005538515.localdomain sudo[78739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:12 np0005538515.localdomain python3[78741]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:12 np0005538515.localdomain sudo[78739]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:13 np0005538515.localdomain sudo[78787]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glqnzxbuiqryyfdetssmkvtbteavyiea ; /usr/bin/python3
Nov 28 08:26:13 np0005538515.localdomain sudo[78787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:13 np0005538515.localdomain sudo[78787]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:13 np0005538515.localdomain sudo[78830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjqndoctovnoaflsbvcfarwvonehrlqr ; /usr/bin/python3
Nov 28 08:26:13 np0005538515.localdomain sudo[78830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:13 np0005538515.localdomain sudo[78830]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:14 np0005538515.localdomain sudo[78860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgotjfywuwzakgofxkzwsdcegykgajlu ; /usr/bin/python3
Nov 28 08:26:14 np0005538515.localdomain sudo[78860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:14 np0005538515.localdomain python3[78862]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005538515 step=5 update_config_hash_only=False
Nov 28 08:26:14 np0005538515.localdomain sudo[78860]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:14 np0005538515.localdomain sudo[78876]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjvjnlhehsklgwbskgchiiurdzajhcgx ; /usr/bin/python3
Nov 28 08:26:14 np0005538515.localdomain sudo[78876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:14 np0005538515.localdomain python3[78878]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 08:26:14 np0005538515.localdomain sudo[78876]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:15 np0005538515.localdomain sudo[78892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lneaprifzilepowawphnntpmvnpscdts ; /usr/bin/python3
Nov 28 08:26:15 np0005538515.localdomain sudo[78892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Nov 28 08:26:15 np0005538515.localdomain python3[78894]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 08:26:15 np0005538515.localdomain sudo[78892]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:17 np0005538515.localdomain sudo[78895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:26:17 np0005538515.localdomain sudo[78895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:26:17 np0005538515.localdomain sudo[78895]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:18 np0005538515.localdomain sudo[78910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:26:18 np0005538515.localdomain sudo[78910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:26:18 np0005538515.localdomain sudo[78910]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:19 np0005538515.localdomain sudo[78957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:26:19 np0005538515.localdomain sudo[78957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:26:19 np0005538515.localdomain sudo[78957]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:26:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:26:24 np0005538515.localdomain podman[78973]: 2025-11-28 08:26:24.987763461 +0000 UTC m=+0.090718573 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:26:25 np0005538515.localdomain podman[78973]: 2025-11-28 08:26:25.004560278 +0000 UTC m=+0.107515390 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z)
Nov 28 08:26:25 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:26:25 np0005538515.localdomain systemd[1]: tmp-crun.XAPXnZ.mount: Deactivated successfully.
Nov 28 08:26:25 np0005538515.localdomain podman[78972]: 2025-11-28 08:26:25.093049639 +0000 UTC m=+0.195881027 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, release=1761123044, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Nov 28 08:26:25 np0005538515.localdomain podman[78972]: 2025-11-28 08:26:25.288463262 +0000 UTC m=+0.391294630 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, release=1761123044, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true)
Nov 28 08:26:25 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:26:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:26:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:26:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:26:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:26:27 np0005538515.localdomain podman[79023]: 2025-11-28 08:26:27.992209354 +0000 UTC m=+0.092162497 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:11:48Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12)
Nov 28 08:26:28 np0005538515.localdomain systemd[1]: tmp-crun.0JOBU6.mount: Deactivated successfully.
Nov 28 08:26:28 np0005538515.localdomain podman[79023]: 2025-11-28 08:26:28.053111488 +0000 UTC m=+0.153064631 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:26:28 np0005538515.localdomain podman[79020]: 2025-11-28 08:26:28.053221011 +0000 UTC m=+0.161812529 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:32Z, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:26:28 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:26:28 np0005538515.localdomain podman[79021]: 2025-11-28 08:26:28.136369388 +0000 UTC m=+0.241921253 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:26:28 np0005538515.localdomain podman[79021]: 2025-11-28 08:26:28.194570169 +0000 UTC m=+0.300122084 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:26:28 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:26:28 np0005538515.localdomain podman[79020]: 2025-11-28 08:26:28.234694354 +0000 UTC m=+0.343285842 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, container_name=logrotate_crond, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:26:28 np0005538515.localdomain podman[79022]: 2025-11-28 08:26:28.200837102 +0000 UTC m=+0.304936202 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T23:44:13Z)
Nov 28 08:26:28 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:26:28 np0005538515.localdomain podman[79022]: 2025-11-28 08:26:28.280746971 +0000 UTC m=+0.384846121 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 28 08:26:28 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:26:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:26:29 np0005538515.localdomain podman[79111]: 2025-11-28 08:26:29.089317107 +0000 UTC m=+0.081888431 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:26:29 np0005538515.localdomain podman[79111]: 2025-11-28 08:26:29.459697262 +0000 UTC m=+0.452268586 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:26:29 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:26:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:26:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:26:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:26:31 np0005538515.localdomain podman[79135]: 2025-11-28 08:26:31.987944704 +0000 UTC m=+0.092791976 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team)
Nov 28 08:26:32 np0005538515.localdomain podman[79135]: 2025-11-28 08:26:32.004503493 +0000 UTC m=+0.109350766 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:26:32 np0005538515.localdomain podman[79137]: 2025-11-28 08:26:32.024123127 +0000 UTC m=+0.121296572 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git)
Nov 28 08:26:32 np0005538515.localdomain podman[79137]: 2025-11-28 08:26:32.053388358 +0000 UTC m=+0.150561793 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Nov 28 08:26:32 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:26:32 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:26:32 np0005538515.localdomain systemd[1]: tmp-crun.jV55Tw.mount: Deactivated successfully.
Nov 28 08:26:32 np0005538515.localdomain podman[79136]: 2025-11-28 08:26:32.197204702 +0000 UTC m=+0.299016010 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true)
Nov 28 08:26:32 np0005538515.localdomain podman[79136]: 2025-11-28 08:26:32.252477263 +0000 UTC m=+0.354288591 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, distribution-scope=public, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step5, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 28 08:26:32 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:26:37 np0005538515.localdomain sshd[79206]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:26:38 np0005538515.localdomain sshd[79206]: Invalid user validator from 80.94.92.186 port 50096
Nov 28 08:26:38 np0005538515.localdomain sshd[79206]: Connection closed by invalid user validator 80.94.92.186 port 50096 [preauth]
Nov 28 08:26:45 np0005538515.localdomain sshd[79208]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:26:45 np0005538515.localdomain sshd[79208]: Accepted publickey for zuul from 192.168.122.100 port 43316 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 08:26:45 np0005538515.localdomain systemd-logind[763]: New session 33 of user zuul.
Nov 28 08:26:45 np0005538515.localdomain systemd[1]: Started Session 33 of User zuul.
Nov 28 08:26:45 np0005538515.localdomain sshd[79208]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 08:26:45 np0005538515.localdomain sudo[79315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqezlecptknxwzvagnmmpjlatlfozdgb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764318405.2916617-41263-220771399790762/AnsiballZ_setup.py
Nov 28 08:26:45 np0005538515.localdomain sudo[79315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 08:26:46 np0005538515.localdomain python3[79317]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 08:26:48 np0005538515.localdomain sudo[79315]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:52 np0005538515.localdomain sudo[79579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wskgxfqhsemgdanqzxfctglykgxjilwb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764318412.5460758-41323-178420047417552/AnsiballZ_dnf.py
Nov 28 08:26:52 np0005538515.localdomain sudo[79579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 08:26:53 np0005538515.localdomain python3[79581]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Nov 28 08:26:55 np0005538515.localdomain sudo[79579]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:26:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:26:55 np0005538515.localdomain systemd[1]: tmp-crun.hpzPSh.mount: Deactivated successfully.
Nov 28 08:26:55 np0005538515.localdomain podman[79585]: 2025-11-28 08:26:55.917273083 +0000 UTC m=+0.091037701 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, version=17.1.12, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:26:55 np0005538515.localdomain podman[79585]: 2025-11-28 08:26:55.925799776 +0000 UTC m=+0.099564394 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=collectd, version=17.1.12, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:26:55 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:26:56 np0005538515.localdomain podman[79584]: 2025-11-28 08:26:56.021019245 +0000 UTC m=+0.194655309 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com)
Nov 28 08:26:56 np0005538515.localdomain podman[79584]: 2025-11-28 08:26:56.206655617 +0000 UTC m=+0.380291671 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Nov 28 08:26:56 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:26:57 np0005538515.localdomain sudo[79721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycszsaqeqponxrlkxecehszzurushcty ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764318416.86401-41377-249186731279437/AnsiballZ_iptables.py
Nov 28 08:26:57 np0005538515.localdomain sudo[79721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 08:26:57 np0005538515.localdomain python3[79723]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Nov 28 08:26:57 np0005538515.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 28 08:26:57 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Nov 28 08:26:57 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 08:26:57 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 08:26:57 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 08:26:57 np0005538515.localdomain sudo[79721]: pam_unix(sudo:session): session closed for user root
Nov 28 08:26:57 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 08:26:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:26:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:26:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:26:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:26:58 np0005538515.localdomain systemd[1]: tmp-crun.UHTBW1.mount: Deactivated successfully.
Nov 28 08:26:58 np0005538515.localdomain podman[79789]: 2025-11-28 08:26:58.981325631 +0000 UTC m=+0.086545824 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack 
Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4)
Nov 28 08:26:58 np0005538515.localdomain podman[79789]: 2025-11-28 08:26:58.993203216 +0000 UTC m=+0.098423429 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:26:59 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:26:59 np0005538515.localdomain systemd[1]: tmp-crun.UI7GCZ.mount: Deactivated successfully.
Nov 28 08:26:59 np0005538515.localdomain podman[79790]: 2025-11-28 08:26:59.035194728 +0000 UTC m=+0.140384020 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team)
Nov 28 08:26:59 np0005538515.localdomain podman[79792]: 2025-11-28 08:26:59.097808015 +0000 UTC m=+0.198169068 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:26:59 np0005538515.localdomain podman[79791]: 2025-11-28 08:26:59.136804864 +0000 UTC m=+0.238121557 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:26:59 np0005538515.localdomain podman[79790]: 2025-11-28 08:26:59.151327021 +0000 UTC m=+0.256516373 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, release=1761123044)
Nov 28 08:26:59 np0005538515.localdomain podman[79791]: 2025-11-28 08:26:59.150448913 +0000 UTC m=+0.251765616 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step3, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:26:59 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:26:59 np0005538515.localdomain podman[79792]: 2025-11-28 08:26:59.200846154 +0000 UTC m=+0.301207257 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:26:59 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:26:59 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:26:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:26:59 np0005538515.localdomain podman[79883]: 2025-11-28 08:26:59.974469746 +0000 UTC m=+0.084695518 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target)
Nov 28 08:27:00 np0005538515.localdomain podman[79883]: 2025-11-28 08:27:00.355614582 +0000 UTC m=+0.465840314 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:27:00 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:27:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:27:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:27:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:27:02 np0005538515.localdomain podman[79906]: 2025-11-28 08:27:02.979719882 +0000 UTC m=+0.086554484 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:27:03 np0005538515.localdomain podman[79906]: 2025-11-28 08:27:03.030478684 +0000 UTC m=+0.137313286 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
container_name=ovn_controller, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4)
Nov 28 08:27:03 np0005538515.localdomain podman[79907]: 2025-11-28 08:27:03.045360153 +0000 UTC m=+0.147044676 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, architecture=x86_64, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:27:03 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:27:03 np0005538515.localdomain podman[79908]: 2025-11-28 08:27:03.089985316 +0000 UTC m=+0.189019737 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:27:03 np0005538515.localdomain podman[79907]: 2025-11-28 08:27:03.098636892 +0000 UTC m=+0.200321405 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:27:03 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:27:03 np0005538515.localdomain podman[79908]: 2025-11-28 08:27:03.125510858 +0000 UTC m=+0.224545269 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 08:27:03 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:27:19 np0005538515.localdomain sudo[79977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:27:19 np0005538515.localdomain sudo[79977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:27:19 np0005538515.localdomain sudo[79977]: pam_unix(sudo:session): session closed for user root
Nov 28 08:27:19 np0005538515.localdomain sudo[79992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:27:19 np0005538515.localdomain sudo[79992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:27:20 np0005538515.localdomain sudo[79992]: pam_unix(sudo:session): session closed for user root
Nov 28 08:27:20 np0005538515.localdomain sudo[80038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:27:20 np0005538515.localdomain sudo[80038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:27:20 np0005538515.localdomain sudo[80038]: pam_unix(sudo:session): session closed for user root
Nov 28 08:27:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:27:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:27:26 np0005538515.localdomain podman[80054]: 2025-11-28 08:27:26.983942595 +0000 UTC m=+0.084019506 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, url=https://www.redhat.com, 
container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64)
Nov 28 08:27:27 np0005538515.localdomain podman[80054]: 2025-11-28 08:27:27.020979834 +0000 UTC m=+0.121056795 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:27:27 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:27:27 np0005538515.localdomain podman[80053]: 2025-11-28 08:27:27.044415276 +0000 UTC m=+0.146601712 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:27:27 np0005538515.localdomain podman[80053]: 2025-11-28 08:27:27.221965638 +0000 UTC m=+0.324152144 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:27:27 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:27:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:27:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:27:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:27:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:27:29 np0005538515.localdomain podman[80104]: 2025-11-28 08:27:29.987955794 +0000 UTC m=+0.092390713 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:27:30 np0005538515.localdomain podman[80104]: 2025-11-28 08:27:30.022465857 +0000 UTC m=+0.126900756 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 28 08:27:30 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:27:30 np0005538515.localdomain systemd[1]: tmp-crun.9SEdAm.mount: Deactivated successfully.
Nov 28 08:27:30 np0005538515.localdomain podman[80103]: 2025-11-28 08:27:30.048716554 +0000 UTC m=+0.154333669 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044)
Nov 28 08:27:30 np0005538515.localdomain podman[80103]: 2025-11-28 08:27:30.076602872 +0000 UTC m=+0.182219987 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12)
Nov 28 08:27:30 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:27:30 np0005538515.localdomain podman[80105]: 2025-11-28 08:27:30.099643521 +0000 UTC m=+0.197691943 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible)
Nov 28 08:27:30 np0005538515.localdomain podman[80105]: 2025-11-28 08:27:30.132651526 +0000 UTC m=+0.230699938 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, 
build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, release=1761123044, container_name=ceilometer_agent_compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:27:30 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:27:30 np0005538515.localdomain podman[80102]: 2025-11-28 08:27:30.200218855 +0000 UTC m=+0.308420020 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container)
Nov 28 08:27:30 np0005538515.localdomain podman[80102]: 2025-11-28 08:27:30.21434504 +0000 UTC m=+0.322546185 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, release=1761123044, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1)
Nov 28 08:27:30 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:27:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:27:30 np0005538515.localdomain podman[80190]: 2025-11-28 08:27:30.984444502 +0000 UTC m=+0.083831740 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, managed_by=tripleo_ansible)
Nov 28 08:27:31 np0005538515.localdomain podman[80190]: 2025-11-28 08:27:31.355837019 +0000 UTC m=+0.455224317 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044)
Nov 28 08:27:31 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:27:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:27:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:27:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:27:33 np0005538515.localdomain podman[80215]: 2025-11-28 08:27:33.987018028 +0000 UTC m=+0.084989036 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z)
Nov 28 08:27:34 np0005538515.localdomain podman[80214]: 2025-11-28 08:27:34.033550129 +0000 UTC m=+0.134769627 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, version=17.1.12, architecture=x86_64)
Nov 28 08:27:34 np0005538515.localdomain podman[80215]: 2025-11-28 08:27:34.088527231 +0000 UTC m=+0.186498219 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 08:27:34 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:27:34 np0005538515.localdomain podman[80213]: 2025-11-28 08:27:34.095134164 +0000 UTC m=+0.199185919 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, 
name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Nov 28 08:27:34 np0005538515.localdomain podman[80213]: 2025-11-28 08:27:34.178438967 +0000 UTC m=+0.282490692 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:27:34 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:27:34 np0005538515.localdomain podman[80214]: 2025-11-28 08:27:34.19671422 +0000 UTC m=+0.297933718 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:27:34 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:27:53 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:27:53 np0005538515.localdomain recover_tripleo_nova_virtqemud[80291]: 62642
Nov 28 08:27:53 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:27:53 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:27:57 np0005538515.localdomain sshd[79208]: pam_unix(sshd:session): session closed for user zuul
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: session-33.scope: Consumed 5.649s CPU time.
Nov 28 08:27:57 np0005538515.localdomain systemd-logind[763]: Session 33 logged out. Waiting for processes to exit.
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:27:57 np0005538515.localdomain systemd-logind[763]: Removed session 33.
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: tmp-crun.lgO1BD.mount: Deactivated successfully.
Nov 28 08:27:57 np0005538515.localdomain podman[80293]: 2025-11-28 08:27:57.498654376 +0000 UTC m=+0.097388187 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:27:57 np0005538515.localdomain podman[80293]: 2025-11-28 08:27:57.508936992 +0000 UTC m=+0.107670763 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:27:57 np0005538515.localdomain podman[80292]: 2025-11-28 08:27:57.589295305 +0000 UTC m=+0.190037788 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:27:57 np0005538515.localdomain podman[80292]: 2025-11-28 08:27:57.780530358 +0000 UTC m=+0.381272851 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, 
version=17.1.12, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 28 08:27:57 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:28:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:28:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:28:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:28:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:28:00 np0005538515.localdomain systemd[1]: tmp-crun.dvIp9X.mount: Deactivated successfully.
Nov 28 08:28:00 np0005538515.localdomain podman[80385]: 2025-11-28 08:28:00.990321688 +0000 UTC m=+0.092575820 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, 
batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Nov 28 08:28:01 np0005538515.localdomain podman[80385]: 2025-11-28 08:28:01.003643616 +0000 UTC m=+0.105897758 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:28:01 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:28:01 np0005538515.localdomain podman[80386]: 2025-11-28 08:28:01.045888727 +0000 UTC m=+0.143845757 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:28:01 np0005538515.localdomain podman[80383]: 2025-11-28 08:28:01.085145006 +0000 UTC m=+0.191083010 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1)
Nov 28 08:28:01 np0005538515.localdomain podman[80383]: 2025-11-28 08:28:01.090813609 +0000 UTC m=+0.196751573 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 
17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:28:01 np0005538515.localdomain podman[80386]: 2025-11-28 08:28:01.099444933 +0000 UTC m=+0.197401903 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4)
Nov 28 08:28:01 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:28:01 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:28:01 np0005538515.localdomain podman[80384]: 2025-11-28 08:28:01.139621171 +0000 UTC m=+0.245711300 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:28:01 np0005538515.localdomain podman[80384]: 2025-11-28 08:28:01.162059206 +0000 UTC m=+0.268149325 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, 
config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:12:45Z)
Nov 28 08:28:01 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:28:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:28:01 np0005538515.localdomain podman[80473]: 2025-11-28 08:28:01.971988309 +0000 UTC m=+0.082974956 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:28:02 np0005538515.localdomain podman[80473]: 2025-11-28 08:28:02.347391652 +0000 UTC m=+0.458378289 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64)
Nov 28 08:28:02 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:28:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:28:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:28:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:28:04 np0005538515.localdomain systemd[1]: tmp-crun.Nptik9.mount: Deactivated successfully.
Nov 28 08:28:04 np0005538515.localdomain podman[80496]: 2025-11-28 08:28:04.982380752 +0000 UTC m=+0.089824706 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_controller, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4)
Nov 28 08:28:05 np0005538515.localdomain podman[80497]: 2025-11-28 08:28:05.032281138 +0000 UTC m=+0.137935227 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_id=tripleo_step5)
Nov 28 08:28:05 np0005538515.localdomain podman[80497]: 2025-11-28 08:28:05.087515256 +0000 UTC m=+0.193169385 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 08:28:05 np0005538515.localdomain podman[80498]: 2025-11-28 08:28:05.094348244 +0000 UTC m=+0.196158676 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_metadata_agent)
Nov 28 08:28:05 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:28:05 np0005538515.localdomain podman[80496]: 2025-11-28 08:28:05.108869318 +0000 UTC m=+0.216313312 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:28:05 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:28:05 np0005538515.localdomain podman[80498]: 2025-11-28 08:28:05.16259383 +0000 UTC m=+0.264404242 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:28:05 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:28:11 np0005538515.localdomain sshd[80570]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:28:11 np0005538515.localdomain sshd[80570]: Accepted publickey for zuul from 38.102.83.114 port 37570 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 08:28:11 np0005538515.localdomain systemd-logind[763]: New session 34 of user zuul.
Nov 28 08:28:11 np0005538515.localdomain systemd[1]: Started Session 34 of User zuul.
Nov 28 08:28:11 np0005538515.localdomain sshd[80570]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 08:28:12 np0005538515.localdomain sudo[80587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnekirilmridttyrufgjjmkmbidbdfes ; /usr/bin/python3
Nov 28 08:28:12 np0005538515.localdomain sudo[80587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:28:12 np0005538515.localdomain python3[80589]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 08:28:14 np0005538515.localdomain sudo[80587]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:21 np0005538515.localdomain sudo[80591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:28:21 np0005538515.localdomain sudo[80591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:28:21 np0005538515.localdomain sudo[80591]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:21 np0005538515.localdomain sudo[80606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:28:21 np0005538515.localdomain sudo[80606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:28:21 np0005538515.localdomain sudo[80606]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:25 np0005538515.localdomain sudo[80654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:28:25 np0005538515.localdomain sudo[80654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:28:25 np0005538515.localdomain sudo[80654]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:28:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:28:27 np0005538515.localdomain podman[80669]: 2025-11-28 08:28:27.983771453 +0000 UTC m=+0.087381571 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:28:28 np0005538515.localdomain systemd[1]: tmp-crun.RmCDmp.mount: Deactivated successfully.
Nov 28 08:28:28 np0005538515.localdomain podman[80670]: 2025-11-28 08:28:28.042192469 +0000 UTC m=+0.145665953 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd)
Nov 28 08:28:28 np0005538515.localdomain podman[80670]: 2025-11-28 08:28:28.056571758 +0000 UTC m=+0.160045232 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4)
Nov 28 08:28:28 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:28:28 np0005538515.localdomain podman[80669]: 2025-11-28 08:28:28.170722217 +0000 UTC m=+0.274332365 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:28:28 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:28:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:28:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:28:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:28:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:28:31 np0005538515.localdomain systemd[1]: tmp-crun.lmgIXF.mount: Deactivated successfully.
Nov 28 08:28:31 np0005538515.localdomain podman[80720]: 2025-11-28 08:28:31.9804972 +0000 UTC m=+0.086684749 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-type=git, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true)
Nov 28 08:28:32 np0005538515.localdomain podman[80719]: 2025-11-28 08:28:31.96021896 +0000 UTC m=+0.073448665 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, container_name=logrotate_crond, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:28:32 np0005538515.localdomain podman[80721]: 2025-11-28 08:28:32.018003047 +0000 UTC m=+0.125164107 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, container_name=iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Nov 28 08:28:32 np0005538515.localdomain podman[80721]: 2025-11-28 08:28:32.030352384 +0000 UTC m=+0.137513414 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true)
Nov 28 08:28:32 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:28:32 np0005538515.localdomain podman[80727]: 2025-11-28 08:28:32.072993307 +0000 UTC m=+0.168629605 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git)
Nov 28 08:28:32 np0005538515.localdomain podman[80720]: 2025-11-28 08:28:32.085749527 +0000 UTC m=+0.191937096 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044)
Nov 28 08:28:32 np0005538515.localdomain podman[80719]: 2025-11-28 08:28:32.096828216 +0000 UTC m=+0.210057991 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z)
Nov 28 08:28:32 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:28:32 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:28:32 np0005538515.localdomain podman[80727]: 2025-11-28 08:28:32.129502315 +0000 UTC m=+0.225138593 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:28:32 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:28:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:28:32 np0005538515.localdomain podman[80812]: 2025-11-28 08:28:32.970531398 +0000 UTC m=+0.078933183 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:28:33 np0005538515.localdomain podman[80812]: 2025-11-28 08:28:33.373532765 +0000 UTC m=+0.481934570 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, release=1761123044, container_name=nova_migration_target)
Nov 28 08:28:33 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:28:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:28:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4386 writes, 20K keys, 4386 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4386 writes, 493 syncs, 8.90 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:28:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:28:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:28:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:28:35 np0005538515.localdomain systemd[1]: tmp-crun.iWD794.mount: Deactivated successfully.
Nov 28 08:28:36 np0005538515.localdomain podman[80837]: 2025-11-28 08:28:36.004668467 +0000 UTC m=+0.103365371 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:28:36 np0005538515.localdomain podman[80835]: 2025-11-28 08:28:35.970651777 +0000 UTC m=+0.077873531 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack 
TripleO Team, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Nov 28 08:28:36 np0005538515.localdomain podman[80837]: 2025-11-28 08:28:36.051444696 +0000 UTC m=+0.150141600 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Nov 28 08:28:36 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:28:36 np0005538515.localdomain podman[80835]: 2025-11-28 08:28:36.10162922 +0000 UTC m=+0.208850914 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044)
Nov 28 08:28:36 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:28:36 np0005538515.localdomain podman[80836]: 2025-11-28 08:28:36.194312073 +0000 UTC m=+0.296136242 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:28:36 np0005538515.localdomain podman[80836]: 2025-11-28 08:28:36.220053579 +0000 UTC m=+0.321877728 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:28:36 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:28:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:28:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.2 total, 600.0 interval
                                                          Cumulative writes: 5246 writes, 23K keys, 5246 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5246 writes, 540 syncs, 9.71 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:28:40 np0005538515.localdomain sudo[80919]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bulllkzyyyxwdprpoxzrbrucxjopqnuw ; /usr/bin/python3
Nov 28 08:28:40 np0005538515.localdomain sudo[80919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:28:40 np0005538515.localdomain python3[80921]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 08:28:43 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 08:28:43 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 08:28:43 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 08:28:44 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 08:28:44 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 08:28:44 np0005538515.localdomain systemd[1]: run-r445c5d9cc9674500ae668767b7f7d736.service: Deactivated successfully.
Nov 28 08:28:44 np0005538515.localdomain systemd[1]: run-r5464bbe0a6d4419991b2b9de385bfc38.service: Deactivated successfully.
Nov 28 08:28:44 np0005538515.localdomain sudo[80919]: pam_unix(sudo:session): session closed for user root
Nov 28 08:28:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:28:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:28:58 np0005538515.localdomain systemd[1]: tmp-crun.5Z7Ib8.mount: Deactivated successfully.
Nov 28 08:28:58 np0005538515.localdomain podman[81075]: 2025-11-28 08:28:58.983201298 +0000 UTC m=+0.088637089 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team)
Nov 28 08:28:59 np0005538515.localdomain podman[81074]: 2025-11-28 08:28:59.031274598 +0000 UTC m=+0.137531624 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:28:59 np0005538515.localdomain podman[81075]: 2025-11-28 08:28:59.042917744 +0000 UTC m=+0.148353545 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public)
Nov 28 08:28:59 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:28:59 np0005538515.localdomain podman[81074]: 2025-11-28 08:28:59.224411911 +0000 UTC m=+0.330668957 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z)
Nov 28 08:28:59 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:29:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:29:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:29:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:29:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:29:02 np0005538515.localdomain podman[81167]: 2025-11-28 08:29:02.987956111 +0000 UTC m=+0.090139346 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public)
Nov 28 08:29:03 np0005538515.localdomain podman[81167]: 2025-11-28 08:29:03.005485377 +0000 UTC m=+0.107668672 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container)
Nov 28 08:29:03 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:29:03 np0005538515.localdomain podman[81175]: 2025-11-28 08:29:03.047918113 +0000 UTC m=+0.136808431 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=)
Nov 28 08:29:03 np0005538515.localdomain podman[81169]: 2025-11-28 08:29:03.104906286 +0000 UTC m=+0.197702094 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 08:29:03 np0005538515.localdomain podman[81169]: 2025-11-28 08:29:03.142525005 +0000 UTC m=+0.235320773 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
maintainer=OpenStack TripleO Team, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:29:03 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:29:03 np0005538515.localdomain podman[81175]: 2025-11-28 08:29:03.157801472 +0000 UTC m=+0.246691800 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 28 08:29:03 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:29:03 np0005538515.localdomain podman[81168]: 2025-11-28 08:29:03.157568465 +0000 UTC m=+0.257013636 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:29:03 np0005538515.localdomain podman[81168]: 2025-11-28 08:29:03.240477098 +0000 UTC m=+0.339922219 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4)
Nov 28 08:29:03 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:29:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:29:03 np0005538515.localdomain podman[81259]: 2025-11-28 08:29:03.974931604 +0000 UTC m=+0.082071139 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, container_name=nova_migration_target, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vcs-type=git, 
url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:29:03 np0005538515.localdomain systemd[1]: tmp-crun.IQO7N0.mount: Deactivated successfully.
Nov 28 08:29:04 np0005538515.localdomain podman[81259]: 2025-11-28 08:29:04.356543448 +0000 UTC m=+0.463682983 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_migration_target, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:29:04 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:29:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:29:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:29:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:29:07 np0005538515.localdomain systemd[1]: tmp-crun.O6OV95.mount: Deactivated successfully.
Nov 28 08:29:07 np0005538515.localdomain podman[81284]: 2025-11-28 08:29:07.028512526 +0000 UTC m=+0.121638358 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:29:07 np0005538515.localdomain podman[81282]: 2025-11-28 08:29:06.993770805 +0000 UTC m=+0.097591814 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:29:07 np0005538515.localdomain podman[81282]: 2025-11-28 08:29:07.078471093 +0000 UTC m=+0.182292072 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 08:29:07 np0005538515.localdomain podman[81283]: 2025-11-28 08:29:07.078594507 +0000 UTC m=+0.175801383 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64)
Nov 28 08:29:07 np0005538515.localdomain podman[81284]: 2025-11-28 08:29:07.092523733 +0000 UTC m=+0.185649565 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, vcs-type=git, architecture=x86_64, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:29:07 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:29:07 np0005538515.localdomain podman[81283]: 2025-11-28 08:29:07.135516257 +0000 UTC m=+0.232723153 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:29:07 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:29:07 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:29:23 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:29:23 np0005538515.localdomain recover_tripleo_nova_virtqemud[81356]: 62642
Nov 28 08:29:23 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:29:23 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:29:24 np0005538515.localdomain sudo[81370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dogzeszgsvgzwvyhgbrruvdolbyiotay ; /usr/bin/python3
Nov 28 08:29:24 np0005538515.localdomain sudo[81370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:29:24 np0005538515.localdomain python3[81372]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 08:29:25 np0005538515.localdomain sudo[81375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:29:25 np0005538515.localdomain sudo[81375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:25 np0005538515.localdomain sudo[81375]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538515.localdomain sudo[81390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:29:25 np0005538515.localdomain sudo[81390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:25 np0005538515.localdomain sudo[81390]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538515.localdomain sudo[81426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:29:25 np0005538515.localdomain sudo[81426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:25 np0005538515.localdomain sudo[81426]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:25 np0005538515.localdomain sudo[81441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:29:25 np0005538515.localdomain sudo[81441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:26 np0005538515.localdomain sudo[81441]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:27 np0005538515.localdomain sudo[81490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:29:27 np0005538515.localdomain sudo[81490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:29:27 np0005538515.localdomain sudo[81490]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:27 np0005538515.localdomain sshd[81505]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:29:27 np0005538515.localdomain sshd[81505]: Invalid user ubuntu from 161.35.116.38 port 36216
Nov 28 08:29:27 np0005538515.localdomain sshd[81505]: Received disconnect from 161.35.116.38 port 36216:11:  [preauth]
Nov 28 08:29:27 np0005538515.localdomain sshd[81505]: Disconnected from invalid user ubuntu 161.35.116.38 port 36216 [preauth]
Nov 28 08:29:28 np0005538515.localdomain rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 08:29:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:29:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:29:29 np0005538515.localdomain podman[81632]: 2025-11-28 08:29:29.984699628 +0000 UTC m=+0.087816765 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd)
Nov 28 08:29:30 np0005538515.localdomain podman[81632]: 2025-11-28 08:29:30.02241433 +0000 UTC m=+0.125531437 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:29:30 np0005538515.localdomain systemd[1]: tmp-crun.d1G0bb.mount: Deactivated successfully.
Nov 28 08:29:30 np0005538515.localdomain podman[81631]: 2025-11-28 08:29:30.041680889 +0000 UTC m=+0.145081115 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, 
io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:29:30 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:29:30 np0005538515.localdomain podman[81631]: 2025-11-28 08:29:30.244375274 +0000 UTC m=+0.347775480 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:29:30 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:29:32 np0005538515.localdomain sudo[81370]: pam_unix(sudo:session): session closed for user root
Nov 28 08:29:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:29:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:29:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:29:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:29:33 np0005538515.localdomain podman[81740]: 2025-11-28 08:29:33.980233108 +0000 UTC m=+0.088735473 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public)
Nov 28 08:29:34 np0005538515.localdomain podman[81741]: 2025-11-28 08:29:34.035145316 +0000 UTC m=+0.140855215 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:29:34 np0005538515.localdomain podman[81741]: 2025-11-28 08:29:34.089537148 +0000 UTC m=+0.195247007 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:29:34 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:29:34 np0005538515.localdomain podman[81742]: 2025-11-28 08:29:34.139172725 +0000 UTC m=+0.242586435 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:29:34 np0005538515.localdomain podman[81743]: 2025-11-28 08:29:34.095280563 +0000 UTC m=+0.197457995 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:29:34 np0005538515.localdomain podman[81740]: 2025-11-28 08:29:34.165785898 +0000 UTC m=+0.274288263 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-cron-container, 
build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Nov 28 08:29:34 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:29:34 np0005538515.localdomain podman[81743]: 2025-11-28 08:29:34.177427605 +0000 UTC m=+0.279605067 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-type=git, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:29:34 np0005538515.localdomain podman[81742]: 2025-11-28 08:29:34.198697934 +0000 UTC m=+0.302111654 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, release=1761123044, io.openshift.expose-services=, container_name=iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:29:34 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:29:34 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:29:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:29:34 np0005538515.localdomain podman[81832]: 2025-11-28 08:29:34.980950331 +0000 UTC m=+0.087948658 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:29:35 np0005538515.localdomain podman[81832]: 2025-11-28 08:29:35.365548856 +0000 UTC m=+0.472547203 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=)
Nov 28 08:29:35 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:29:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:29:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:29:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:29:37 np0005538515.localdomain podman[81855]: 2025-11-28 08:29:37.975645594 +0000 UTC m=+0.083159522 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:29:38 np0005538515.localdomain systemd[1]: tmp-crun.1m3IVo.mount: Deactivated successfully.
Nov 28 08:29:38 np0005538515.localdomain systemd[1]: tmp-crun.gYxxDb.mount: Deactivated successfully.
Nov 28 08:29:38 np0005538515.localdomain podman[81856]: 2025-11-28 08:29:38.04816022 +0000 UTC m=+0.152393488 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Nov 28 08:29:38 np0005538515.localdomain podman[81855]: 2025-11-28 08:29:38.061416096 +0000 UTC m=+0.168929974 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:29:38 np0005538515.localdomain podman[81857]: 2025-11-28 08:29:38.019424902 +0000 UTC m=+0.115249973 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:29:38 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:29:38 np0005538515.localdomain podman[81856]: 2025-11-28 08:29:38.079318492 +0000 UTC m=+0.183551770 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:29:38 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:29:38 np0005538515.localdomain podman[81857]: 2025-11-28 08:29:38.103396239 +0000 UTC m=+0.199221310 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git)
Nov 28 08:29:38 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:30:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:30:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:30:00 np0005538515.localdomain podman[81929]: 2025-11-28 08:30:00.978549272 +0000 UTC m=+0.087266547 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, url=https://www.redhat.com)
Nov 28 08:30:01 np0005538515.localdomain systemd[1]: tmp-crun.Q4dXx5.mount: Deactivated successfully.
Nov 28 08:30:01 np0005538515.localdomain podman[81930]: 2025-11-28 08:30:01.044134686 +0000 UTC m=+0.147242390 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, container_name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:30:01 np0005538515.localdomain podman[81930]: 2025-11-28 08:30:01.056391631 +0000 UTC m=+0.159499365 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044)
Nov 28 08:30:01 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:30:01 np0005538515.localdomain podman[81929]: 2025-11-28 08:30:01.148684471 +0000 UTC m=+0.257401716 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:30:01 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:30:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:30:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:30:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:30:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:30:04 np0005538515.localdomain systemd[1]: tmp-crun.aLqEjQ.mount: Deactivated successfully.
Nov 28 08:30:04 np0005538515.localdomain podman[82020]: 2025-11-28 08:30:04.97954114 +0000 UTC m=+0.078122939 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Nov 28 08:30:04 np0005538515.localdomain podman[82021]: 2025-11-28 08:30:04.994913229 +0000 UTC m=+0.093545379 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:30:05 np0005538515.localdomain podman[82020]: 2025-11-28 08:30:05.037472871 +0000 UTC m=+0.136054650 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Nov 28 08:30:05 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:30:05 np0005538515.localdomain podman[82021]: 2025-11-28 08:30:05.05578831 +0000 UTC m=+0.154420440 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, 
architecture=x86_64, container_name=iscsid, config_id=tripleo_step3, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:30:05 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:30:05 np0005538515.localdomain podman[82019]: 2025-11-28 08:30:05.039687368 +0000 UTC m=+0.145402225 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:30:05 np0005538515.localdomain podman[82025]: 2025-11-28 08:30:05.104360374 +0000 UTC m=+0.197483076 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:30:05 np0005538515.localdomain podman[82019]: 2025-11-28 08:30:05.119624251 +0000 UTC m=+0.225339088 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat 
OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:30:05 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:30:05 np0005538515.localdomain podman[82025]: 2025-11-28 08:30:05.163619236 +0000 UTC m=+0.256741918 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:30:05 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:30:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:30:05 np0005538515.localdomain systemd[1]: tmp-crun.OgtHXq.mount: Deactivated successfully.
Nov 28 08:30:05 np0005538515.localdomain podman[82105]: 2025-11-28 08:30:05.974847538 +0000 UTC m=+0.081603285 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4)
Nov 28 08:30:06 np0005538515.localdomain podman[82105]: 2025-11-28 08:30:06.390543203 +0000 UTC m=+0.497298910 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, container_name=nova_migration_target, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container)
Nov 28 08:30:06 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:30:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:30:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:30:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:30:08 np0005538515.localdomain systemd[1]: tmp-crun.qJ97TB.mount: Deactivated successfully.
Nov 28 08:30:08 np0005538515.localdomain podman[82128]: 2025-11-28 08:30:08.983193968 +0000 UTC m=+0.093408615 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com)
Nov 28 08:30:09 np0005538515.localdomain systemd[1]: tmp-crun.tf2hrs.mount: Deactivated successfully.
Nov 28 08:30:09 np0005538515.localdomain podman[82128]: 2025-11-28 08:30:09.027654397 +0000 UTC m=+0.137869004 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:34:05Z, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:30:09 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:30:09 np0005538515.localdomain podman[82129]: 2025-11-28 08:30:09.073353654 +0000 UTC m=+0.180637481 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:30:09 np0005538515.localdomain podman[82130]: 2025-11-28 08:30:09.031604848 +0000 UTC m=+0.135363788 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:30:09 np0005538515.localdomain podman[82129]: 2025-11-28 08:30:09.095214012 +0000 UTC m=+0.202497909 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, distribution-scope=public, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z)
Nov 28 08:30:09 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:30:09 np0005538515.localdomain podman[82130]: 2025-11-28 08:30:09.114650646 +0000 UTC m=+0.218409596 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:30:09 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:30:26 np0005538515.localdomain sshd[82200]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:30:27 np0005538515.localdomain sshd[82200]: Invalid user eth from 80.94.92.186 port 53370
Nov 28 08:30:27 np0005538515.localdomain sshd[82200]: Connection closed by invalid user eth 80.94.92.186 port 53370 [preauth]
Nov 28 08:30:27 np0005538515.localdomain sudo[82202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:30:27 np0005538515.localdomain sudo[82202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:30:27 np0005538515.localdomain sudo[82202]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:27 np0005538515.localdomain sudo[82217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:30:27 np0005538515.localdomain sudo[82217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:30:28 np0005538515.localdomain sudo[82217]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:28 np0005538515.localdomain sudo[82264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:30:28 np0005538515.localdomain sudo[82264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:30:28 np0005538515.localdomain sudo[82264]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:30 np0005538515.localdomain sudo[82292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qplbwkxcmpdqtjadwhmpkecqfhxgwbkh ; /usr/bin/python3
Nov 28 08:30:30 np0005538515.localdomain sudo[82292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 08:30:30 np0005538515.localdomain python3[82294]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 08:30:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:30:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:30:31 np0005538515.localdomain podman[82297]: 2025-11-28 08:30:31.996636302 +0000 UTC m=+0.096899713 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, 
tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:30:32 np0005538515.localdomain systemd[1]: tmp-crun.uIiXHv.mount: Deactivated successfully.
Nov 28 08:30:32 np0005538515.localdomain podman[82299]: 2025-11-28 08:30:32.040195234 +0000 UTC m=+0.136563745 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, name=rhosp17/openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:30:32 np0005538515.localdomain podman[82299]: 2025-11-28 08:30:32.077551035 +0000 UTC m=+0.173919546 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.12, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:30:32 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:30:32 np0005538515.localdomain podman[82297]: 2025-11-28 08:30:32.170110494 +0000 UTC m=+0.270373895 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 28 08:30:32 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:30:33 np0005538515.localdomain rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 08:30:34 np0005538515.localdomain rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 08:30:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:30:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:30:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:30:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:30:35 np0005538515.localdomain systemd[1]: tmp-crun.3m5ode.mount: Deactivated successfully.
Nov 28 08:30:35 np0005538515.localdomain podman[82471]: 2025-11-28 08:30:35.992711478 +0000 UTC m=+0.102406720 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 28 08:30:36 np0005538515.localdomain systemd[1]: tmp-crun.YPjbCG.mount: Deactivated successfully.
Nov 28 08:30:36 np0005538515.localdomain podman[82471]: 2025-11-28 08:30:36.032218456 +0000 UTC m=+0.141913718 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:30:36 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:30:36 np0005538515.localdomain podman[82472]: 2025-11-28 08:30:36.038191809 +0000 UTC m=+0.145299112 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:30:36 np0005538515.localdomain podman[82473]: 2025-11-28 08:30:36.097945084 +0000 UTC m=+0.201958103 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Nov 28 08:30:36 np0005538515.localdomain podman[82472]: 2025-11-28 08:30:36.116952766 +0000 UTC m=+0.224060049 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:30:36 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:30:36 np0005538515.localdomain podman[82473]: 2025-11-28 08:30:36.135573594 +0000 UTC m=+0.239586613 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=)
Nov 28 08:30:36 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:30:36 np0005538515.localdomain podman[82479]: 2025-11-28 08:30:36.21037261 +0000 UTC m=+0.308613862 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, tcib_managed=true)
Nov 28 08:30:36 np0005538515.localdomain podman[82479]: 2025-11-28 08:30:36.237460339 +0000 UTC m=+0.335701641 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, 
batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:30:36 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:30:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:30:36 np0005538515.localdomain podman[82561]: 2025-11-28 08:30:36.950340155 +0000 UTC m=+0.065409940 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, release=1761123044)
Nov 28 08:30:37 np0005538515.localdomain podman[82561]: 2025-11-28 08:30:37.327516992 +0000 UTC m=+0.442586817 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044)
Nov 28 08:30:37 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:30:37 np0005538515.localdomain sudo[82292]: pam_unix(sudo:session): session closed for user root
Nov 28 08:30:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:30:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:30:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:30:40 np0005538515.localdomain podman[82644]: 2025-11-28 08:30:40.014163131 +0000 UTC m=+0.118775761 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:30:40 np0005538515.localdomain podman[82642]: 2025-11-28 08:30:39.976796799 +0000 UTC m=+0.084228525 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:30:40 np0005538515.localdomain podman[82643]: 2025-11-28 08:30:40.034643096 +0000 UTC m=+0.141133674 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 08:30:40 np0005538515.localdomain podman[82644]: 2025-11-28 08:30:40.052446161 +0000 UTC m=+0.157058771 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:30:40 np0005538515.localdomain podman[82642]: 2025-11-28 08:30:40.063452327 +0000 UTC m=+0.170884033 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, distribution-scope=public, maintainer=OpenStack 
TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:30:40 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:30:40 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:30:40 np0005538515.localdomain podman[82643]: 2025-11-28 08:30:40.109507944 +0000 UTC m=+0.215998522 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Nov 28 08:30:40 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:30:52 np0005538515.localdomain python3[82729]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 28 08:31:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:31:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:31:02 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:31:02 np0005538515.localdomain recover_tripleo_nova_virtqemud[82753]: 62642
Nov 28 08:31:02 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:31:02 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:31:02 np0005538515.localdomain podman[82742]: 2025-11-28 08:31:02.993882889 +0000 UTC m=+0.089510267 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Nov 28 08:31:03 np0005538515.localdomain podman[82742]: 2025-11-28 08:31:03.006337709 +0000 UTC m=+0.101965067 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:31:03 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:31:03 np0005538515.localdomain podman[82741]: 2025-11-28 08:31:03.094550035 +0000 UTC m=+0.195445914 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:31:03 np0005538515.localdomain podman[82741]: 2025-11-28 08:31:03.299362324 +0000 UTC m=+0.400258213 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Nov 28 08:31:03 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:31:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:31:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:31:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:31:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:31:06 np0005538515.localdomain podman[82824]: 2025-11-28 08:31:06.980990102 +0000 UTC m=+0.088464824 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-type=git)
Nov 28 08:31:06 np0005538515.localdomain podman[82824]: 2025-11-28 08:31:06.990325948 +0000 UTC m=+0.097800640 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: tmp-crun.zUGQXa.mount: Deactivated successfully.
Nov 28 08:31:07 np0005538515.localdomain podman[82826]: 2025-11-28 08:31:07.047608728 +0000 UTC m=+0.146745085 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container)
Nov 28 08:31:07 np0005538515.localdomain podman[82826]: 2025-11-28 08:31:07.062001238 +0000 UTC m=+0.161137625 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: tmp-crun.bokU2O.mount: Deactivated successfully.
Nov 28 08:31:07 np0005538515.localdomain podman[82832]: 2025-11-28 08:31:07.150225184 +0000 UTC m=+0.247095173 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, container_name=ceilometer_agent_compute, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=)
Nov 28 08:31:07 np0005538515.localdomain podman[82832]: 2025-11-28 08:31:07.182445819 +0000 UTC m=+0.279315818 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:31:07 np0005538515.localdomain podman[82825]: 2025-11-28 08:31:07.186986138 +0000 UTC m=+0.290660094 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:12:45Z, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:07 np0005538515.localdomain podman[82825]: 2025-11-28 08:31:07.266640442 +0000 UTC m=+0.370314388 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:31:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:31:07 np0005538515.localdomain podman[82917]: 2025-11-28 08:31:07.966288504 +0000 UTC m=+0.070885767 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:31:08 np0005538515.localdomain podman[82917]: 2025-11-28 08:31:08.351451525 +0000 UTC m=+0.456048788 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Nov 28 08:31:08 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:31:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:31:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:31:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:31:10 np0005538515.localdomain podman[82944]: 2025-11-28 08:31:10.982147535 +0000 UTC m=+0.086632448 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:11 np0005538515.localdomain podman[82942]: 2025-11-28 08:31:11.032021919 +0000 UTC m=+0.140328210 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:31:11 np0005538515.localdomain podman[82942]: 2025-11-28 08:31:11.080555163 +0000 UTC m=+0.188861424 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:31:11 np0005538515.localdomain podman[82943]: 2025-11-28 08:31:11.095248232 +0000 UTC m=+0.200749487 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:31:11 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:31:11 np0005538515.localdomain podman[82944]: 2025-11-28 08:31:11.108790346 +0000 UTC m=+0.213275269 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4)
Nov 28 08:31:11 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:31:11 np0005538515.localdomain podman[82943]: 2025-11-28 08:31:11.151323215 +0000 UTC m=+0.256824530 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=)
Nov 28 08:31:11 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:31:11 np0005538515.localdomain systemd[1]: tmp-crun.TgrWGw.mount: Deactivated successfully.
Nov 28 08:31:29 np0005538515.localdomain sudo[83018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:31:29 np0005538515.localdomain sudo[83018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:29 np0005538515.localdomain sudo[83018]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:29 np0005538515.localdomain sudo[83033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:31:29 np0005538515.localdomain sudo[83033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:29 np0005538515.localdomain systemd[1]: tmp-crun.eTtv4t.mount: Deactivated successfully.
Nov 28 08:31:29 np0005538515.localdomain podman[83117]: 2025-11-28 08:31:29.906832197 +0000 UTC m=+0.101994258 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, build-date=2025-09-24T08:57:55, RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public)
Nov 28 08:31:30 np0005538515.localdomain podman[83117]: 2025-11-28 08:31:30.009536506 +0000 UTC m=+0.204698557 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph)
Nov 28 08:31:30 np0005538515.localdomain sudo[83033]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:30 np0005538515.localdomain sudo[83185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:31:30 np0005538515.localdomain sudo[83185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:30 np0005538515.localdomain sudo[83185]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:30 np0005538515.localdomain sudo[83200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:31:30 np0005538515.localdomain sudo[83200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:31 np0005538515.localdomain sudo[83200]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:31 np0005538515.localdomain sudo[83246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:31:31 np0005538515.localdomain sudo[83246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:31:31 np0005538515.localdomain sudo[83246]: pam_unix(sudo:session): session closed for user root
Nov 28 08:31:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:31:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:31:34 np0005538515.localdomain podman[83261]: 2025-11-28 08:31:34.047057908 +0000 UTC m=+0.155248776 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:31:34 np0005538515.localdomain podman[83262]: 2025-11-28 08:31:34.112032574 +0000 UTC m=+0.220325415 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:31:34 np0005538515.localdomain podman[83262]: 2025-11-28 08:31:34.122606947 +0000 UTC m=+0.230899778 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:34 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:31:34 np0005538515.localdomain podman[83261]: 2025-11-28 08:31:34.269466515 +0000 UTC m=+0.377657343 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1)
Nov 28 08:31:34 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:31:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:31:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:31:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:31:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:31:37 np0005538515.localdomain systemd[1]: tmp-crun.AFuGJl.mount: Deactivated successfully.
Nov 28 08:31:38 np0005538515.localdomain podman[83311]: 2025-11-28 08:31:38.041870667 +0000 UTC m=+0.140827084 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:31:38 np0005538515.localdomain podman[83309]: 2025-11-28 08:31:38.086219183 +0000 UTC m=+0.191263267 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible)
Nov 28 08:31:38 np0005538515.localdomain podman[83310]: 2025-11-28 08:31:38.140039528 +0000 UTC m=+0.242315277 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Nov 28 08:31:38 np0005538515.localdomain podman[83310]: 2025-11-28 08:31:38.153545411 +0000 UTC m=+0.255821070 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, release=1761123044, tcib_managed=true, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1)
Nov 28 08:31:38 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:31:38 np0005538515.localdomain podman[83311]: 2025-11-28 08:31:38.165790284 +0000 UTC m=+0.264746641 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z)
Nov 28 08:31:38 np0005538515.localdomain podman[83309]: 2025-11-28 08:31:38.166056132 +0000 UTC m=+0.271100206 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:31:38 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:31:38 np0005538515.localdomain podman[83308]: 2025-11-28 08:31:38.006782685 +0000 UTC m=+0.114311584 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:31:38 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:31:38 np0005538515.localdomain podman[83308]: 2025-11-28 08:31:38.237553468 +0000 UTC m=+0.345082307 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public)
Nov 28 08:31:38 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:31:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:31:38 np0005538515.localdomain podman[83397]: 2025-11-28 08:31:38.967762184 +0000 UTC m=+0.078654974 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Nov 28 08:31:39 np0005538515.localdomain podman[83397]: 2025-11-28 08:31:39.368666987 +0000 UTC m=+0.479559757 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:31:39 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:31:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:31:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:31:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:31:41 np0005538515.localdomain systemd[1]: tmp-crun.bBODCS.mount: Deactivated successfully.
Nov 28 08:31:41 np0005538515.localdomain podman[83424]: 2025-11-28 08:31:41.9821819 +0000 UTC m=+0.084713479 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:31:42 np0005538515.localdomain podman[83423]: 2025-11-28 08:31:41.957011611 +0000 UTC m=+0.065112470 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step5, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:31:42 np0005538515.localdomain podman[83422]: 2025-11-28 08:31:42.030206299 +0000 UTC m=+0.134676267 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Nov 28 08:31:42 np0005538515.localdomain podman[83423]: 2025-11-28 08:31:42.046969831 +0000 UTC m=+0.155070740 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 28 08:31:42 np0005538515.localdomain podman[83422]: 2025-11-28 08:31:42.053918753 +0000 UTC m=+0.158388711 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:31:42 np0005538515.localdomain podman[83424]: 2025-11-28 08:31:42.054315715 +0000 UTC m=+0.156847294 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z)
Nov 28 08:31:42 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:31:42 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:31:42 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:31:53 np0005538515.localdomain sshd[80573]: Received disconnect from 38.102.83.114 port 37570:11: disconnected by user
Nov 28 08:31:53 np0005538515.localdomain sshd[80573]: Disconnected from user zuul 38.102.83.114 port 37570
Nov 28 08:31:53 np0005538515.localdomain sshd[80570]: pam_unix(sshd:session): session closed for user zuul
Nov 28 08:31:53 np0005538515.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Nov 28 08:31:53 np0005538515.localdomain systemd[1]: session-34.scope: Consumed 18.897s CPU time.
Nov 28 08:31:53 np0005538515.localdomain systemd-logind[763]: Session 34 logged out. Waiting for processes to exit.
Nov 28 08:31:53 np0005538515.localdomain systemd-logind[763]: Removed session 34.
Nov 28 08:32:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:32:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:32:04 np0005538515.localdomain systemd[1]: tmp-crun.mJOFaM.mount: Deactivated successfully.
Nov 28 08:32:04 np0005538515.localdomain podman[83517]: 2025-11-28 08:32:04.987792314 +0000 UTC m=+0.091984672 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:05 np0005538515.localdomain podman[83518]: 2025-11-28 08:32:05.107324526 +0000 UTC m=+0.207786551 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, version=17.1.12, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-type=git, distribution-scope=public)
Nov 28 08:32:05 np0005538515.localdomain podman[83518]: 2025-11-28 08:32:05.119445868 +0000 UTC m=+0.219907943 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, version=17.1.12, vcs-type=git, name=rhosp17/openstack-collectd, config_id=tripleo_step3)
Nov 28 08:32:05 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:32:05 np0005538515.localdomain podman[83517]: 2025-11-28 08:32:05.165318799 +0000 UTC m=+0.269511167 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z)
Nov 28 08:32:05 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:32:05 np0005538515.localdomain systemd[1]: tmp-crun.IAAnNu.mount: Deactivated successfully.
Nov 28 08:32:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:32:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:32:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:32:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:32:08 np0005538515.localdomain podman[83591]: 2025-11-28 08:32:08.993933358 +0000 UTC m=+0.102040160 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:09 np0005538515.localdomain podman[83591]: 2025-11-28 08:32:09.032985392 +0000 UTC m=+0.141092224 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:32:09 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:32:09 np0005538515.localdomain podman[83592]: 2025-11-28 08:32:09.089408996 +0000 UTC m=+0.195302610 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z)
Nov 28 08:32:09 np0005538515.localdomain podman[83594]: 2025-11-28 08:32:09.039663275 +0000 UTC m=+0.143684162 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Nov 28 08:32:09 np0005538515.localdomain podman[83592]: 2025-11-28 08:32:09.146253113 +0000 UTC m=+0.252146687 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 28 08:32:09 np0005538515.localdomain podman[83593]: 2025-11-28 08:32:09.153160735 +0000 UTC m=+0.256321666 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 28 08:32:09 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:32:09 np0005538515.localdomain podman[83594]: 2025-11-28 08:32:09.17234098 +0000 UTC m=+0.276361857 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:32:09 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:32:09 np0005538515.localdomain podman[83593]: 2025-11-28 08:32:09.189572117 +0000 UTC m=+0.292733058 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true)
Nov 28 08:32:09 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:32:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:32:09 np0005538515.localdomain podman[83682]: 2025-11-28 08:32:09.971753592 +0000 UTC m=+0.079151301 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:32:10 np0005538515.localdomain podman[83682]: 2025-11-28 08:32:10.324378999 +0000 UTC m=+0.431776628 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:32:10 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:32:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:32:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:32:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:32:12 np0005538515.localdomain systemd[1]: tmp-crun.tBlu43.mount: Deactivated successfully.
Nov 28 08:32:12 np0005538515.localdomain podman[83705]: 2025-11-28 08:32:12.992662896 +0000 UTC m=+0.103666459 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, 
build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:32:13 np0005538515.localdomain podman[83705]: 2025-11-28 08:32:13.017016551 +0000 UTC m=+0.128020114 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:32:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:32:13 np0005538515.localdomain podman[83707]: 2025-11-28 08:32:13.028933115 +0000 UTC m=+0.130755348 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, tcib_managed=true, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=)
Nov 28 08:32:13 np0005538515.localdomain podman[83707]: 2025-11-28 08:32:13.097544932 +0000 UTC m=+0.199367095 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., release=1761123044, 
architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:32:13 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:32:13 np0005538515.localdomain podman[83706]: 2025-11-28 08:32:13.105146504 +0000 UTC m=+0.209153243 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-nova-compute, version=17.1.12)
Nov 28 08:32:13 np0005538515.localdomain podman[83706]: 2025-11-28 08:32:13.188337486 +0000 UTC m=+0.292344235 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Nov 28 08:32:13 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:32:31 np0005538515.localdomain sudo[83778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:32:31 np0005538515.localdomain sudo[83778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:32:31 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:32:31 np0005538515.localdomain sudo[83778]: pam_unix(sudo:session): session closed for user root
Nov 28 08:32:31 np0005538515.localdomain recover_tripleo_nova_virtqemud[83794]: 62642
Nov 28 08:32:31 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:32:31 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:32:32 np0005538515.localdomain sudo[83795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:32:32 np0005538515.localdomain sudo[83795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:32:32 np0005538515.localdomain sudo[83795]: pam_unix(sudo:session): session closed for user root
Nov 28 08:32:33 np0005538515.localdomain sudo[83842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:32:33 np0005538515.localdomain sudo[83842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:32:33 np0005538515.localdomain sudo[83842]: pam_unix(sudo:session): session closed for user root
Nov 28 08:32:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:32:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:32:35 np0005538515.localdomain podman[83857]: 2025-11-28 08:32:35.997286773 +0000 UTC m=+0.096928183 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, distribution-scope=public, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:36 np0005538515.localdomain systemd[1]: tmp-crun.s4PqyK.mount: Deactivated successfully.
Nov 28 08:32:36 np0005538515.localdomain podman[83858]: 2025-11-28 08:32:36.06316075 +0000 UTC m=+0.162060327 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:32:36 np0005538515.localdomain podman[83858]: 2025-11-28 08:32:36.110462245 +0000 UTC m=+0.209361752 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, version=17.1.12)
Nov 28 08:32:36 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:32:36 np0005538515.localdomain podman[83857]: 2025-11-28 08:32:36.226521487 +0000 UTC m=+0.326162907 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:32:36 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:32:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:32:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:32:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:32:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:32:39 np0005538515.localdomain systemd[1]: tmp-crun.5kRtrp.mount: Deactivated successfully.
Nov 28 08:32:40 np0005538515.localdomain podman[83913]: 2025-11-28 08:32:40.004896149 +0000 UTC m=+0.105858168 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Nov 28 08:32:40 np0005538515.localdomain podman[83905]: 2025-11-28 08:32:39.965782825 +0000 UTC m=+0.079541208 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc.)
Nov 28 08:32:40 np0005538515.localdomain podman[83906]: 2025-11-28 08:32:40.028864636 +0000 UTC m=+0.136080748 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:32:40 np0005538515.localdomain podman[83905]: 2025-11-28 08:32:40.050172972 +0000 UTC m=+0.163931375 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:32:40 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:32:40 np0005538515.localdomain podman[83913]: 2025-11-28 08:32:40.101390627 +0000 UTC m=+0.202352676 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4)
Nov 28 08:32:40 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:32:40 np0005538515.localdomain podman[83906]: 2025-11-28 08:32:40.15510879 +0000 UTC m=+0.262324952 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com)
Nov 28 08:32:40 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:32:40 np0005538515.localdomain podman[83910]: 2025-11-28 08:32:40.240987473 +0000 UTC m=+0.344287624 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:32:40 np0005538515.localdomain podman[83910]: 2025-11-28 08:32:40.274970268 +0000 UTC m=+0.378270419 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:32:40 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:32:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:32:40 np0005538515.localdomain systemd[1]: tmp-crun.2EAtwl.mount: Deactivated successfully.
Nov 28 08:32:40 np0005538515.localdomain podman[83995]: 2025-11-28 08:32:40.960335616 +0000 UTC m=+0.073025089 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, version=17.1.12, config_id=tripleo_step4, container_name=nova_migration_target, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:32:41 np0005538515.localdomain podman[83995]: 2025-11-28 08:32:41.330150734 +0000 UTC m=+0.442840187 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Nov 28 08:32:41 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:32:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:32:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:32:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:32:43 np0005538515.localdomain systemd[1]: tmp-crun.VArWLH.mount: Deactivated successfully.
Nov 28 08:32:43 np0005538515.localdomain podman[84020]: 2025-11-28 08:32:43.996047537 +0000 UTC m=+0.095801598 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:32:44 np0005538515.localdomain podman[84018]: 2025-11-28 08:32:44.044287511 +0000 UTC m=+0.149044956 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, release=1761123044)
Nov 28 08:32:44 np0005538515.localdomain podman[84020]: 2025-11-28 08:32:44.073539851 +0000 UTC m=+0.173293912 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, architecture=x86_64, vcs-type=git, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 08:32:44 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:32:44 np0005538515.localdomain podman[84019]: 2025-11-28 08:32:44.091353199 +0000 UTC m=+0.193527645 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=)
Nov 28 08:32:44 np0005538515.localdomain podman[84018]: 2025-11-28 08:32:44.098747977 +0000 UTC m=+0.203505402 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, container_name=ovn_controller, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-ovn-controller, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:32:44 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:32:44 np0005538515.localdomain podman[84019]: 2025-11-28 08:32:44.145506506 +0000 UTC m=+0.247680942 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1)
Nov 28 08:32:44 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:33:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:33:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:33:06 np0005538515.localdomain systemd[1]: tmp-crun.EzsEqG.mount: Deactivated successfully.
Nov 28 08:33:07 np0005538515.localdomain podman[84113]: 2025-11-28 08:33:06.999257418 +0000 UTC m=+0.084912295 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd)
Nov 28 08:33:07 np0005538515.localdomain podman[84112]: 2025-11-28 08:33:07.019399367 +0000 UTC m=+0.122111158 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vcs-type=git)
Nov 28 08:33:07 np0005538515.localdomain podman[84113]: 2025-11-28 08:33:07.08256807 +0000 UTC m=+0.168222907 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 08:33:07 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:33:07 np0005538515.localdomain podman[84112]: 2025-11-28 08:33:07.203524902 +0000 UTC m=+0.306236743 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true)
Nov 28 08:33:07 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:33:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:33:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:33:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:33:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:33:10 np0005538515.localdomain podman[84188]: 2025-11-28 08:33:10.973253108 +0000 UTC m=+0.079458085 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:33:11 np0005538515.localdomain podman[84188]: 2025-11-28 08:33:11.012489395 +0000 UTC m=+0.118694412 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Nov 28 08:33:11 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:33:11 np0005538515.localdomain podman[84189]: 2025-11-28 08:33:11.028323653 +0000 UTC m=+0.132523159 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:33:11 np0005538515.localdomain podman[84187]: 2025-11-28 08:33:11.078436755 +0000 UTC m=+0.185604782 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public)
Nov 28 08:33:11 np0005538515.localdomain podman[84189]: 2025-11-28 08:33:11.0834871 +0000 UTC m=+0.187686656 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:33:11 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:33:11 np0005538515.localdomain podman[84186]: 2025-11-28 08:33:11.129659901 +0000 UTC m=+0.237632233 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:33:11 np0005538515.localdomain podman[84186]: 2025-11-28 08:33:11.135562482 +0000 UTC m=+0.243534844 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc.)
Nov 28 08:33:11 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:33:11 np0005538515.localdomain podman[84187]: 2025-11-28 08:33:11.191174703 +0000 UTC m=+0.298342730 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, release=1761123044)
Nov 28 08:33:11 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:33:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:33:11 np0005538515.localdomain podman[84276]: 2025-11-28 08:33:11.973444062 +0000 UTC m=+0.080326993 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:33:12 np0005538515.localdomain podman[84276]: 2025-11-28 08:33:12.357262741 +0000 UTC m=+0.464145662 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:33:12 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:33:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:33:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:33:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:33:14 np0005538515.localdomain podman[84301]: 2025-11-28 08:33:14.975216111 +0000 UTC m=+0.079807077 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:33:15 np0005538515.localdomain systemd[1]: tmp-crun.EI6kHC.mount: Deactivated successfully.
Nov 28 08:33:15 np0005538515.localdomain podman[84300]: 2025-11-28 08:33:15.040828009 +0000 UTC m=+0.147361414 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4)
Nov 28 08:33:15 np0005538515.localdomain podman[84301]: 2025-11-28 08:33:15.055225282 +0000 UTC m=+0.159816288 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 28 08:33:15 np0005538515.localdomain podman[84302]: 2025-11-28 08:33:15.086807544 +0000 UTC m=+0.185213850 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, 
io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 08:33:15 np0005538515.localdomain podman[84300]: 2025-11-28 08:33:15.091485668 +0000 UTC m=+0.198019033 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1)
Nov 28 08:33:15 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:33:15 np0005538515.localdomain podman[84302]: 2025-11-28 08:33:15.126398682 +0000 UTC m=+0.224805018 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:33:15 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:33:15 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:33:33 np0005538515.localdomain sudo[84371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:33:33 np0005538515.localdomain sudo[84371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:33:33 np0005538515.localdomain sudo[84371]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:33 np0005538515.localdomain sudo[84386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:33:33 np0005538515.localdomain sudo[84386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:33:34 np0005538515.localdomain sudo[84386]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:35 np0005538515.localdomain sudo[84433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:33:35 np0005538515.localdomain sudo[84433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:33:35 np0005538515.localdomain sudo[84433]: pam_unix(sudo:session): session closed for user root
Nov 28 08:33:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:33:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:33:37 np0005538515.localdomain systemd[1]: tmp-crun.sZ2fdM.mount: Deactivated successfully.
Nov 28 08:33:37 np0005538515.localdomain podman[84448]: 2025-11-28 08:33:37.9878454 +0000 UTC m=+0.092350612 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com)
Nov 28 08:33:38 np0005538515.localdomain podman[84449]: 2025-11-28 08:33:38.035162236 +0000 UTC m=+0.138226723 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Nov 28 08:33:38 np0005538515.localdomain podman[84449]: 2025-11-28 08:33:38.049631411 +0000 UTC m=+0.152695828 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, container_name=collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd)
Nov 28 08:33:38 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:33:38 np0005538515.localdomain podman[84448]: 2025-11-28 08:33:38.212928086 +0000 UTC m=+0.317433258 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:33:38 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:33:41 np0005538515.localdomain recover_tripleo_nova_virtqemud[84523]: 62642
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:33:41 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:33:42 np0005538515.localdomain podman[84497]: 2025-11-28 08:33:42.013243633 +0000 UTC m=+0.118265999 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com)
Nov 28 08:33:42 np0005538515.localdomain podman[84497]: 2025-11-28 08:33:42.023700395 +0000 UTC m=+0.128722741 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc.)
Nov 28 08:33:42 np0005538515.localdomain podman[84498]: 2025-11-28 08:33:42.027807042 +0000 UTC m=+0.132335993 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:33:42 np0005538515.localdomain podman[84498]: 2025-11-28 08:33:42.062492219 +0000 UTC m=+0.167021220 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Nov 28 08:33:42 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:33:42 np0005538515.localdomain podman[84499]: 2025-11-28 08:33:42.081398181 +0000 UTC m=+0.179739962 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid)
Nov 28 08:33:42 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:33:42 np0005538515.localdomain podman[84499]: 2025-11-28 08:33:42.095428662 +0000 UTC m=+0.193770473 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, release=1761123044, container_name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public)
Nov 28 08:33:42 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:33:42 np0005538515.localdomain podman[84500]: 2025-11-28 08:33:42.191699324 +0000 UTC m=+0.287158376 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute)
Nov 28 08:33:42 np0005538515.localdomain podman[84500]: 2025-11-28 08:33:42.245503739 +0000 UTC m=+0.340962821 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4)
Nov 28 08:33:42 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:33:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:33:42 np0005538515.localdomain podman[84591]: 2025-11-28 08:33:42.96378889 +0000 UTC m=+0.075064121 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:33:43 np0005538515.localdomain podman[84591]: 2025-11-28 08:33:43.322527668 +0000 UTC m=+0.433802899 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_migration_target)
Nov 28 08:33:43 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:33:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:33:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:33:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:33:45 np0005538515.localdomain podman[84615]: 2025-11-28 08:33:45.979897527 +0000 UTC m=+0.086402029 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red 
Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Nov 28 08:33:46 np0005538515.localdomain podman[84616]: 2025-11-28 08:33:46.035035384 +0000 UTC m=+0.139258486 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, container_name=nova_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 28 08:33:46 np0005538515.localdomain podman[84615]: 2025-11-28 08:33:46.088553641 +0000 UTC m=+0.195058173 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, version=17.1.12, batch=17.1_20251118.1, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 08:33:46 np0005538515.localdomain podman[84616]: 2025-11-28 08:33:46.091411269 +0000 UTC m=+0.195634341 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_compute)
Nov 28 08:33:46 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:33:46 np0005538515.localdomain podman[84617]: 2025-11-28 08:33:46.090686927 +0000 UTC m=+0.191249375 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:33:46 np0005538515.localdomain podman[84617]: 2025-11-28 08:33:46.172570076 +0000 UTC m=+0.273132544 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:33:46 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:33:46 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:33:46 np0005538515.localdomain systemd[1]: tmp-crun.SBZ1l2.mount: Deactivated successfully.
Nov 28 08:34:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:34:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:34:08 np0005538515.localdomain systemd[1]: tmp-crun.kNSsOF.mount: Deactivated successfully.
Nov 28 08:34:08 np0005538515.localdomain podman[84732]: 2025-11-28 08:34:08.981414496 +0000 UTC m=+0.092057263 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 28 08:34:09 np0005538515.localdomain podman[84733]: 2025-11-28 08:34:09.026831693 +0000 UTC m=+0.133252660 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Nov 28 08:34:09 np0005538515.localdomain podman[84733]: 2025-11-28 08:34:09.036209033 +0000 UTC m=+0.142629970 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Nov 28 08:34:09 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:34:09 np0005538515.localdomain podman[84732]: 2025-11-28 08:34:09.16746563 +0000 UTC m=+0.278108357 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, 
io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:34:09 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:34:11 np0005538515.localdomain sshd[84782]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:34:11 np0005538515.localdomain sshd[84782]: Invalid user ethereum from 80.94.92.186 port 56652
Nov 28 08:34:12 np0005538515.localdomain sshd[84782]: Connection closed by invalid user ethereum 80.94.92.186 port 56652 [preauth]
Nov 28 08:34:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:34:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:34:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:34:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:34:13 np0005538515.localdomain systemd[1]: tmp-crun.cVp8kB.mount: Deactivated successfully.
Nov 28 08:34:13 np0005538515.localdomain podman[84785]: 2025-11-28 08:34:13.035269835 +0000 UTC m=+0.133313732 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:34:13 np0005538515.localdomain podman[84786]: 2025-11-28 08:34:13.098942934 +0000 UTC m=+0.190639826 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:34:13 np0005538515.localdomain podman[84786]: 2025-11-28 08:34:13.112404648 +0000 UTC m=+0.204101590 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, tcib_managed=true, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 28 08:34:13 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:34:13 np0005538515.localdomain podman[84790]: 2025-11-28 08:34:13.15601465 +0000 UTC m=+0.243827013 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:34:13 np0005538515.localdomain podman[84784]: 2025-11-28 08:34:13.06726206 +0000 UTC m=+0.168447015 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, container_name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 28 08:34:13 np0005538515.localdomain podman[84785]: 2025-11-28 08:34:13.174734456 +0000 UTC m=+0.272778363 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:34:13 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:34:13 np0005538515.localdomain podman[84784]: 2025-11-28 08:34:13.200458137 +0000 UTC m=+0.301643042 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond)
Nov 28 08:34:13 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:34:13 np0005538515.localdomain podman[84790]: 2025-11-28 08:34:13.215127168 +0000 UTC m=+0.302939571 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:34:13 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:34:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:34:13 np0005538515.localdomain podman[84877]: 2025-11-28 08:34:13.978605859 +0000 UTC m=+0.088161254 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:34:14 np0005538515.localdomain systemd[1]: tmp-crun.jzmiQ9.mount: Deactivated successfully.
Nov 28 08:34:14 np0005538515.localdomain podman[84877]: 2025-11-28 08:34:14.36053848 +0000 UTC m=+0.470093865 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:34:14 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:34:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:34:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:34:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:34:16 np0005538515.localdomain systemd[1]: tmp-crun.QqCI79.mount: Deactivated successfully.
Nov 28 08:34:16 np0005538515.localdomain podman[84901]: 2025-11-28 08:34:16.992970004 +0000 UTC m=+0.092783925 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 08:34:17 np0005538515.localdomain systemd[1]: tmp-crun.raA8dO.mount: Deactivated successfully.
Nov 28 08:34:17 np0005538515.localdomain podman[84900]: 2025-11-28 08:34:17.04647642 +0000 UTC m=+0.146785456 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Nov 28 08:34:17 np0005538515.localdomain podman[84902]: 2025-11-28 08:34:17.093486597 +0000 UTC m=+0.186767588 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:34:17 np0005538515.localdomain podman[84901]: 2025-11-28 08:34:17.101301098 +0000 UTC m=+0.201114979 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:34:17 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:34:17 np0005538515.localdomain podman[84900]: 2025-11-28 08:34:17.116617438 +0000 UTC m=+0.216926464 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:34:17 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:34:17 np0005538515.localdomain podman[84902]: 2025-11-28 08:34:17.159695724 +0000 UTC m=+0.252976755 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent)
Nov 28 08:34:17 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:34:35 np0005538515.localdomain sudo[84971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:34:35 np0005538515.localdomain sudo[84971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:34:35 np0005538515.localdomain sudo[84971]: pam_unix(sudo:session): session closed for user root
Nov 28 08:34:35 np0005538515.localdomain sudo[84986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:34:35 np0005538515.localdomain sudo[84986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:34:36 np0005538515.localdomain sudo[84986]: pam_unix(sudo:session): session closed for user root
Nov 28 08:34:37 np0005538515.localdomain sudo[85033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:34:37 np0005538515.localdomain sudo[85033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:34:37 np0005538515.localdomain sudo[85033]: pam_unix(sudo:session): session closed for user root
Nov 28 08:34:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:34:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:34:40 np0005538515.localdomain podman[85049]: 2025-11-28 08:34:40.016123307 +0000 UTC m=+0.123491999 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, release=1761123044, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:34:40 np0005538515.localdomain podman[85048]: 2025-11-28 08:34:40.035230075 +0000 UTC m=+0.144867137 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, 
release=1761123044, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64)
Nov 28 08:34:40 np0005538515.localdomain podman[85049]: 2025-11-28 08:34:40.036874856 +0000 UTC m=+0.144243548 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, tcib_managed=true, container_name=collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd)
Nov 28 08:34:40 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:34:40 np0005538515.localdomain podman[85048]: 2025-11-28 08:34:40.229937406 +0000 UTC m=+0.339574448 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-type=git)
Nov 28 08:34:40 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:34:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:34:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:34:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:34:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:34:43 np0005538515.localdomain podman[85098]: 2025-11-28 08:34:43.97812827 +0000 UTC m=+0.085802551 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:34:43 np0005538515.localdomain podman[85098]: 2025-11-28 08:34:43.9917807 +0000 UTC m=+0.099455041 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack 
Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, url=https://www.redhat.com, container_name=logrotate_crond, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, architecture=x86_64)
Nov 28 08:34:44 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:34:44 np0005538515.localdomain systemd[1]: tmp-crun.Nlrf8p.mount: Deactivated successfully.
Nov 28 08:34:44 np0005538515.localdomain podman[85101]: 2025-11-28 08:34:44.054660285 +0000 UTC m=+0.153822854 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc.)
Nov 28 08:34:44 np0005538515.localdomain podman[85101]: 2025-11-28 08:34:44.082668697 +0000 UTC m=+0.181831266 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Nov 28 08:34:44 np0005538515.localdomain podman[85099]: 2025-11-28 08:34:44.098259026 +0000 UTC m=+0.202532413 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Nov 28 08:34:44 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:34:44 np0005538515.localdomain podman[85099]: 2025-11-28 08:34:44.131467358 +0000 UTC m=+0.235740735 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:34:44 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:34:44 np0005538515.localdomain podman[85100]: 2025-11-28 08:34:44.148886594 +0000 UTC m=+0.249538449 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:34:44 np0005538515.localdomain podman[85100]: 2025-11-28 08:34:44.188427841 +0000 UTC m=+0.289079656 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
distribution-scope=public, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, container_name=iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:34:44 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:34:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:34:44 np0005538515.localdomain podman[85189]: 2025-11-28 08:34:44.963448406 +0000 UTC m=+0.075544926 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:34:45 np0005538515.localdomain podman[85189]: 2025-11-28 08:34:45.323086082 +0000 UTC m=+0.435182612 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:34:45 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:34:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:34:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:34:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:34:47 np0005538515.localdomain podman[85212]: 2025-11-28 08:34:47.962182781 +0000 UTC m=+0.073040278 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:34:48 np0005538515.localdomain podman[85214]: 2025-11-28 08:34:48.007466844 +0000 UTC m=+0.111571194 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:34:48 np0005538515.localdomain podman[85212]: 2025-11-28 08:34:48.018483003 +0000 UTC m=+0.129340490 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4)
Nov 28 08:34:48 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:34:48 np0005538515.localdomain podman[85214]: 2025-11-28 08:34:48.072782204 +0000 UTC m=+0.176886614 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Nov 28 08:34:48 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:34:48 np0005538515.localdomain podman[85213]: 2025-11-28 08:34:48.077722446 +0000 UTC m=+0.183884859 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Nov 28 08:34:48 np0005538515.localdomain podman[85213]: 2025-11-28 08:34:48.161648568 +0000 UTC m=+0.267810991 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, release=1761123044, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Nov 28 08:34:48 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:35:03 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:35:03 np0005538515.localdomain recover_tripleo_nova_virtqemud[85285]: 62642
Nov 28 08:35:03 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:35:03 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:35:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:35:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:35:10 np0005538515.localdomain podman[85331]: 2025-11-28 08:35:10.992393021 +0000 UTC m=+0.092917149 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1)
Nov 28 08:35:11 np0005538515.localdomain systemd[1]: tmp-crun.kRvbaH.mount: Deactivated successfully.
Nov 28 08:35:11 np0005538515.localdomain podman[85332]: 2025-11-28 08:35:11.048132596 +0000 UTC m=+0.147365355 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:35:11 np0005538515.localdomain podman[85332]: 2025-11-28 08:35:11.060418524 +0000 UTC m=+0.159651263 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Nov 28 08:35:11 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:35:11 np0005538515.localdomain podman[85331]: 2025-11-28 08:35:11.224135731 +0000 UTC m=+0.324659929 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:35:11 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:35:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:35:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:35:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:35:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:35:14 np0005538515.localdomain systemd[1]: tmp-crun.fYgGdF.mount: Deactivated successfully.
Nov 28 08:35:14 np0005538515.localdomain podman[85383]: 2025-11-28 08:35:14.980189816 +0000 UTC m=+0.084999816 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Nov 28 08:35:15 np0005538515.localdomain podman[85383]: 2025-11-28 08:35:15.016533874 +0000 UTC m=+0.121343914 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, version=17.1.12)
Nov 28 08:35:15 np0005538515.localdomain systemd[1]: tmp-crun.L7EoRN.mount: Deactivated successfully.
Nov 28 08:35:15 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:35:15 np0005538515.localdomain podman[85380]: 2025-11-28 08:35:15.040379998 +0000 UTC m=+0.152448422 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, 
description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:35:15 np0005538515.localdomain podman[85380]: 2025-11-28 08:35:15.075826908 +0000 UTC m=+0.187895312 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 
17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:35:15 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:35:15 np0005538515.localdomain podman[85382]: 2025-11-28 08:35:15.088480638 +0000 UTC m=+0.194138714 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, release=1761123044, container_name=iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:35:15 np0005538515.localdomain podman[85382]: 2025-11-28 08:35:15.105376088 +0000 UTC m=+0.211034194 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-iscsid, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:35:15 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:35:15 np0005538515.localdomain podman[85381]: 2025-11-28 08:35:15.125683632 +0000 UTC m=+0.232992209 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:35:15 np0005538515.localdomain podman[85381]: 2025-11-28 08:35:15.183546153 +0000 UTC m=+0.290854760 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1761123044, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, tcib_managed=true)
Nov 28 08:35:15 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:35:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:35:15 np0005538515.localdomain podman[85472]: 2025-11-28 08:35:15.96935648 +0000 UTC m=+0.078522486 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:35:16 np0005538515.localdomain podman[85472]: 2025-11-28 08:35:16.339882391 +0000 UTC m=+0.449048377 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64)
Nov 28 08:35:16 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:35:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:35:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:35:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:35:18 np0005538515.localdomain systemd[1]: tmp-crun.g4Sce6.mount: Deactivated successfully.
Nov 28 08:35:18 np0005538515.localdomain podman[85495]: 2025-11-28 08:35:18.988154142 +0000 UTC m=+0.094461347 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, architecture=x86_64, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 28 08:35:19 np0005538515.localdomain podman[85496]: 2025-11-28 08:35:19.037212191 +0000 UTC m=+0.140921176 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:35:19 np0005538515.localdomain podman[85497]: 2025-11-28 08:35:19.088893582 +0000 UTC m=+0.191544984 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:35:19 np0005538515.localdomain podman[85496]: 2025-11-28 08:35:19.116289335 +0000 UTC m=+0.219998390 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:35:19 np0005538515.localdomain podman[85495]: 2025-11-28 08:35:19.11743932 +0000 UTC m=+0.223746545 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Nov 28 08:35:19 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:35:19 np0005538515.localdomain podman[85497]: 2025-11-28 08:35:19.160843496 +0000 UTC m=+0.263494818 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team)
Nov 28 08:35:19 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:35:19 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:35:37 np0005538515.localdomain sudo[85571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:35:37 np0005538515.localdomain sudo[85571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:35:37 np0005538515.localdomain sudo[85571]: pam_unix(sudo:session): session closed for user root
Nov 28 08:35:37 np0005538515.localdomain sudo[85586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:35:37 np0005538515.localdomain sudo[85586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:35:38 np0005538515.localdomain sudo[85586]: pam_unix(sudo:session): session closed for user root
Nov 28 08:35:38 np0005538515.localdomain sudo[85634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:35:38 np0005538515.localdomain sudo[85634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:35:38 np0005538515.localdomain sudo[85634]: pam_unix(sudo:session): session closed for user root
Nov 28 08:35:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:35:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:35:41 np0005538515.localdomain podman[85649]: 2025-11-28 08:35:41.99132657 +0000 UTC m=+0.096181170 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044)
Nov 28 08:35:42 np0005538515.localdomain systemd[1]: tmp-crun.cZRn8M.mount: Deactivated successfully.
Nov 28 08:35:42 np0005538515.localdomain podman[85650]: 2025-11-28 08:35:42.055223896 +0000 UTC m=+0.155343760 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd)
Nov 28 08:35:42 np0005538515.localdomain podman[85650]: 2025-11-28 08:35:42.065676098 +0000 UTC m=+0.165795912 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Nov 28 08:35:42 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:35:42 np0005538515.localdomain podman[85649]: 2025-11-28 08:35:42.20777048 +0000 UTC m=+0.312625100 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, 
name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Nov 28 08:35:42 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:35:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:35:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:35:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:35:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:35:45 np0005538515.localdomain podman[85697]: 2025-11-28 08:35:45.977508797 +0000 UTC m=+0.080193138 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, release=1761123044, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack 
TripleO Team, name=rhosp17/openstack-cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1)
Nov 28 08:35:46 np0005538515.localdomain podman[85697]: 2025-11-28 08:35:46.01042757 +0000 UTC m=+0.113111871 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: tmp-crun.QYGgRW.mount: Deactivated successfully.
Nov 28 08:35:46 np0005538515.localdomain podman[85698]: 2025-11-28 08:35:46.088164252 +0000 UTC m=+0.187022155 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
container_name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:35:46 np0005538515.localdomain podman[85698]: 2025-11-28 08:35:46.118380442 +0000 UTC m=+0.217238365 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi)
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: tmp-crun.DNY3wl.mount: Deactivated successfully.
Nov 28 08:35:46 np0005538515.localdomain podman[85699]: 2025-11-28 08:35:46.196648429 +0000 UTC m=+0.294517062 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container)
Nov 28 08:35:46 np0005538515.localdomain podman[85700]: 2025-11-28 08:35:46.202352165 +0000 UTC m=+0.295798112 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 08:35:46 np0005538515.localdomain podman[85699]: 2025-11-28 08:35:46.209376491 +0000 UTC m=+0.307245184 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
architecture=x86_64, tcib_managed=true, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible)
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:35:46 np0005538515.localdomain podman[85700]: 2025-11-28 08:35:46.24052818 +0000 UTC m=+0.333974127 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:35:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:35:46 np0005538515.localdomain podman[85789]: 2025-11-28 08:35:46.987218344 +0000 UTC m=+0.093307372 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z)
Nov 28 08:35:47 np0005538515.localdomain podman[85789]: 2025-11-28 08:35:47.353832704 +0000 UTC m=+0.459921682 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z)
Nov 28 08:35:47 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:35:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:35:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:35:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:35:49 np0005538515.localdomain systemd[1]: tmp-crun.cFLmsq.mount: Deactivated successfully.
Nov 28 08:35:49 np0005538515.localdomain podman[85814]: 2025-11-28 08:35:49.988288521 +0000 UTC m=+0.093098534 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:35:50 np0005538515.localdomain podman[85816]: 2025-11-28 08:35:50.038924559 +0000 UTC m=+0.139869353 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:35:50 np0005538515.localdomain podman[85815]: 2025-11-28 08:35:50.096235391 +0000 UTC m=+0.198028291 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Nov 28 08:35:50 np0005538515.localdomain podman[85814]: 2025-11-28 08:35:50.115051851 +0000 UTC m=+0.219861834 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller)
Nov 28 08:35:50 np0005538515.localdomain podman[85815]: 2025-11-28 08:35:50.12641395 +0000 UTC m=+0.228206850 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=)
Nov 28 08:35:50 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:35:50 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:35:50 np0005538515.localdomain podman[85816]: 2025-11-28 08:35:50.15011444 +0000 UTC m=+0.251059254 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Nov 28 08:35:50 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:36:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:36:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:36:13 np0005538515.localdomain podman[85935]: 2025-11-28 08:36:13.016754628 +0000 UTC m=+0.118046153 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 08:36:13 np0005538515.localdomain podman[85935]: 2025-11-28 08:36:13.030495131 +0000 UTC m=+0.131786646 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 08:36:13 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:36:13 np0005538515.localdomain podman[85934]: 2025-11-28 08:36:13.122502032 +0000 UTC m=+0.228366497 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:36:13 np0005538515.localdomain podman[85934]: 2025-11-28 08:36:13.315058597 +0000 UTC m=+0.420923062 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:36:13 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:36:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:36:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:36:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:36:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:36:16 np0005538515.localdomain podman[85985]: 2025-11-28 08:36:16.971462697 +0000 UTC m=+0.080448827 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Nov 28 08:36:16 np0005538515.localdomain podman[85985]: 2025-11-28 08:36:16.982645981 +0000 UTC m=+0.091632111 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:36:17 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:36:17 np0005538515.localdomain podman[85987]: 2025-11-28 08:36:17.033251278 +0000 UTC m=+0.135933443 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=)
Nov 28 08:36:17 np0005538515.localdomain podman[85987]: 2025-11-28 08:36:17.071502005 +0000 UTC m=+0.174184160 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:36:17 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:36:17 np0005538515.localdomain podman[85986]: 2025-11-28 08:36:17.078617064 +0000 UTC m=+0.182634770 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, 
build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:36:17 np0005538515.localdomain podman[85988]: 2025-11-28 08:36:17.140838578 +0000 UTC m=+0.243293666 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute)
Nov 28 08:36:17 np0005538515.localdomain podman[85986]: 2025-11-28 08:36:17.164463125 +0000 UTC m=+0.268480831 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:36:17 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:36:17 np0005538515.localdomain podman[85988]: 2025-11-28 08:36:17.176477275 +0000 UTC m=+0.278932413 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:36:17 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:36:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:36:18 np0005538515.localdomain systemd[1]: tmp-crun.rUwpAh.mount: Deactivated successfully.
Nov 28 08:36:18 np0005538515.localdomain podman[86077]: 2025-11-28 08:36:18.027335004 +0000 UTC m=+0.133303793 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team)
Nov 28 08:36:18 np0005538515.localdomain podman[86077]: 2025-11-28 08:36:18.396807992 +0000 UTC m=+0.502776821 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4)
Nov 28 08:36:18 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:36:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:36:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:36:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:36:20 np0005538515.localdomain podman[86099]: 2025-11-28 08:36:20.987009697 +0000 UTC m=+0.092984572 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller)
Nov 28 08:36:21 np0005538515.localdomain podman[86099]: 2025-11-28 08:36:21.03488275 +0000 UTC m=+0.140857655 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, container_name=ovn_controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:36:21 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:36:21 np0005538515.localdomain podman[86101]: 2025-11-28 08:36:21.045311621 +0000 UTC m=+0.144785105 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:36:21 np0005538515.localdomain podman[86100]: 2025-11-28 08:36:21.099994704 +0000 UTC m=+0.202122250 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1)
Nov 28 08:36:21 np0005538515.localdomain podman[86101]: 2025-11-28 08:36:21.125827869 +0000 UTC m=+0.225301403 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, distribution-scope=public)
Nov 28 08:36:21 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:36:21 np0005538515.localdomain podman[86100]: 2025-11-28 08:36:21.179368306 +0000 UTC m=+0.281495842 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:36:21 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:36:38 np0005538515.localdomain sudo[86171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:36:38 np0005538515.localdomain sudo[86171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:36:38 np0005538515.localdomain sudo[86171]: pam_unix(sudo:session): session closed for user root
Nov 28 08:36:39 np0005538515.localdomain sudo[86186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:36:39 np0005538515.localdomain sudo[86186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:36:39 np0005538515.localdomain sudo[86186]: pam_unix(sudo:session): session closed for user root
Nov 28 08:36:40 np0005538515.localdomain sudo[86233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:36:40 np0005538515.localdomain sudo[86233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:36:40 np0005538515.localdomain sudo[86233]: pam_unix(sudo:session): session closed for user root
Nov 28 08:36:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:36:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:36:43 np0005538515.localdomain systemd[1]: tmp-crun.H5meR1.mount: Deactivated successfully.
Nov 28 08:36:43 np0005538515.localdomain podman[86249]: 2025-11-28 08:36:43.988943486 +0000 UTC m=+0.091037081 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, architecture=x86_64, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:36:44 np0005538515.localdomain podman[86249]: 2025-11-28 08:36:44.028545154 +0000 UTC m=+0.130638739 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:36:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:36:44 np0005538515.localdomain podman[86248]: 2025-11-28 08:36:44.032206358 +0000 UTC m=+0.134719526 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:36:44 np0005538515.localdomain podman[86248]: 2025-11-28 08:36:44.224471123 +0000 UTC m=+0.326984301 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 28 08:36:44 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:36:47 np0005538515.localdomain recover_tripleo_nova_virtqemud[86322]: 62642
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:36:47 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:36:47 np0005538515.localdomain podman[86300]: 2025-11-28 08:36:47.986270066 +0000 UTC m=+0.090430053 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:36:48 np0005538515.localdomain podman[86299]: 2025-11-28 08:36:48.035352266 +0000 UTC m=+0.141309609 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, tcib_managed=true)
Nov 28 08:36:48 np0005538515.localdomain podman[86299]: 2025-11-28 08:36:48.046331764 +0000 UTC m=+0.152289167 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, release=1761123044, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 08:36:48 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:36:48 np0005538515.localdomain podman[86301]: 2025-11-28 08:36:48.088722618 +0000 UTC m=+0.189406599 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:36:48 np0005538515.localdomain podman[86301]: 2025-11-28 08:36:48.10242304 +0000 UTC m=+0.203107001 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, architecture=x86_64, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Nov 28 08:36:48 np0005538515.localdomain podman[86302]: 2025-11-28 08:36:48.135140526 +0000 UTC m=+0.232288998 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64)
Nov 28 08:36:48 np0005538515.localdomain podman[86300]: 2025-11-28 08:36:48.163244571 +0000 UTC m=+0.267404528 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, version=17.1.12, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:36:48 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:36:48 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:36:48 np0005538515.localdomain podman[86302]: 2025-11-28 08:36:48.192417999 +0000 UTC m=+0.289566521 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Nov 28 08:36:48 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:36:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:36:48 np0005538515.localdomain podman[86394]: 2025-11-28 08:36:48.958261752 +0000 UTC m=+0.071155790 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:36:49 np0005538515.localdomain podman[86394]: 2025-11-28 08:36:49.350464239 +0000 UTC m=+0.463358217 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute)
Nov 28 08:36:49 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:36:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:36:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:36:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:36:51 np0005538515.localdomain podman[86419]: 2025-11-28 08:36:51.978024813 +0000 UTC m=+0.081339683 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:36:52 np0005538515.localdomain podman[86419]: 2025-11-28 08:36:52.028036602 +0000 UTC m=+0.131351482 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vcs-type=git, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:14:25Z)
Nov 28 08:36:52 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:36:52 np0005538515.localdomain podman[86418]: 2025-11-28 08:36:52.081752225 +0000 UTC m=+0.187491480 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:36:52 np0005538515.localdomain podman[86418]: 2025-11-28 08:36:52.109496728 +0000 UTC m=+0.215235953 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:36:52 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:36:52 np0005538515.localdomain podman[86417]: 2025-11-28 08:36:52.030268691 +0000 UTC m=+0.138741340 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller)
Nov 28 08:36:52 np0005538515.localdomain podman[86417]: 2025-11-28 08:36:52.169469503 +0000 UTC m=+0.277942142 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Nov 28 08:36:52 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:37:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:37:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:37:14 np0005538515.localdomain systemd[1]: tmp-crun.YB11PN.mount: Deactivated successfully.
Nov 28 08:37:14 np0005538515.localdomain podman[86538]: 2025-11-28 08:37:14.988765846 +0000 UTC m=+0.095996084 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1)
Nov 28 08:37:14 np0005538515.localdomain podman[86538]: 2025-11-28 08:37:14.998245807 +0000 UTC m=+0.105476045 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, tcib_managed=true)
Nov 28 08:37:15 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:37:15 np0005538515.localdomain podman[86537]: 2025-11-28 08:37:14.96289718 +0000 UTC m=+0.075292547 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:37:15 np0005538515.localdomain podman[86537]: 2025-11-28 08:37:15.175007097 +0000 UTC m=+0.287402514 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:37:15 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:37:15 np0005538515.localdomain systemd[1]: tmp-crun.enHL1F.mount: Deactivated successfully.
Nov 28 08:37:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:37:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:37:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:37:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:37:18 np0005538515.localdomain systemd[1]: tmp-crun.lempb3.mount: Deactivated successfully.
Nov 28 08:37:18 np0005538515.localdomain podman[86587]: 2025-11-28 08:37:18.995845746 +0000 UTC m=+0.095817880 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git)
Nov 28 08:37:19 np0005538515.localdomain podman[86589]: 2025-11-28 08:37:19.015626375 +0000 UTC m=+0.108955424 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:37:19 np0005538515.localdomain podman[86587]: 2025-11-28 08:37:19.046598628 +0000 UTC m=+0.146570722 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, 
com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:37:19 np0005538515.localdomain podman[86589]: 2025-11-28 08:37:19.07009408 +0000 UTC m=+0.163423169 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z)
Nov 28 08:37:19 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:37:19 np0005538515.localdomain podman[86588]: 2025-11-28 08:37:19.094061098 +0000 UTC m=+0.188400318 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team)
Nov 28 08:37:19 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:37:19 np0005538515.localdomain podman[86588]: 2025-11-28 08:37:19.131423667 +0000 UTC m=+0.225762897 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:37:19 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:37:19 np0005538515.localdomain podman[86586]: 2025-11-28 08:37:19.150112243 +0000 UTC m=+0.253052088 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, container_name=logrotate_crond, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Nov 28 08:37:19 np0005538515.localdomain podman[86586]: 2025-11-28 08:37:19.158260532 +0000 UTC m=+0.261200397 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, tcib_managed=true, distribution-scope=public, 
vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 08:37:19 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:37:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:37:19 np0005538515.localdomain podman[86675]: 2025-11-28 08:37:19.979205432 +0000 UTC m=+0.081204229 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team)
Nov 28 08:37:20 np0005538515.localdomain podman[86675]: 2025-11-28 08:37:20.363428563 +0000 UTC m=+0.465427310 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible)
Nov 28 08:37:20 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:37:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:37:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:37:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:37:22 np0005538515.localdomain systemd[1]: tmp-crun.3GP8Us.mount: Deactivated successfully.
Nov 28 08:37:22 np0005538515.localdomain podman[86698]: 2025-11-28 08:37:22.985107157 +0000 UTC m=+0.091946560 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64)
Nov 28 08:37:23 np0005538515.localdomain podman[86700]: 2025-11-28 08:37:23.032104692 +0000 UTC m=+0.132004562 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:37:23 np0005538515.localdomain podman[86698]: 2025-11-28 08:37:23.036523929 +0000 UTC m=+0.143363332 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:37:23 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:37:23 np0005538515.localdomain podman[86699]: 2025-11-28 08:37:23.092951245 +0000 UTC m=+0.196482156 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:37:23 np0005538515.localdomain podman[86700]: 2025-11-28 08:37:23.107559964 +0000 UTC m=+0.207459844 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Nov 28 08:37:23 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:37:23 np0005538515.localdomain podman[86699]: 2025-11-28 08:37:23.124523476 +0000 UTC m=+0.228054427 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:37:23 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:37:40 np0005538515.localdomain sudo[86770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:37:40 np0005538515.localdomain sudo[86770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:37:40 np0005538515.localdomain sudo[86770]: pam_unix(sudo:session): session closed for user root
Nov 28 08:37:40 np0005538515.localdomain sudo[86785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:37:40 np0005538515.localdomain sudo[86785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:37:41 np0005538515.localdomain sudo[86785]: pam_unix(sudo:session): session closed for user root
Nov 28 08:37:42 np0005538515.localdomain sudo[86831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:37:42 np0005538515.localdomain sudo[86831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:37:42 np0005538515.localdomain sudo[86831]: pam_unix(sudo:session): session closed for user root
Nov 28 08:37:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:37:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:37:45 np0005538515.localdomain podman[86847]: 2025-11-28 08:37:45.986924425 +0000 UTC m=+0.091369721 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:37:46 np0005538515.localdomain podman[86847]: 2025-11-28 08:37:46.001555465 +0000 UTC m=+0.106000791 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:37:46 np0005538515.localdomain podman[86846]: 2025-11-28 08:37:46.038720218 +0000 UTC m=+0.143132473 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr)
Nov 28 08:37:46 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:37:46 np0005538515.localdomain podman[86846]: 2025-11-28 08:37:46.257123638 +0000 UTC m=+0.361535923 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git)
Nov 28 08:37:46 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:37:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:37:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:37:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:37:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:37:49 np0005538515.localdomain systemd[1]: tmp-crun.1qtNwE.mount: Deactivated successfully.
Nov 28 08:37:50 np0005538515.localdomain podman[86895]: 2025-11-28 08:37:49.999672699 +0000 UTC m=+0.101495063 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com)
Nov 28 08:37:50 np0005538515.localdomain podman[86894]: 2025-11-28 08:37:49.974496164 +0000 UTC m=+0.079090953 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron)
Nov 28 08:37:50 np0005538515.localdomain podman[86895]: 2025-11-28 08:37:50.057448326 +0000 UTC m=+0.159270660 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:37:50 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:37:50 np0005538515.localdomain podman[86896]: 2025-11-28 08:37:50.033662964 +0000 UTC m=+0.131754823 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:37:50 np0005538515.localdomain podman[86894]: 2025-11-28 08:37:50.111477979 +0000 UTC m=+0.216072788 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1)
Nov 28 08:37:50 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:37:50 np0005538515.localdomain podman[86896]: 2025-11-28 08:37:50.16256131 +0000 UTC m=+0.260653149 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:37:50 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:37:50 np0005538515.localdomain podman[86897]: 2025-11-28 08:37:50.245821902 +0000 UTC m=+0.340133415 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4)
Nov 28 08:37:50 np0005538515.localdomain podman[86897]: 2025-11-28 08:37:50.273465993 +0000 UTC m=+0.367777486 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Nov 28 08:37:50 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:37:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:37:50 np0005538515.localdomain podman[86980]: 2025-11-28 08:37:50.987961136 +0000 UTC m=+0.092664913 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 28 08:37:51 np0005538515.localdomain podman[86980]: 2025-11-28 08:37:51.367233005 +0000 UTC m=+0.471936752 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:37:51 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:37:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:37:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:37:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:37:53 np0005538515.localdomain podman[87006]: 2025-11-28 08:37:53.997854893 +0000 UTC m=+0.095989104 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:37:54 np0005538515.localdomain podman[87006]: 2025-11-28 08:37:54.042900989 +0000 UTC m=+0.141035070 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, release=1761123044, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:37:54 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:37:54 np0005538515.localdomain podman[87004]: 2025-11-28 08:37:54.044462948 +0000 UTC m=+0.146736786 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 28 08:37:54 np0005538515.localdomain podman[87005]: 2025-11-28 08:37:54.102271956 +0000 UTC m=+0.200459388 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12)
Nov 28 08:37:54 np0005538515.localdomain podman[87005]: 2025-11-28 08:37:54.132682042 +0000 UTC m=+0.230869494 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12)
Nov 28 08:37:54 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:37:54 np0005538515.localdomain podman[87004]: 2025-11-28 08:37:54.183424963 +0000 UTC m=+0.285698791 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12)
Nov 28 08:37:54 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:38:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:38:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:38:16 np0005538515.localdomain podman[87123]: 2025-11-28 08:38:16.97531582 +0000 UTC m=+0.080007062 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 28 08:38:17 np0005538515.localdomain podman[87124]: 2025-11-28 08:38:17.028488876 +0000 UTC m=+0.132640811 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:38:17 np0005538515.localdomain podman[87124]: 2025-11-28 08:38:17.036579185 +0000 UTC m=+0.140731090 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, release=1761123044, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:38:17 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:38:17 np0005538515.localdomain podman[87123]: 2025-11-28 08:38:17.206099932 +0000 UTC m=+0.310791154 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:38:17 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:38:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:38:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:38:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:38:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:38:20 np0005538515.localdomain podman[87170]: 2025-11-28 08:38:20.988651852 +0000 UTC m=+0.092387754 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:38:21 np0005538515.localdomain podman[87170]: 2025-11-28 08:38:21.000909288 +0000 UTC m=+0.104645150 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:38:21 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:38:21 np0005538515.localdomain systemd[1]: tmp-crun.G0NVCq.mount: Deactivated successfully.
Nov 28 08:38:21 np0005538515.localdomain podman[87172]: 2025-11-28 08:38:21.055661244 +0000 UTC m=+0.154138024 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:38:21 np0005538515.localdomain podman[87172]: 2025-11-28 08:38:21.068542119 +0000 UTC m=+0.167018949 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64)
Nov 28 08:38:21 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:38:21 np0005538515.localdomain podman[87173]: 2025-11-28 08:38:21.149374686 +0000 UTC m=+0.242268234 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:38:21 np0005538515.localdomain podman[87173]: 2025-11-28 08:38:21.184522618 +0000 UTC m=+0.277416136 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1)
Nov 28 08:38:21 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:38:21 np0005538515.localdomain podman[87171]: 2025-11-28 08:38:21.198028964 +0000 UTC m=+0.295910516 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12)
Nov 28 08:38:21 np0005538515.localdomain podman[87171]: 2025-11-28 08:38:21.224393135 +0000 UTC m=+0.322274687 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:38:21 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:38:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:38:21 np0005538515.localdomain podman[87261]: 2025-11-28 08:38:21.985039558 +0000 UTC m=+0.090819745 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:38:22 np0005538515.localdomain podman[87261]: 2025-11-28 08:38:22.391604218 +0000 UTC m=+0.497384395 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Nov 28 08:38:22 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:38:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:38:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:38:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:38:24 np0005538515.localdomain podman[87285]: 2025-11-28 08:38:24.980112631 +0000 UTC m=+0.083939344 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:38:25 np0005538515.localdomain podman[87284]: 2025-11-28 08:38:25.032061089 +0000 UTC m=+0.139325158 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64)
Nov 28 08:38:25 np0005538515.localdomain podman[87284]: 2025-11-28 08:38:25.060478073 +0000 UTC m=+0.167742202 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:38:25 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:38:25 np0005538515.localdomain podman[87286]: 2025-11-28 08:38:25.078507408 +0000 UTC m=+0.180825615 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:38:25 np0005538515.localdomain podman[87285]: 2025-11-28 08:38:25.112125752 +0000 UTC m=+0.215952445 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 28 08:38:25 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:38:25 np0005538515.localdomain podman[87286]: 2025-11-28 08:38:25.141438904 +0000 UTC m=+0.243757071 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.)
Nov 28 08:38:25 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:38:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:38:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 398 writes, 1523 keys, 398 commit groups, 1.0 writes per commit group, ingest: 1.93 MB, 0.00 MB/s
                                                          Interval WAL: 398 writes, 144 syncs, 2.76 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:38:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:38:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 535 writes, 2212 keys, 535 commit groups, 1.0 writes per commit group, ingest: 2.71 MB, 0.00 MB/s
                                                          Interval WAL: 535 writes, 189 syncs, 2.83 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:38:42 np0005538515.localdomain sudo[87358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:38:42 np0005538515.localdomain sudo[87358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:38:42 np0005538515.localdomain sudo[87358]: pam_unix(sudo:session): session closed for user root
Nov 28 08:38:42 np0005538515.localdomain sudo[87373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:38:42 np0005538515.localdomain sudo[87373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:38:42 np0005538515.localdomain sudo[87373]: pam_unix(sudo:session): session closed for user root
Nov 28 08:38:46 np0005538515.localdomain sudo[87419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:38:46 np0005538515.localdomain sudo[87419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:38:46 np0005538515.localdomain sudo[87419]: pam_unix(sudo:session): session closed for user root
Nov 28 08:38:47 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=162.142.125.193 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=33791 DF PROTO=TCP SPT=28136 DPT=19885 SEQ=1257399283 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC96FA29A000000000103030A) 
Nov 28 08:38:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:38:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:38:47 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:38:47 np0005538515.localdomain recover_tripleo_nova_virtqemud[87437]: 62642
Nov 28 08:38:47 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:38:47 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:38:47 np0005538515.localdomain podman[87435]: 2025-11-28 08:38:47.996915529 +0000 UTC m=+0.093951392 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:38:48 np0005538515.localdomain podman[87435]: 2025-11-28 08:38:48.033843845 +0000 UTC m=+0.130879708 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:38:48 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:38:48 np0005538515.localdomain podman[87434]: 2025-11-28 08:38:48.052097916 +0000 UTC m=+0.149861501 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr)
Nov 28 08:38:48 np0005538515.localdomain podman[87434]: 2025-11-28 08:38:48.25923894 +0000 UTC m=+0.357002485 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:38:48 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:38:48 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=162.142.125.193 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=33792 DF PROTO=TCP SPT=28136 DPT=19885 SEQ=1257399283 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC96FA697000000000103030A) 
Nov 28 08:38:48 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=162.142.125.193 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=45671 DF PROTO=TCP SPT=28148 DPT=19885 SEQ=2937239539 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC96FA6FA000000000103030A) 
Nov 28 08:38:49 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=162.142.125.193 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=45672 DF PROTO=TCP SPT=28148 DPT=19885 SEQ=2937239539 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC96FAB17000000000103030A) 
Nov 28 08:38:49 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=162.142.125.193 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=11619 DF PROTO=TCP SPT=28156 DPT=19885 SEQ=2713186779 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC96FAC80000000000103030A) 
Nov 28 08:38:50 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=162.142.125.193 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=11620 DF PROTO=TCP SPT=28156 DPT=19885 SEQ=2713186779 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC96FB097000000000103030A) 
Nov 28 08:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:38:51 np0005538515.localdomain systemd[1]: tmp-crun.OKXNL1.mount: Deactivated successfully.
Nov 28 08:38:52 np0005538515.localdomain podman[87486]: 2025-11-28 08:38:51.995402124 +0000 UTC m=+0.096971065 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4)
Nov 28 08:38:52 np0005538515.localdomain podman[87486]: 2025-11-28 08:38:52.030460043 +0000 UTC m=+0.132028994 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi)
Nov 28 08:38:52 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:38:52 np0005538515.localdomain podman[87487]: 2025-11-28 08:38:52.049017494 +0000 UTC m=+0.144131636 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid)
Nov 28 08:38:52 np0005538515.localdomain podman[87488]: 2025-11-28 08:38:52.111545248 +0000 UTC m=+0.201736548 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, 
io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:38:52 np0005538515.localdomain podman[87485]: 2025-11-28 08:38:52.084047152 +0000 UTC m=+0.186281233 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.buildah.version=1.41.4, container_name=logrotate_crond, tcib_managed=true)
Nov 28 08:38:52 np0005538515.localdomain podman[87488]: 2025-11-28 08:38:52.141631423 +0000 UTC m=+0.231822743 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Nov 28 08:38:52 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:38:52 np0005538515.localdomain podman[87485]: 2025-11-28 08:38:52.168431638 +0000 UTC m=+0.270665739 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, container_name=logrotate_crond, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 28 08:38:52 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:38:52 np0005538515.localdomain podman[87487]: 2025-11-28 08:38:52.187600578 +0000 UTC m=+0.282714740 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Nov 28 08:38:52 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:38:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:38:52 np0005538515.localdomain podman[87576]: 2025-11-28 08:38:52.966684009 +0000 UTC m=+0.078381703 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 28 08:38:52 np0005538515.localdomain systemd[1]: tmp-crun.UDGIAy.mount: Deactivated successfully.
Nov 28 08:38:53 np0005538515.localdomain podman[87576]: 2025-11-28 08:38:53.328682617 +0000 UTC m=+0.440380281 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:38:53 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:38:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:38:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:38:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:38:55 np0005538515.localdomain podman[87599]: 2025-11-28 08:38:55.983204962 +0000 UTC m=+0.088621208 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, container_name=ovn_controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 
17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:38:56 np0005538515.localdomain podman[87600]: 2025-11-28 08:38:56.030446215 +0000 UTC m=+0.131980192 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z)
Nov 28 08:38:56 np0005538515.localdomain podman[87601]: 2025-11-28 08:38:56.083156687 +0000 UTC m=+0.182936439 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:38:56 np0005538515.localdomain podman[87600]: 2025-11-28 08:38:56.113460059 +0000 UTC m=+0.214994036 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true)
Nov 28 08:38:56 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:38:56 np0005538515.localdomain podman[87599]: 2025-11-28 08:38:56.136692054 +0000 UTC m=+0.242108350 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4)
Nov 28 08:38:56 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:38:56 np0005538515.localdomain podman[87601]: 2025-11-28 08:38:56.158892147 +0000 UTC m=+0.258671859 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, tcib_managed=true)
Nov 28 08:38:56 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:39:03 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=167.94.138.32 DST=38.102.83.53 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=47924 DF PROTO=TCP SPT=53598 DPT=19885 SEQ=1748276739 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AA2304D31000000000103030A) 
Nov 28 08:39:04 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=167.94.138.32 DST=38.102.83.53 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=47925 DF PROTO=TCP SPT=53598 DPT=19885 SEQ=1748276739 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AA230513B000000000103030A) 
Nov 28 08:39:05 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=167.94.138.32 DST=38.102.83.53 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=21884 DF PROTO=TCP SPT=53610 DPT=19885 SEQ=3538641033 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AA2305766000000000103030A) 
Nov 28 08:39:07 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=167.94.138.32 DST=38.102.83.53 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=21885 DF PROTO=TCP SPT=53610 DPT=19885 SEQ=3538641033 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AA2305B7B000000000103030A) 
Nov 28 08:39:07 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=167.94.138.32 DST=38.102.83.53 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=24914 DF PROTO=TCP SPT=53624 DPT=19885 SEQ=3640415225 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AA2305BED000000000103030A) 
Nov 28 08:39:08 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=167.94.138.32 DST=38.102.83.53 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=24915 DF PROTO=TCP SPT=53624 DPT=19885 SEQ=3640415225 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AA2305FFA000000000103030A) 
Nov 28 08:39:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:39:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:39:18 np0005538515.localdomain podman[87721]: 2025-11-28 08:39:18.975891979 +0000 UTC m=+0.085029046 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:39:18 np0005538515.localdomain podman[87721]: 2025-11-28 08:39:18.983809063 +0000 UTC m=+0.092946110 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 28 08:39:18 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:39:19 np0005538515.localdomain systemd[1]: tmp-crun.6jvD6m.mount: Deactivated successfully.
Nov 28 08:39:19 np0005538515.localdomain podman[87720]: 2025-11-28 08:39:19.040474506 +0000 UTC m=+0.149816510 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Nov 28 08:39:19 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=66.132.153.138 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=48645 DF PROTO=TCP SPT=41584 DPT=19885 SEQ=2558183826 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC990113C000000000103030A) 
Nov 28 08:39:19 np0005538515.localdomain podman[87720]: 2025-11-28 08:39:19.256562615 +0000 UTC m=+0.365904639 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:39:19 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:39:20 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=66.132.153.138 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=12011 DF PROTO=TCP SPT=41618 DPT=19885 SEQ=3498461950 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC990152E000000000103030A) 
Nov 28 08:39:21 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=66.132.153.138 DST=38.102.83.53 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=48919 DF PROTO=TCP SPT=41640 DPT=19885 SEQ=2596275895 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AC9901922000000000103030A) 
Nov 28 08:39:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:39:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:39:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:39:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:39:22 np0005538515.localdomain podman[87772]: 2025-11-28 08:39:22.989758518 +0000 UTC m=+0.090473544 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, tcib_managed=true)
Nov 28 08:39:23 np0005538515.localdomain podman[87772]: 2025-11-28 08:39:23.018392379 +0000 UTC m=+0.119107405 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, 
distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Nov 28 08:39:23 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:39:23 np0005538515.localdomain podman[87771]: 2025-11-28 08:39:23.040340325 +0000 UTC m=+0.142056193 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, tcib_managed=true, build-date=2025-11-18T23:44:13Z, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:39:23 np0005538515.localdomain podman[87771]: 2025-11-28 08:39:23.050308612 +0000 UTC m=+0.152024500 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, 
name=rhosp17/openstack-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:39:23 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:39:23 np0005538515.localdomain systemd[1]: tmp-crun.irJBJG.mount: Deactivated successfully.
Nov 28 08:39:23 np0005538515.localdomain podman[87769]: 2025-11-28 08:39:23.093226692 +0000 UTC m=+0.200982825 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:39:23 np0005538515.localdomain podman[87769]: 2025-11-28 08:39:23.104265732 +0000 UTC m=+0.212021855 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Nov 28 08:39:23 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:39:23 np0005538515.localdomain podman[87770]: 2025-11-28 08:39:23.176325159 +0000 UTC m=+0.282036039 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:39:23 np0005538515.localdomain podman[87770]: 2025-11-28 08:39:23.226408809 +0000 UTC m=+0.332119699 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Nov 28 08:39:23 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:39:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:39:23 np0005538515.localdomain podman[87862]: 2025-11-28 08:39:23.967495541 +0000 UTC m=+0.076938508 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 28 08:39:24 np0005538515.localdomain podman[87862]: 2025-11-28 08:39:24.37533965 +0000 UTC m=+0.484782597 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:39:24 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:39:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:39:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:39:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:39:26 np0005538515.localdomain podman[87884]: 2025-11-28 08:39:26.97577479 +0000 UTC m=+0.083906042 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, 
name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:39:27 np0005538515.localdomain podman[87885]: 2025-11-28 08:39:27.022761766 +0000 UTC m=+0.126412060 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, url=https://www.redhat.com, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, distribution-scope=public)
Nov 28 08:39:27 np0005538515.localdomain podman[87884]: 2025-11-28 08:39:27.077856241 +0000 UTC m=+0.185987543 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true)
Nov 28 08:39:27 np0005538515.localdomain podman[87886]: 2025-11-28 08:39:27.090909683 +0000 UTC m=+0.192600538 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:39:27 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:39:27 np0005538515.localdomain podman[87885]: 2025-11-28 08:39:27.104083978 +0000 UTC m=+0.207734272 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:39:27 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:39:27 np0005538515.localdomain podman[87886]: 2025-11-28 08:39:27.162494816 +0000 UTC m=+0.264185701 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:39:27 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:39:46 np0005538515.localdomain sudo[87960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:39:46 np0005538515.localdomain sudo[87960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:46 np0005538515.localdomain sudo[87960]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:46 np0005538515.localdomain sudo[87975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:39:46 np0005538515.localdomain sudo[87975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:46 np0005538515.localdomain sudo[87975]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:46 np0005538515.localdomain sudo[88013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:39:46 np0005538515.localdomain sudo[88013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:46 np0005538515.localdomain sudo[88013]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:47 np0005538515.localdomain sudo[88028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:39:47 np0005538515.localdomain sudo[88028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:47 np0005538515.localdomain sudo[88028]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:48 np0005538515.localdomain sudo[88074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:39:48 np0005538515.localdomain sudo[88074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:39:48 np0005538515.localdomain sudo[88074]: pam_unix(sudo:session): session closed for user root
Nov 28 08:39:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:39:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:39:49 np0005538515.localdomain systemd[1]: tmp-crun.jdMySW.mount: Deactivated successfully.
Nov 28 08:39:50 np0005538515.localdomain podman[88090]: 2025-11-28 08:39:49.99881449 +0000 UTC m=+0.105894519 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Nov 28 08:39:50 np0005538515.localdomain systemd[1]: tmp-crun.8J9AvB.mount: Deactivated successfully.
Nov 28 08:39:50 np0005538515.localdomain podman[88089]: 2025-11-28 08:39:50.027436071 +0000 UTC m=+0.136687446 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, 
architecture=x86_64, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:39:50 np0005538515.localdomain podman[88090]: 2025-11-28 08:39:50.083557239 +0000 UTC m=+0.190637278 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:39:50 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:39:50 np0005538515.localdomain podman[88089]: 2025-11-28 08:39:50.21979942 +0000 UTC m=+0.329050765 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:39:50 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:39:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:39:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:39:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:39:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:39:53 np0005538515.localdomain podman[88140]: 2025-11-28 08:39:53.98602901 +0000 UTC m=+0.090637621 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:39:54 np0005538515.localdomain podman[88140]: 2025-11-28 08:39:54.017264571 +0000 UTC m=+0.121873172 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:39:54 np0005538515.localdomain podman[88141]: 2025-11-28 08:39:54.032159719 +0000 UTC m=+0.135364666 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:39:54 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:39:54 np0005538515.localdomain podman[88141]: 2025-11-28 08:39:54.046448728 +0000 UTC m=+0.149653665 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-iscsid-container, version=17.1.12, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:39:54 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:39:54 np0005538515.localdomain podman[88142]: 2025-11-28 08:39:54.084228221 +0000 UTC m=+0.185379755 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.)
Nov 28 08:39:54 np0005538515.localdomain podman[88139]: 2025-11-28 08:39:54.149009284 +0000 UTC m=+0.252875812 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:39:54 np0005538515.localdomain podman[88142]: 2025-11-28 08:39:54.149587832 +0000 UTC m=+0.250739396 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:39:54 np0005538515.localdomain podman[88139]: 2025-11-28 08:39:54.188666904 +0000 UTC m=+0.292533452 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:39:54 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:39:54 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:39:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:39:54 np0005538515.localdomain podman[88231]: 2025-11-28 08:39:54.96467997 +0000 UTC m=+0.076726681 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Nov 28 08:39:55 np0005538515.localdomain podman[88231]: 2025-11-28 08:39:55.33973965 +0000 UTC m=+0.451786311 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, container_name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:39:55 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:39:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:39:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:39:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:39:57 np0005538515.localdomain systemd[1]: tmp-crun.LJkWJx.mount: Deactivated successfully.
Nov 28 08:39:57 np0005538515.localdomain podman[88254]: 2025-11-28 08:39:57.972456734 +0000 UTC m=+0.079230138 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public)
Nov 28 08:39:57 np0005538515.localdomain podman[88254]: 2025-11-28 08:39:57.998424123 +0000 UTC m=+0.105197517 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team)
Nov 28 08:39:58 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:39:58 np0005538515.localdomain podman[88255]: 2025-11-28 08:39:58.019683487 +0000 UTC m=+0.121764796 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:39:58 np0005538515.localdomain podman[88255]: 2025-11-28 08:39:58.061493273 +0000 UTC m=+0.163574572 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, container_name=ovn_metadata_agent, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Nov 28 08:39:58 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:39:58 np0005538515.localdomain podman[88253]: 2025-11-28 08:39:58.084203812 +0000 UTC m=+0.189590183 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 08:39:58 np0005538515.localdomain podman[88253]: 2025-11-28 08:39:58.110399868 +0000 UTC m=+0.215786249 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:39:58 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:39:58 np0005538515.localdomain systemd[1]: tmp-crun.ak6eHd.mount: Deactivated successfully.
Nov 28 08:40:13 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:40:13 np0005538515.localdomain recover_tripleo_nova_virtqemud[88329]: 62642
Nov 28 08:40:13 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:40:13 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:40:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:40:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:40:21 np0005538515.localdomain podman[88376]: 2025-11-28 08:40:21.009213646 +0000 UTC m=+0.112825902 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Nov 28 08:40:21 np0005538515.localdomain podman[88375]: 2025-11-28 08:40:20.992612885 +0000 UTC m=+0.101324758 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:40:21 np0005538515.localdomain podman[88376]: 2025-11-28 08:40:21.048704691 +0000 UTC m=+0.152316977 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Nov 28 08:40:21 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:40:21 np0005538515.localdomain podman[88375]: 2025-11-28 08:40:21.179421763 +0000 UTC m=+0.288133636 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1)
Nov 28 08:40:21 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:40:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:40:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:40:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:40:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:40:24 np0005538515.localdomain podman[88424]: 2025-11-28 08:40:24.983847567 +0000 UTC m=+0.086053869 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true)
Nov 28 08:40:25 np0005538515.localdomain podman[88423]: 2025-11-28 08:40:25.047161644 +0000 UTC m=+0.153493553 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4)
Nov 28 08:40:25 np0005538515.localdomain podman[88424]: 2025-11-28 08:40:25.062224868 +0000 UTC m=+0.164431170 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:40:25 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:40:25 np0005538515.localdomain podman[88423]: 2025-11-28 08:40:25.085508174 +0000 UTC m=+0.191840093 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:40:25 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:40:25 np0005538515.localdomain podman[88429]: 2025-11-28 08:40:25.153042642 +0000 UTC m=+0.248152035 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public)
Nov 28 08:40:25 np0005538515.localdomain podman[88425]: 2025-11-28 08:40:24.966021328 +0000 UTC m=+0.069185289 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, 
com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Nov 28 08:40:25 np0005538515.localdomain podman[88429]: 2025-11-28 08:40:25.184434998 +0000 UTC m=+0.279544401 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:40:25 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:40:25 np0005538515.localdomain podman[88425]: 2025-11-28 08:40:25.19650173 +0000 UTC m=+0.299665731 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc.)
Nov 28 08:40:25 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:40:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:40:25 np0005538515.localdomain systemd[1]: tmp-crun.nZHtU4.mount: Deactivated successfully.
Nov 28 08:40:25 np0005538515.localdomain podman[88515]: 2025-11-28 08:40:25.966876043 +0000 UTC m=+0.074453322 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:40:26 np0005538515.localdomain podman[88515]: 2025-11-28 08:40:26.373509504 +0000 UTC m=+0.481086813 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_migration_target, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12)
Nov 28 08:40:26 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:40:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:40:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:40:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:40:28 np0005538515.localdomain podman[88538]: 2025-11-28 08:40:28.986279533 +0000 UTC m=+0.086704428 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:40:29 np0005538515.localdomain podman[88539]: 2025-11-28 08:40:29.035513239 +0000 UTC m=+0.131645871 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 08:40:29 np0005538515.localdomain podman[88538]: 2025-11-28 08:40:29.061145877 +0000 UTC m=+0.161570792 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:40:29 np0005538515.localdomain podman[88539]: 2025-11-28 08:40:29.070436193 +0000 UTC m=+0.166568875 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step5, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:40:29 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:40:29 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:40:29 np0005538515.localdomain podman[88540]: 2025-11-28 08:40:29.155482 +0000 UTC m=+0.248501987 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true)
Nov 28 08:40:29 np0005538515.localdomain podman[88540]: 2025-11-28 08:40:29.21563581 +0000 UTC m=+0.308655787 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, release=1761123044, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true)
Nov 28 08:40:29 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:40:48 np0005538515.localdomain sudo[88610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:40:48 np0005538515.localdomain sudo[88610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:40:48 np0005538515.localdomain sudo[88610]: pam_unix(sudo:session): session closed for user root
Nov 28 08:40:48 np0005538515.localdomain sudo[88625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:40:48 np0005538515.localdomain sudo[88625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:40:49 np0005538515.localdomain sudo[88625]: pam_unix(sudo:session): session closed for user root
Nov 28 08:40:50 np0005538515.localdomain sudo[88672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:40:50 np0005538515.localdomain sudo[88672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:40:50 np0005538515.localdomain sudo[88672]: pam_unix(sudo:session): session closed for user root
Nov 28 08:40:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:40:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:40:52 np0005538515.localdomain podman[88687]: 2025-11-28 08:40:52.007807709 +0000 UTC m=+0.106207079 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:40:52 np0005538515.localdomain systemd[1]: tmp-crun.wpGHpu.mount: Deactivated successfully.
Nov 28 08:40:52 np0005538515.localdomain podman[88688]: 2025-11-28 08:40:52.112511831 +0000 UTC m=+0.208112875 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:40:52 np0005538515.localdomain podman[88688]: 2025-11-28 08:40:52.150660404 +0000 UTC m=+0.246261438 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12)
Nov 28 08:40:52 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:40:52 np0005538515.localdomain podman[88687]: 2025-11-28 08:40:52.23567103 +0000 UTC m=+0.334070350 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step1, container_name=metrics_qdr)
Nov 28 08:40:52 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:40:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:40:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:40:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:40:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:40:55 np0005538515.localdomain podman[88738]: 2025-11-28 08:40:55.978251182 +0000 UTC m=+0.081460788 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, release=1761123044, version=17.1.12, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible)
Nov 28 08:40:55 np0005538515.localdomain podman[88738]: 2025-11-28 08:40:55.991447587 +0000 UTC m=+0.094657243 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, container_name=iscsid)
Nov 28 08:40:56 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:40:56 np0005538515.localdomain systemd[1]: tmp-crun.7Gg2Mb.mount: Deactivated successfully.
Nov 28 08:40:56 np0005538515.localdomain podman[88741]: 2025-11-28 08:40:56.052330851 +0000 UTC m=+0.152732180 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:40:56 np0005538515.localdomain podman[88741]: 2025-11-28 08:40:56.078566747 +0000 UTC m=+0.178968066 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, release=1761123044)
Nov 28 08:40:56 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:40:56 np0005538515.localdomain podman[88737]: 2025-11-28 08:40:56.135211801 +0000 UTC m=+0.242897555 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi)
Nov 28 08:40:56 np0005538515.localdomain podman[88736]: 2025-11-28 08:40:56.179835944 +0000 UTC m=+0.290061896 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, container_name=logrotate_crond, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true)
Nov 28 08:40:56 np0005538515.localdomain podman[88737]: 2025-11-28 08:40:56.187340194 +0000 UTC m=+0.295025948 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true)
Nov 28 08:40:56 np0005538515.localdomain podman[88736]: 2025-11-28 08:40:56.21253766 +0000 UTC m=+0.322763592 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 28 08:40:56 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:40:56 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:40:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:40:56 np0005538515.localdomain podman[88828]: 2025-11-28 08:40:56.959943526 +0000 UTC m=+0.072817311 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_migration_target)
Nov 28 08:40:57 np0005538515.localdomain podman[88828]: 2025-11-28 08:40:57.29419939 +0000 UTC m=+0.407073205 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:40:57 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:40:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:40:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:40:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:40:59 np0005538515.localdomain podman[88851]: 2025-11-28 08:40:59.957491376 +0000 UTC m=+0.067377209 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1)
Nov 28 08:40:59 np0005538515.localdomain podman[88851]: 2025-11-28 08:40:59.985170062 +0000 UTC m=+0.095055885 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:40:59 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:41:00 np0005538515.localdomain podman[88852]: 2025-11-28 08:41:00.067579809 +0000 UTC m=+0.171985855 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:41:00 np0005538515.localdomain podman[88850]: 2025-11-28 08:41:00.121865879 +0000 UTC m=+0.232389091 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, container_name=ovn_controller, release=1761123044)
Nov 28 08:41:00 np0005538515.localdomain podman[88852]: 2025-11-28 08:41:00.138337052 +0000 UTC m=+0.242743038 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., version=17.1.12)
Nov 28 08:41:00 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:41:00 np0005538515.localdomain podman[88850]: 2025-11-28 08:41:00.19523101 +0000 UTC m=+0.305754302 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public)
Nov 28 08:41:00 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:41:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:41:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:41:22 np0005538515.localdomain podman[88947]: 2025-11-28 08:41:22.980659958 +0000 UTC m=+0.081401939 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, container_name=collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Nov 28 08:41:23 np0005538515.localdomain podman[88947]: 2025-11-28 08:41:23.017600777 +0000 UTC m=+0.118342798 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack 
Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z)
Nov 28 08:41:23 np0005538515.localdomain podman[88946]: 2025-11-28 08:41:23.037265747 +0000 UTC m=+0.138685448 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1)
Nov 28 08:41:23 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:41:23 np0005538515.localdomain podman[88946]: 2025-11-28 08:41:23.211574652 +0000 UTC m=+0.312994383 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:41:23 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:41:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:41:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:41:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:41:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:41:26 np0005538515.localdomain systemd[1]: tmp-crun.OmmYvR.mount: Deactivated successfully.
Nov 28 08:41:26 np0005538515.localdomain podman[88996]: 2025-11-28 08:41:26.981837158 +0000 UTC m=+0.088654080 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:41:27 np0005538515.localdomain podman[88999]: 2025-11-28 08:41:27.035950871 +0000 UTC m=+0.135750808 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:41:27 np0005538515.localdomain podman[88996]: 2025-11-28 08:41:27.048567157 +0000 UTC m=+0.155384069 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:41:27 np0005538515.localdomain podman[88998]: 2025-11-28 08:41:27.000347783 +0000 UTC m=+0.102517923 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public)
Nov 28 08:41:27 np0005538515.localdomain podman[88998]: 2025-11-28 08:41:27.083550596 +0000 UTC m=+0.185720736 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, architecture=x86_64, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, container_name=iscsid, 
io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:41:27 np0005538515.localdomain podman[88999]: 2025-11-28 08:41:27.091296672 +0000 UTC m=+0.191096609 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Nov 28 08:41:27 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:41:27 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:41:27 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:41:27 np0005538515.localdomain podman[88997]: 2025-11-28 08:41:27.194157095 +0000 UTC m=+0.298636745 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-type=git)
Nov 28 08:41:27 np0005538515.localdomain podman[88997]: 2025-11-28 08:41:27.22641378 +0000 UTC m=+0.330893450 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1)
Nov 28 08:41:27 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:41:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:41:27 np0005538515.localdomain podman[89081]: 2025-11-28 08:41:27.969104161 +0000 UTC m=+0.081134380 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:41:27 np0005538515.localdomain systemd[1]: tmp-crun.2bmihx.mount: Deactivated successfully.
Nov 28 08:41:28 np0005538515.localdomain podman[89081]: 2025-11-28 08:41:28.354410313 +0000 UTC m=+0.466440512 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, 
name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:41:28 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:41:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:41:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:41:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:41:30 np0005538515.localdomain podman[89105]: 2025-11-28 08:41:30.980566423 +0000 UTC m=+0.088154044 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=)
Nov 28 08:41:31 np0005538515.localdomain systemd[1]: tmp-crun.UQxPZZ.mount: Deactivated successfully.
Nov 28 08:41:31 np0005538515.localdomain podman[89106]: 2025-11-28 08:41:31.029359924 +0000 UTC m=+0.133011435 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, container_name=nova_compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:41:31 np0005538515.localdomain podman[89107]: 2025-11-28 08:41:31.089505711 +0000 UTC m=+0.190281764 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:41:31 np0005538515.localdomain podman[89105]: 2025-11-28 08:41:31.107631346 +0000 UTC m=+0.215218967 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, version=17.1.12, vcs-type=git, 
io.buildah.version=1.41.4, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:41:31 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:41:31 np0005538515.localdomain podman[89107]: 2025-11-28 08:41:31.12644557 +0000 UTC m=+0.227221633 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:41:31 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:41:31 np0005538515.localdomain podman[89106]: 2025-11-28 08:41:31.162332176 +0000 UTC m=+0.265983627 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, release=1761123044, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:41:31 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:41:31 np0005538515.localdomain systemd[1]: tmp-crun.MK3dYl.mount: Deactivated successfully.
Nov 28 08:41:50 np0005538515.localdomain sudo[89176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:41:50 np0005538515.localdomain sudo[89176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:50 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:41:50 np0005538515.localdomain sudo[89176]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:50 np0005538515.localdomain recover_tripleo_nova_virtqemud[89192]: 62642
Nov 28 08:41:50 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:41:50 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:41:50 np0005538515.localdomain sudo[89193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:41:50 np0005538515.localdomain sudo[89193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:51 np0005538515.localdomain podman[89280]: 2025-11-28 08:41:51.111699721 +0000 UTC m=+0.082643285 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Nov 28 08:41:51 np0005538515.localdomain podman[89280]: 2025-11-28 08:41:51.21150058 +0000 UTC m=+0.182444114 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 08:41:51 np0005538515.localdomain sudo[89193]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:51 np0005538515.localdomain sudo[89346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:41:51 np0005538515.localdomain sudo[89346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:51 np0005538515.localdomain sudo[89346]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:51 np0005538515.localdomain sudo[89361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:41:51 np0005538515.localdomain sudo[89361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:52 np0005538515.localdomain sudo[89361]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:52 np0005538515.localdomain sudo[89407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:41:52 np0005538515.localdomain sudo[89407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:41:52 np0005538515.localdomain sudo[89407]: pam_unix(sudo:session): session closed for user root
Nov 28 08:41:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:41:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:41:53 np0005538515.localdomain podman[89423]: 2025-11-28 08:41:53.989501021 +0000 UTC m=+0.091494456 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 28 08:41:54 np0005538515.localdomain podman[89422]: 2025-11-28 08:41:54.035120895 +0000 UTC m=+0.137081509 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64)
Nov 28 08:41:54 np0005538515.localdomain podman[89423]: 2025-11-28 08:41:54.054393194 +0000 UTC m=+0.156386589 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, name=rhosp17/openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:41:54 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:41:54 np0005538515.localdomain podman[89422]: 2025-11-28 08:41:54.231439473 +0000 UTC m=+0.333400097 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:41:54 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:41:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:41:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:41:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:41:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:41:57 np0005538515.localdomain podman[89472]: 2025-11-28 08:41:57.99294372 +0000 UTC m=+0.090391003 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, build-date=2025-11-19T00:12:45Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:41:58 np0005538515.localdomain podman[89471]: 2025-11-28 08:41:58.042463092 +0000 UTC m=+0.141341129 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, release=1761123044, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Nov 28 08:41:58 np0005538515.localdomain podman[89471]: 2025-11-28 08:41:58.079616278 +0000 UTC m=+0.178494315 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:41:58 np0005538515.localdomain podman[89473]: 2025-11-28 08:41:58.097137083 +0000 UTC m=+0.190716008 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:41:58 np0005538515.localdomain podman[89473]: 2025-11-28 08:41:58.104418595 +0000 UTC m=+0.197997520 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:41:58 np0005538515.localdomain podman[89478]: 2025-11-28 08:41:58.106353984 +0000 UTC m=+0.194803632 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Nov 28 08:41:58 np0005538515.localdomain podman[89472]: 2025-11-28 08:41:58.128276234 +0000 UTC m=+0.225723547 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:41:58 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:41:58 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:41:58 np0005538515.localdomain podman[89478]: 2025-11-28 08:41:58.182625354 +0000 UTC m=+0.271074962 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Nov 28 08:41:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:41:58 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:41:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:41:58 np0005538515.localdomain podman[89561]: 2025-11-28 08:41:58.979322795 +0000 UTC m=+0.080280834 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:41:59 np0005538515.localdomain podman[89561]: 2025-11-28 08:41:59.374735495 +0000 UTC m=+0.475693534 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, release=1761123044)
Nov 28 08:41:59 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:42:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:42:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:42:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:42:01 np0005538515.localdomain podman[89582]: 2025-11-28 08:42:01.976363649 +0000 UTC m=+0.086034970 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, 
url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:42:02 np0005538515.localdomain podman[89582]: 2025-11-28 08:42:02.026612044 +0000 UTC m=+0.136283375 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true)
Nov 28 08:42:02 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:42:02 np0005538515.localdomain podman[89584]: 2025-11-28 08:42:02.028296755 +0000 UTC m=+0.132801488 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:42:02 np0005538515.localdomain podman[89584]: 2025-11-28 08:42:02.11357056 +0000 UTC m=+0.218075253 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=)
Nov 28 08:42:02 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:42:02 np0005538515.localdomain podman[89583]: 2025-11-28 08:42:02.080850741 +0000 UTC m=+0.188320105 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public)
Nov 28 08:42:02 np0005538515.localdomain podman[89583]: 2025-11-28 08:42:02.159974968 +0000 UTC m=+0.267444252 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, container_name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Nov 28 08:42:02 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:42:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:42:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:42:24 np0005538515.localdomain podman[89679]: 2025-11-28 08:42:24.979054233 +0000 UTC m=+0.085836063 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true)
Nov 28 08:42:25 np0005538515.localdomain systemd[1]: tmp-crun.qURE4h.mount: Deactivated successfully.
Nov 28 08:42:25 np0005538515.localdomain podman[89680]: 2025-11-28 08:42:25.042155851 +0000 UTC m=+0.147349342 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:42:25 np0005538515.localdomain podman[89680]: 2025-11-28 08:42:25.078607754 +0000 UTC m=+0.183801255 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:42:25 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:42:25 np0005538515.localdomain podman[89679]: 2025-11-28 08:42:25.16748708 +0000 UTC m=+0.274268920 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, container_name=metrics_qdr, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:42:25 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:42:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:42:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:42:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:42:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:42:28 np0005538515.localdomain podman[89729]: 2025-11-28 08:42:28.988714293 +0000 UTC m=+0.086306768 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:42:29 np0005538515.localdomain podman[89728]: 2025-11-28 08:42:28.968164305 +0000 UTC m=+0.072000951 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Nov 28 08:42:29 np0005538515.localdomain podman[89730]: 2025-11-28 08:42:29.035218474 +0000 UTC m=+0.131498538 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:42:29 np0005538515.localdomain podman[89727]: 2025-11-28 08:42:29.081700844 +0000 UTC m=+0.186529890 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:42:29 np0005538515.localdomain podman[89730]: 2025-11-28 08:42:29.08581177 +0000 UTC m=+0.182091804 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12)
Nov 28 08:42:29 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:42:29 np0005538515.localdomain podman[89728]: 2025-11-28 08:42:29.101452387 +0000 UTC m=+0.205289023 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:42:29 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:42:29 np0005538515.localdomain podman[89727]: 2025-11-28 08:42:29.14310903 +0000 UTC m=+0.247938126 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:42:29 np0005538515.localdomain podman[89729]: 2025-11-28 08:42:29.156327064 +0000 UTC m=+0.253919589 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, container_name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:42:29 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:42:29 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:42:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:42:29 np0005538515.localdomain podman[89818]: 2025-11-28 08:42:29.961279085 +0000 UTC m=+0.073926349 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:42:30 np0005538515.localdomain podman[89818]: 2025-11-28 08:42:30.394561403 +0000 UTC m=+0.507208627 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, 
release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4)
Nov 28 08:42:30 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:42:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:42:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:42:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:42:32 np0005538515.localdomain systemd[1]: tmp-crun.BFajQU.mount: Deactivated successfully.
Nov 28 08:42:33 np0005538515.localdomain podman[89843]: 2025-11-28 08:42:33.020310122 +0000 UTC m=+0.116718066 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:42:33 np0005538515.localdomain podman[89844]: 2025-11-28 08:42:32.975693699 +0000 UTC m=+0.073156786 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:42:33 np0005538515.localdomain podman[89843]: 2025-11-28 08:42:33.050569647 +0000 UTC m=+0.146977621 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, url=https://www.redhat.com, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step5, vendor=Red Hat, Inc.)
Nov 28 08:42:33 np0005538515.localdomain podman[89844]: 2025-11-28 08:42:33.060708467 +0000 UTC m=+0.158171544 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:42:33 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:42:33 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:42:33 np0005538515.localdomain podman[89842]: 2025-11-28 08:42:33.099628076 +0000 UTC m=+0.199275680 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller)
Nov 28 08:42:33 np0005538515.localdomain podman[89842]: 2025-11-28 08:42:33.144651892 +0000 UTC m=+0.244299536 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.)
Nov 28 08:42:33 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:42:33 np0005538515.localdomain systemd[1]: tmp-crun.XOvAMK.mount: Deactivated successfully.
Nov 28 08:42:53 np0005538515.localdomain sudo[89919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:42:53 np0005538515.localdomain sudo[89919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:42:53 np0005538515.localdomain sudo[89919]: pam_unix(sudo:session): session closed for user root
Nov 28 08:42:53 np0005538515.localdomain sudo[89934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:42:53 np0005538515.localdomain sudo[89934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:42:53 np0005538515.localdomain sudo[89934]: pam_unix(sudo:session): session closed for user root
Nov 28 08:42:54 np0005538515.localdomain sudo[89980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:42:54 np0005538515.localdomain sudo[89980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:42:54 np0005538515.localdomain sudo[89980]: pam_unix(sudo:session): session closed for user root
Nov 28 08:42:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:42:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:42:55 np0005538515.localdomain podman[89995]: 2025-11-28 08:42:55.975359233 +0000 UTC m=+0.083183822 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:42:56 np0005538515.localdomain systemd[1]: tmp-crun.gXgHqo.mount: Deactivated successfully.
Nov 28 08:42:56 np0005538515.localdomain podman[89996]: 2025-11-28 08:42:56.038313627 +0000 UTC m=+0.145466106 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Nov 28 08:42:56 np0005538515.localdomain podman[89996]: 2025-11-28 08:42:56.074360008 +0000 UTC m=+0.181512457 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:42:56 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:42:56 np0005538515.localdomain podman[89995]: 2025-11-28 08:42:56.173028302 +0000 UTC m=+0.280852861 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd)
Nov 28 08:42:56 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:42:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:42:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:42:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:42:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:42:59 np0005538515.localdomain systemd[1]: tmp-crun.SFzAhi.mount: Deactivated successfully.
Nov 28 08:42:59 np0005538515.localdomain podman[90044]: 2025-11-28 08:42:59.988238371 +0000 UTC m=+0.092255070 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container)
Nov 28 08:43:00 np0005538515.localdomain podman[90044]: 2025-11-28 08:42:59.99835947 +0000 UTC m=+0.102376189 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:43:00 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:43:00 np0005538515.localdomain podman[90045]: 2025-11-28 08:43:00.078000793 +0000 UTC m=+0.178494094 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:43:00 np0005538515.localdomain podman[90045]: 2025-11-28 08:43:00.101353436 +0000 UTC m=+0.201846677 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, 
release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:43:00 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:43:00 np0005538515.localdomain podman[90046]: 2025-11-28 08:43:00.18198848 +0000 UTC m=+0.276688374 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4)
Nov 28 08:43:00 np0005538515.localdomain podman[90046]: 2025-11-28 08:43:00.197010729 +0000 UTC m=+0.291710583 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:43:00 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:43:00 np0005538515.localdomain podman[90047]: 2025-11-28 08:43:00.24317335 +0000 UTC m=+0.338206964 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:43:00 np0005538515.localdomain podman[90047]: 2025-11-28 08:43:00.268342358 +0000 UTC m=+0.363375942 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:43:00 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:43:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:43:01 np0005538515.localdomain podman[90132]: 2025-11-28 08:43:01.061882122 +0000 UTC m=+0.074506017 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044)
Nov 28 08:43:01 np0005538515.localdomain podman[90132]: 2025-11-28 08:43:01.442477029 +0000 UTC m=+0.455100984 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:43:01 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:43:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:43:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:43:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:43:03 np0005538515.localdomain podman[90155]: 2025-11-28 08:43:03.976752784 +0000 UTC m=+0.084751461 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, release=1761123044, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO 
Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc.)
Nov 28 08:43:04 np0005538515.localdomain podman[90155]: 2025-11-28 08:43:04.005605395 +0000 UTC m=+0.113604062 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, release=1761123044, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Nov 28 08:43:04 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:43:04 np0005538515.localdomain podman[90157]: 2025-11-28 08:43:04.025124421 +0000 UTC m=+0.126278458 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 08:43:04 np0005538515.localdomain podman[90157]: 2025-11-28 08:43:04.080644078 +0000 UTC m=+0.181798075 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:43:04 np0005538515.localdomain podman[90156]: 2025-11-28 08:43:04.092966314 +0000 UTC m=+0.197549996 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:43:04 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:43:04 np0005538515.localdomain podman[90156]: 2025-11-28 08:43:04.148582713 +0000 UTC m=+0.253166435 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Nov 28 08:43:04 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:43:23 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:43:23 np0005538515.localdomain recover_tripleo_nova_virtqemud[90227]: 62642
Nov 28 08:43:23 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:43:23 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:43:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:43:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:43:26 np0005538515.localdomain podman[90229]: 2025-11-28 08:43:26.969483574 +0000 UTC m=+0.077445336 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:43:27 np0005538515.localdomain podman[90229]: 2025-11-28 08:43:27.00764736 +0000 UTC m=+0.115609062 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., version=17.1.12, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, distribution-scope=public, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:43:27 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:43:27 np0005538515.localdomain podman[90228]: 2025-11-28 08:43:27.026678892 +0000 UTC m=+0.137063919 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:43:27 np0005538515.localdomain podman[90228]: 2025-11-28 08:43:27.208447385 +0000 UTC m=+0.318832372 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:43:27 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:43:30 np0005538515.localdomain systemd[1]: tmp-crun.xpAT3Y.mount: Deactivated successfully.
Nov 28 08:43:30 np0005538515.localdomain podman[90276]: 2025-11-28 08:43:30.991247344 +0000 UTC m=+0.097761128 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, batch=17.1_20251118.1)
Nov 28 08:43:31 np0005538515.localdomain podman[90276]: 2025-11-28 08:43:31.002368913 +0000 UTC m=+0.108882747 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, 
version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 08:43:31 np0005538515.localdomain podman[90284]: 2025-11-28 08:43:31.041179439 +0000 UTC m=+0.135645895 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container)
Nov 28 08:43:31 np0005538515.localdomain podman[90284]: 2025-11-28 08:43:31.072670391 +0000 UTC m=+0.167136867 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64)
Nov 28 08:43:31 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:43:31 np0005538515.localdomain podman[90278]: 2025-11-28 08:43:31.088366691 +0000 UTC m=+0.187574242 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Nov 28 08:43:31 np0005538515.localdomain podman[90278]: 2025-11-28 08:43:31.099416328 +0000 UTC m=+0.198623919 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 08:43:31 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:43:31 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:43:31 np0005538515.localdomain podman[90277]: 2025-11-28 08:43:31.186056625 +0000 UTC m=+0.288203466 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:43:31 np0005538515.localdomain podman[90277]: 2025-11-28 08:43:31.212472602 +0000 UTC m=+0.314619423 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 28 08:43:31 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:43:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:43:31 np0005538515.localdomain podman[90367]: 2025-11-28 08:43:31.96164523 +0000 UTC m=+0.073880368 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com)
Nov 28 08:43:32 np0005538515.localdomain podman[90367]: 2025-11-28 08:43:32.330326874 +0000 UTC m=+0.442562012 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_migration_target, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:43:32 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:43:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:43:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:43:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:43:34 np0005538515.localdomain systemd[1]: tmp-crun.h4q8TC.mount: Deactivated successfully.
Nov 28 08:43:35 np0005538515.localdomain systemd[1]: tmp-crun.iFxwXQ.mount: Deactivated successfully.
Nov 28 08:43:35 np0005538515.localdomain podman[90389]: 2025-11-28 08:43:34.984596503 +0000 UTC m=+0.092292781 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:43:35 np0005538515.localdomain podman[90391]: 2025-11-28 08:43:35.039301274 +0000 UTC m=+0.136775449 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team)
Nov 28 08:43:35 np0005538515.localdomain podman[90390]: 2025-11-28 08:43:35.010266507 +0000 UTC m=+0.107591878 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:43:35 np0005538515.localdomain podman[90389]: 2025-11-28 08:43:35.067463935 +0000 UTC m=+0.175160203 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true)
Nov 28 08:43:35 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:43:35 np0005538515.localdomain podman[90390]: 2025-11-28 08:43:35.095510202 +0000 UTC m=+0.192835523 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:43:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:43:35 np0005538515.localdomain podman[90391]: 2025-11-28 08:43:35.10824811 +0000 UTC m=+0.205722265 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Nov 28 08:43:35 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:43:54 np0005538515.localdomain sudo[90460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:43:54 np0005538515.localdomain sudo[90460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:43:54 np0005538515.localdomain sudo[90460]: pam_unix(sudo:session): session closed for user root
Nov 28 08:43:54 np0005538515.localdomain sudo[90475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:43:54 np0005538515.localdomain sudo[90475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:43:55 np0005538515.localdomain sudo[90475]: pam_unix(sudo:session): session closed for user root
Nov 28 08:43:56 np0005538515.localdomain sudo[90523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:43:56 np0005538515.localdomain sudo[90523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:43:56 np0005538515.localdomain sudo[90523]: pam_unix(sudo:session): session closed for user root
Nov 28 08:43:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:43:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:43:57 np0005538515.localdomain podman[90538]: 2025-11-28 08:43:57.971165087 +0000 UTC m=+0.080859661 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044)
Nov 28 08:43:58 np0005538515.localdomain podman[90539]: 2025-11-28 08:43:58.027086776 +0000 UTC m=+0.131450777 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3)
Nov 28 08:43:58 np0005538515.localdomain podman[90539]: 2025-11-28 08:43:58.035254555 +0000 UTC m=+0.139618556 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true)
Nov 28 08:43:58 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:43:58 np0005538515.localdomain podman[90538]: 2025-11-28 08:43:58.163706009 +0000 UTC m=+0.273400583 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, release=1761123044, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 08:43:58 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:44:01 np0005538515.localdomain podman[90591]: 2025-11-28 08:44:01.987643564 +0000 UTC m=+0.087680519 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4)
Nov 28 08:44:02 np0005538515.localdomain podman[90588]: 2025-11-28 08:44:02.043054517 +0000 UTC m=+0.148840808 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:44:02 np0005538515.localdomain podman[90591]: 2025-11-28 08:44:02.046496652 +0000 UTC m=+0.146533617 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:44:02 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:44:02 np0005538515.localdomain podman[90588]: 2025-11-28 08:44:02.076498499 +0000 UTC m=+0.182284810 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, managed_by=tripleo_ansible, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, container_name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:49:32Z, version=17.1.12)
Nov 28 08:44:02 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:44:02 np0005538515.localdomain podman[90589]: 2025-11-28 08:44:02.092579971 +0000 UTC m=+0.197837026 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Nov 28 08:44:02 np0005538515.localdomain podman[90589]: 2025-11-28 08:44:02.12300305 +0000 UTC m=+0.228260075 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Nov 28 08:44:02 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:44:02 np0005538515.localdomain podman[90590]: 2025-11-28 08:44:02.141131864 +0000 UTC m=+0.243581343 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:44:02 np0005538515.localdomain podman[90590]: 2025-11-28 08:44:02.47030186 +0000 UTC m=+0.572751299 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:44:02 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:44:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:44:02 np0005538515.localdomain podman[90682]: 2025-11-28 08:44:02.583580381 +0000 UTC m=+0.077032864 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:44:02 np0005538515.localdomain podman[90682]: 2025-11-28 08:44:02.970955435 +0000 UTC m=+0.464407938 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:44:02 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:44:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:44:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:44:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:44:05 np0005538515.localdomain systemd[1]: tmp-crun.IqKjSn.mount: Deactivated successfully.
Nov 28 08:44:06 np0005538515.localdomain podman[90705]: 2025-11-28 08:44:06.000558964 +0000 UTC m=+0.108712562 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4)
Nov 28 08:44:06 np0005538515.localdomain podman[90707]: 2025-11-28 08:44:05.982135841 +0000 UTC m=+0.086404140 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:44:06 np0005538515.localdomain podman[90706]: 2025-11-28 08:44:06.043240398 +0000 UTC m=+0.148685204 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:44:06 np0005538515.localdomain podman[90705]: 2025-11-28 08:44:06.054424469 +0000 UTC m=+0.162578057 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, container_name=ovn_controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044)
Nov 28 08:44:06 np0005538515.localdomain podman[90707]: 2025-11-28 08:44:06.064535319 +0000 UTC m=+0.168803618 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, architecture=x86_64, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4)
Nov 28 08:44:06 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:44:06 np0005538515.localdomain podman[90706]: 2025-11-28 08:44:06.077426162 +0000 UTC m=+0.182870928 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64)
Nov 28 08:44:06 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:44:06 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:44:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:44:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:44:28 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:44:28 np0005538515.localdomain recover_tripleo_nova_virtqemud[90786]: 62642
Nov 28 08:44:28 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:44:28 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:44:28 np0005538515.localdomain systemd[1]: tmp-crun.YoDC71.mount: Deactivated successfully.
Nov 28 08:44:28 np0005538515.localdomain podman[90778]: 2025-11-28 08:44:28.980457795 +0000 UTC m=+0.085828274 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:44:29 np0005538515.localdomain podman[90779]: 2025-11-28 08:44:29.041141689 +0000 UTC m=+0.141123343 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:44:29 np0005538515.localdomain podman[90779]: 2025-11-28 08:44:29.051921038 +0000 UTC m=+0.151902692 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, container_name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:44:29 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:44:29 np0005538515.localdomain podman[90778]: 2025-11-28 08:44:29.202040214 +0000 UTC m=+0.307410703 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, container_name=metrics_qdr)
Nov 28 08:44:29 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:44:29 np0005538515.localdomain systemd[1]: tmp-crun.B9u5OF.mount: Deactivated successfully.
Nov 28 08:44:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:44:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:44:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:44:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:44:32 np0005538515.localdomain podman[90829]: 2025-11-28 08:44:32.980470639 +0000 UTC m=+0.089222437 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, release=1761123044, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:44:32 np0005538515.localdomain podman[90829]: 2025-11-28 08:44:32.992339102 +0000 UTC m=+0.101090900 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-cron)
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:44:33 np0005538515.localdomain podman[90832]: 2025-11-28 08:44:33.030047854 +0000 UTC m=+0.131554230 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:44:33 np0005538515.localdomain podman[90832]: 2025-11-28 08:44:33.062434443 +0000 UTC m=+0.163940889 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
tcib_managed=true, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: tmp-crun.McpvTg.mount: Deactivated successfully.
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:44:33 np0005538515.localdomain podman[90830]: 2025-11-28 08:44:33.079466953 +0000 UTC m=+0.185157057 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z)
Nov 28 08:44:33 np0005538515.localdomain podman[90830]: 2025-11-28 08:44:33.129878053 +0000 UTC m=+0.235568147 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z)
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:44:33 np0005538515.localdomain podman[90831]: 2025-11-28 08:44:33.152324269 +0000 UTC m=+0.254407973 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12)
Nov 28 08:44:33 np0005538515.localdomain podman[90879]: 2025-11-28 08:44:33.202497512 +0000 UTC m=+0.186238510 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, 
vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:44:33 np0005538515.localdomain podman[90831]: 2025-11-28 08:44:33.22468926 +0000 UTC m=+0.326772984 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:44:33 np0005538515.localdomain podman[90879]: 2025-11-28 08:44:33.559286672 +0000 UTC m=+0.543027620 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:44:33 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:44:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:44:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:44:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:44:37 np0005538515.localdomain podman[90938]: 2025-11-28 08:44:37.003182437 +0000 UTC m=+0.106577767 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, 
release=1761123044, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:44:37 np0005538515.localdomain podman[90940]: 2025-11-28 08:44:37.04028777 +0000 UTC m=+0.139241705 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Nov 28 08:44:37 np0005538515.localdomain podman[90938]: 2025-11-28 08:44:37.057414243 +0000 UTC m=+0.160809573 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1)
Nov 28 08:44:37 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:44:37 np0005538515.localdomain podman[90939]: 2025-11-28 08:44:37.068129531 +0000 UTC m=+0.169304433 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:44:37 np0005538515.localdomain podman[90940]: 2025-11-28 08:44:37.121933024 +0000 UTC m=+0.220886969 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64)
Nov 28 08:44:37 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:44:37 np0005538515.localdomain podman[90939]: 2025-11-28 08:44:37.172489369 +0000 UTC m=+0.273664281 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, container_name=nova_compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible)
Nov 28 08:44:37 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:44:56 np0005538515.localdomain sudo[91012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:44:56 np0005538515.localdomain sudo[91012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:44:56 np0005538515.localdomain sudo[91012]: pam_unix(sudo:session): session closed for user root
Nov 28 08:44:56 np0005538515.localdomain sudo[91027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:44:56 np0005538515.localdomain sudo[91027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:44:57 np0005538515.localdomain sudo[91027]: pam_unix(sudo:session): session closed for user root
Nov 28 08:44:57 np0005538515.localdomain sudo[91074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:44:57 np0005538515.localdomain sudo[91074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:44:57 np0005538515.localdomain sudo[91074]: pam_unix(sudo:session): session closed for user root
Nov 28 08:44:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:44:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:44:59 np0005538515.localdomain systemd[1]: tmp-crun.OxJevl.mount: Deactivated successfully.
Nov 28 08:45:00 np0005538515.localdomain systemd[1]: tmp-crun.Yyaaas.mount: Deactivated successfully.
Nov 28 08:45:00 np0005538515.localdomain podman[91090]: 2025-11-28 08:45:00.038914603 +0000 UTC m=+0.142403222 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:45:00 np0005538515.localdomain podman[91089]: 2025-11-28 08:45:00.006266035 +0000 UTC m=+0.111785226 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1761123044, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:45:00 np0005538515.localdomain podman[91090]: 2025-11-28 08:45:00.077365957 +0000 UTC m=+0.180854486 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
architecture=x86_64, vcs-type=git, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:45:00 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:45:00 np0005538515.localdomain podman[91089]: 2025-11-28 08:45:00.219539351 +0000 UTC m=+0.325058552 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step1, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 08:45:00 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:45:03 np0005538515.localdomain recover_tripleo_nova_virtqemud[91162]: 62642
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:45:03 np0005538515.localdomain podman[91138]: 2025-11-28 08:45:03.98595031 +0000 UTC m=+0.086944337 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:45:03 np0005538515.localdomain systemd[1]: tmp-crun.dmbAn9.mount: Deactivated successfully.
Nov 28 08:45:04 np0005538515.localdomain podman[91137]: 2025-11-28 08:45:03.998451341 +0000 UTC m=+0.100734318 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc.)
Nov 28 08:45:04 np0005538515.localdomain podman[91137]: 2025-11-28 08:45:04.006758615 +0000 UTC m=+0.109041552 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, 
description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:45:04 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:45:04 np0005538515.localdomain podman[91141]: 2025-11-28 08:45:04.042746715 +0000 UTC m=+0.130992814 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:45:04 np0005538515.localdomain podman[91138]: 2025-11-28 08:45:04.099488718 +0000 UTC m=+0.200482755 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 08:45:04 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:45:04 np0005538515.localdomain podman[91139]: 2025-11-28 08:45:04.10446442 +0000 UTC m=+0.201980401 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible)
Nov 28 08:45:04 np0005538515.localdomain podman[91140]: 2025-11-28 08:45:04.163662719 +0000 UTC m=+0.258354754 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12)
Nov 28 08:45:04 np0005538515.localdomain podman[91139]: 2025-11-28 08:45:04.188336273 +0000 UTC m=+0.285852254 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, tcib_managed=true, release=1761123044, 
com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 28 08:45:04 np0005538515.localdomain podman[91140]: 2025-11-28 08:45:04.195244663 +0000 UTC m=+0.289936698 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12)
Nov 28 08:45:04 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:45:04 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:45:04 np0005538515.localdomain podman[91141]: 2025-11-28 08:45:04.448323926 +0000 UTC m=+0.536570005 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=nova_migration_target, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Nov 28 08:45:04 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:45:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:45:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:45:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:45:07 np0005538515.localdomain podman[91251]: 2025-11-28 08:45:07.984650074 +0000 UTC m=+0.086040890 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:45:08 np0005538515.localdomain systemd[1]: tmp-crun.D8HipQ.mount: Deactivated successfully.
Nov 28 08:45:08 np0005538515.localdomain podman[91250]: 2025-11-28 08:45:08.039225622 +0000 UTC m=+0.140588087 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:45:08 np0005538515.localdomain podman[91251]: 2025-11-28 08:45:08.045537794 +0000 UTC m=+0.146928660 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=nova_compute)
Nov 28 08:45:08 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:45:08 np0005538515.localdomain systemd[1]: tmp-crun.6IKWXd.mount: Deactivated successfully.
Nov 28 08:45:08 np0005538515.localdomain podman[91252]: 2025-11-28 08:45:08.093462388 +0000 UTC m=+0.193126401 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:45:08 np0005538515.localdomain podman[91250]: 2025-11-28 08:45:08.111685204 +0000 UTC m=+0.213047679 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:45:08 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:45:08 np0005538515.localdomain podman[91252]: 2025-11-28 08:45:08.136463342 +0000 UTC m=+0.236127305 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_metadata_agent)
Nov 28 08:45:08 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:45:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:45:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:45:30 np0005538515.localdomain podman[91321]: 2025-11-28 08:45:30.970426371 +0000 UTC m=+0.077832969 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Nov 28 08:45:30 np0005538515.localdomain podman[91321]: 2025-11-28 08:45:30.976786845 +0000 UTC m=+0.084193463 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T22:51:28Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, version=17.1.12, io.openshift.expose-services=, architecture=x86_64)
Nov 28 08:45:30 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:45:31 np0005538515.localdomain podman[91320]: 2025-11-28 08:45:31.026359309 +0000 UTC m=+0.135016805 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:45:31 np0005538515.localdomain podman[91320]: 2025-11-28 08:45:31.203032667 +0000 UTC m=+0.311690193 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, tcib_managed=true, description=Red 
Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public)
Nov 28 08:45:31 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:45:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:45:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:45:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:45:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:45:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:45:34 np0005538515.localdomain systemd[1]: tmp-crun.3ezRyU.mount: Deactivated successfully.
Nov 28 08:45:35 np0005538515.localdomain podman[91369]: 2025-11-28 08:45:34.994059828 +0000 UTC m=+0.100398179 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:45:35 np0005538515.localdomain podman[91370]: 2025-11-28 08:45:35.047929003 +0000 UTC m=+0.145153576 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, 
io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20251118.1)
Nov 28 08:45:35 np0005538515.localdomain podman[91369]: 2025-11-28 08:45:35.05242772 +0000 UTC m=+0.158766071 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, 
batch=17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Nov 28 08:45:35 np0005538515.localdomain podman[91370]: 2025-11-28 08:45:35.061462057 +0000 UTC m=+0.158686680 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, 
architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=)
Nov 28 08:45:35 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:45:35 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:45:35 np0005538515.localdomain podman[91368]: 2025-11-28 08:45:35.140338096 +0000 UTC m=+0.245695667 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=)
Nov 28 08:45:35 np0005538515.localdomain podman[91372]: 2025-11-28 08:45:34.998783271 +0000 UTC m=+0.093639331 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:45:35 np0005538515.localdomain podman[91371]: 2025-11-28 08:45:35.11297484 +0000 UTC m=+0.208332305 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com)
Nov 28 08:45:35 np0005538515.localdomain podman[91368]: 2025-11-28 08:45:35.177375748 +0000 UTC m=+0.282733299 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron)
Nov 28 08:45:35 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:45:35 np0005538515.localdomain podman[91371]: 2025-11-28 08:45:35.199416011 +0000 UTC m=+0.294773466 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:45:35 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:45:35 np0005538515.localdomain podman[91372]: 2025-11-28 08:45:35.379532434 +0000 UTC m=+0.474388534 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, distribution-scope=public)
Nov 28 08:45:35 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:45:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:45:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:45:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:45:38 np0005538515.localdomain podman[91481]: 2025-11-28 08:45:38.976551446 +0000 UTC m=+0.079317134 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1)
Nov 28 08:45:39 np0005538515.localdomain systemd[1]: tmp-crun.oXkrVW.mount: Deactivated successfully.
Nov 28 08:45:39 np0005538515.localdomain podman[91480]: 2025-11-28 08:45:39.040905372 +0000 UTC m=+0.144513255 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4)
Nov 28 08:45:39 np0005538515.localdomain podman[91481]: 2025-11-28 08:45:39.049521095 +0000 UTC m=+0.152286753 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 08:45:39 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:45:39 np0005538515.localdomain podman[91480]: 2025-11-28 08:45:39.074570071 +0000 UTC m=+0.178177964 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Nov 28 08:45:39 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:45:39 np0005538515.localdomain systemd[1]: tmp-crun.UYJSnv.mount: Deactivated successfully.
Nov 28 08:45:39 np0005538515.localdomain podman[91479]: 2025-11-28 08:45:39.141814675 +0000 UTC m=+0.248226414 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-type=git, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, release=1761123044, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:45:39 np0005538515.localdomain podman[91479]: 2025-11-28 08:45:39.169442199 +0000 UTC m=+0.275853958 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:45:39 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Deactivated successfully.
Nov 28 08:45:57 np0005538515.localdomain sudo[91551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:45:57 np0005538515.localdomain sudo[91551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:45:57 np0005538515.localdomain sudo[91551]: pam_unix(sudo:session): session closed for user root
Nov 28 08:45:57 np0005538515.localdomain sudo[91566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:45:57 np0005538515.localdomain sudo[91566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:45:58 np0005538515.localdomain sudo[91566]: pam_unix(sudo:session): session closed for user root
Nov 28 08:45:59 np0005538515.localdomain sudo[91614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:45:59 np0005538515.localdomain sudo[91614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:45:59 np0005538515.localdomain sudo[91614]: pam_unix(sudo:session): session closed for user root
Nov 28 08:46:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:46:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:46:02 np0005538515.localdomain systemd[1]: tmp-crun.LHXBSZ.mount: Deactivated successfully.
Nov 28 08:46:02 np0005538515.localdomain podman[91630]: 2025-11-28 08:46:02.153485423 +0000 UTC m=+0.095175329 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack 
Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Nov 28 08:46:02 np0005538515.localdomain podman[91630]: 2025-11-28 08:46:02.193464015 +0000 UTC m=+0.135153951 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, 
url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:46:02 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:46:02 np0005538515.localdomain podman[91629]: 2025-11-28 08:46:02.245622698 +0000 UTC m=+0.188909392 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:46:02 np0005538515.localdomain podman[91629]: 2025-11-28 08:46:02.479802803 +0000 UTC m=+0.423089527 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, url=https://www.redhat.com)
Nov 28 08:46:02 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:46:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:46:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:46:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:46:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:46:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:46:05 np0005538515.localdomain podman[91678]: 2025-11-28 08:46:05.981303727 +0000 UTC m=+0.084441670 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, release=1761123044, container_name=logrotate_crond, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Nov 28 08:46:05 np0005538515.localdomain podman[91678]: 2025-11-28 08:46:05.993097327 +0000 UTC m=+0.096235260 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true)
Nov 28 08:46:06 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:46:06 np0005538515.localdomain podman[91684]: 2025-11-28 08:46:06.040186086 +0000 UTC m=+0.133929863 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target)
Nov 28 08:46:06 np0005538515.localdomain podman[91681]: 2025-11-28 08:46:05.9964629 +0000 UTC m=+0.092340112 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, release=1761123044, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:46:06 np0005538515.localdomain podman[91680]: 2025-11-28 08:46:06.095479636 +0000 UTC m=+0.194176254 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, batch=17.1_20251118.1, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:46:06 np0005538515.localdomain podman[91681]: 2025-11-28 08:46:06.126437711 +0000 UTC m=+0.222314923 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1)
Nov 28 08:46:06 np0005538515.localdomain podman[91680]: 2025-11-28 08:46:06.134491948 +0000 UTC m=+0.233188566 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_id=tripleo_step3)
Nov 28 08:46:06 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:46:06 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:46:06 np0005538515.localdomain podman[91679]: 2025-11-28 08:46:06.185082113 +0000 UTC m=+0.285763021 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64)
Nov 28 08:46:06 np0005538515.localdomain podman[91679]: 2025-11-28 08:46:06.215467211 +0000 UTC m=+0.316148139 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi)
Nov 28 08:46:06 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:46:06 np0005538515.localdomain podman[91684]: 2025-11-28 08:46:06.437476774 +0000 UTC m=+0.531220601 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:46:06 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:46:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:46:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:46:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:46:10 np0005538515.localdomain systemd[1]: tmp-crun.nJ9R3C.mount: Deactivated successfully.
Nov 28 08:46:10 np0005538515.localdomain podman[91791]: 2025-11-28 08:46:10.043550143 +0000 UTC m=+0.147949022 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, 
io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, distribution-scope=public)
Nov 28 08:46:10 np0005538515.localdomain podman[91793]: 2025-11-28 08:46:10.090564399 +0000 UTC m=+0.189288414 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Nov 28 08:46:10 np0005538515.localdomain podman[91791]: 2025-11-28 08:46:10.094817399 +0000 UTC m=+0.199216328 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git)
Nov 28 08:46:10 np0005538515.localdomain podman[91791]: unhealthy
Nov 28 08:46:10 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:46:10 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:46:10 np0005538515.localdomain podman[91792]: 2025-11-28 08:46:10.012518545 +0000 UTC m=+0.114011075 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:46:10 np0005538515.localdomain podman[91793]: 2025-11-28 08:46:10.127397544 +0000 UTC m=+0.226121529 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public)
Nov 28 08:46:10 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:46:10 np0005538515.localdomain podman[91792]: 2025-11-28 08:46:10.144637431 +0000 UTC m=+0.246129921 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute)
Nov 28 08:46:10 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:46:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:46:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:46:32 np0005538515.localdomain podman[91867]: 2025-11-28 08:46:32.987929228 +0000 UTC m=+0.090061393 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.12, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git)
Nov 28 08:46:33 np0005538515.localdomain podman[91867]: 2025-11-28 08:46:33.024117023 +0000 UTC m=+0.126249208 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:46:33 np0005538515.localdomain podman[91866]: 2025-11-28 08:46:33.036533863 +0000 UTC m=+0.140670099 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:46:33 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:46:33 np0005538515.localdomain podman[91866]: 2025-11-28 08:46:33.22333201 +0000 UTC m=+0.327468216 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:46:33 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:46:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:46:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:46:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:46:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:46:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:46:36 np0005538515.localdomain systemd[1]: tmp-crun.jXzA5A.mount: Deactivated successfully.
Nov 28 08:46:36 np0005538515.localdomain podman[91913]: 2025-11-28 08:46:36.992459011 +0000 UTC m=+0.099932254 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:46:37 np0005538515.localdomain podman[91916]: 2025-11-28 08:46:37.053681671 +0000 UTC m=+0.151196080 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, version=17.1.12, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:46:37 np0005538515.localdomain podman[91922]: 2025-11-28 08:46:37.101472211 +0000 UTC m=+0.192181712 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Nov 28 08:46:37 np0005538515.localdomain podman[91915]: 2025-11-28 08:46:36.972362057 +0000 UTC m=+0.075860058 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:46:37 np0005538515.localdomain podman[91913]: 2025-11-28 08:46:37.125937758 +0000 UTC m=+0.233411011 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, release=1761123044)
Nov 28 08:46:37 np0005538515.localdomain podman[91914]: 2025-11-28 08:46:37.032277227 +0000 UTC m=+0.135317445 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:46:37 np0005538515.localdomain podman[91915]: 2025-11-28 08:46:37.154918434 +0000 UTC m=+0.258416495 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, release=1761123044, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:46:37 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:46:37 np0005538515.localdomain podman[91914]: 2025-11-28 08:46:37.173969736 +0000 UTC m=+0.277009924 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Nov 28 08:46:37 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:46:37 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:46:37 np0005538515.localdomain podman[91916]: 2025-11-28 08:46:37.278573212 +0000 UTC m=+0.376087621 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:46:37 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:46:37 np0005538515.localdomain podman[91922]: 2025-11-28 08:46:37.54657119 +0000 UTC m=+0.637280711 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:46:37 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:46:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:46:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:46:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:46:40 np0005538515.localdomain podman[92022]: 2025-11-28 08:46:40.973445644 +0000 UTC m=+0.077461198 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, container_name=nova_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:46:41 np0005538515.localdomain podman[92022]: 2025-11-28 08:46:41.023880015 +0000 UTC m=+0.127895569 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:46:41 np0005538515.localdomain podman[92021]: 2025-11-28 08:46:41.033436086 +0000 UTC m=+0.140171143 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:46:41 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:46:41 np0005538515.localdomain podman[92023]: 2025-11-28 08:46:41.080835064 +0000 UTC m=+0.182419534 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent)
Nov 28 08:46:41 np0005538515.localdomain podman[92021]: 2025-11-28 08:46:41.103401584 +0000 UTC m=+0.210136621 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:46:41 np0005538515.localdomain podman[92021]: unhealthy
Nov 28 08:46:41 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:46:41 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:46:41 np0005538515.localdomain podman[92023]: 2025-11-28 08:46:41.12226366 +0000 UTC m=+0.223848160 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:46:41 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Deactivated successfully.
Nov 28 08:46:53 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:46:53 np0005538515.localdomain recover_tripleo_nova_virtqemud[92098]: 62642
Nov 28 08:46:53 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:46:53 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:46:59 np0005538515.localdomain sudo[92099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:46:59 np0005538515.localdomain sudo[92099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:46:59 np0005538515.localdomain sudo[92099]: pam_unix(sudo:session): session closed for user root
Nov 28 08:46:59 np0005538515.localdomain sudo[92114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:46:59 np0005538515.localdomain sudo[92114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:47:00 np0005538515.localdomain sudo[92114]: pam_unix(sudo:session): session closed for user root
Nov 28 08:47:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:47:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:47:03 np0005538515.localdomain podman[92162]: 2025-11-28 08:47:03.977431439 +0000 UTC m=+0.082400888 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team)
Nov 28 08:47:04 np0005538515.localdomain podman[92163]: 2025-11-28 08:47:04.028848231 +0000 UTC m=+0.134058977 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, name=rhosp17/openstack-collectd, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:47:04 np0005538515.localdomain podman[92163]: 2025-11-28 08:47:04.068492652 +0000 UTC m=+0.173703398 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container)
Nov 28 08:47:04 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:47:04 np0005538515.localdomain podman[92162]: 2025-11-28 08:47:04.213708578 +0000 UTC m=+0.318677987 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com)
Nov 28 08:47:04 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:47:04 np0005538515.localdomain sudo[92211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:47:04 np0005538515.localdomain sudo[92211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:47:04 np0005538515.localdomain sudo[92211]: pam_unix(sudo:session): session closed for user root
Nov 28 08:47:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:47:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:47:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:47:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:47:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:47:07 np0005538515.localdomain systemd[1]: tmp-crun.WSNp0s.mount: Deactivated successfully.
Nov 28 08:47:07 np0005538515.localdomain podman[92227]: 2025-11-28 08:47:07.989127012 +0000 UTC m=+0.092165866 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 08:47:08 np0005538515.localdomain podman[92227]: 2025-11-28 08:47:08.012145936 +0000 UTC m=+0.115184800 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true)
Nov 28 08:47:08 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:47:08 np0005538515.localdomain podman[92248]: 2025-11-28 08:47:08.082749843 +0000 UTC m=+0.131959053 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, tcib_managed=true, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:47:08 np0005538515.localdomain podman[92246]: 2025-11-28 08:47:08.087726945 +0000 UTC m=+0.143399402 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 
iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid)
Nov 28 08:47:08 np0005538515.localdomain podman[92247]: 2025-11-28 08:47:08.143304943 +0000 UTC m=+0.194131563 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:47:08 np0005538515.localdomain podman[92246]: 2025-11-28 08:47:08.171479634 +0000 UTC m=+0.227152061 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:47:08 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:47:08 np0005538515.localdomain podman[92226]: 2025-11-28 08:47:08.187268235 +0000 UTC m=+0.292900959 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044)
Nov 28 08:47:08 np0005538515.localdomain podman[92226]: 2025-11-28 08:47:08.192169375 +0000 UTC m=+0.297802129 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible)
Nov 28 08:47:08 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:47:08 np0005538515.localdomain podman[92247]: 2025-11-28 08:47:08.224579416 +0000 UTC m=+0.275406076 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public)
Nov 28 08:47:08 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:47:08 np0005538515.localdomain podman[92248]: 2025-11-28 08:47:08.433361424 +0000 UTC m=+0.482570604 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, container_name=nova_migration_target, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:47:08 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:47:08 np0005538515.localdomain systemd[1]: tmp-crun.TXFubs.mount: Deactivated successfully.
Nov 28 08:47:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:47:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:47:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:47:11 np0005538515.localdomain systemd[1]: tmp-crun.uX9cU5.mount: Deactivated successfully.
Nov 28 08:47:11 np0005538515.localdomain podman[92340]: 2025-11-28 08:47:11.993906052 +0000 UTC m=+0.099564912 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, 
build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Nov 28 08:47:12 np0005538515.localdomain podman[92340]: 2025-11-28 08:47:12.043192808 +0000 UTC m=+0.148851668 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:47:12 np0005538515.localdomain systemd[1]: tmp-crun.zyAWSe.mount: Deactivated successfully.
Nov 28 08:47:12 np0005538515.localdomain podman[92340]: unhealthy
Nov 28 08:47:12 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:12 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:47:12 np0005538515.localdomain podman[92341]: 2025-11-28 08:47:12.045342143 +0000 UTC m=+0.147283180 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, config_id=tripleo_step5, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Nov 28 08:47:12 np0005538515.localdomain podman[92342]: 2025-11-28 08:47:12.102090758 +0000 UTC m=+0.203657614 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:47:12 np0005538515.localdomain podman[92342]: 2025-11-28 08:47:12.116535609 +0000 UTC m=+0.218102465 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:47:12 np0005538515.localdomain podman[92342]: unhealthy
Nov 28 08:47:12 np0005538515.localdomain podman[92341]: 2025-11-28 08:47:12.125538354 +0000 UTC m=+0.227479441 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:47:12 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:12 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:47:12 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:47:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:47:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:47:34 np0005538515.localdomain podman[92406]: 2025-11-28 08:47:34.972240351 +0000 UTC m=+0.083711928 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, url=https://www.redhat.com, 
version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc.)
Nov 28 08:47:35 np0005538515.localdomain podman[92407]: 2025-11-28 08:47:35.044986283 +0000 UTC m=+0.146047863 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3)
Nov 28 08:47:35 np0005538515.localdomain podman[92407]: 2025-11-28 08:47:35.079450396 +0000 UTC m=+0.180511976 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:47:35 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:47:35 np0005538515.localdomain podman[92406]: 2025-11-28 08:47:35.22687615 +0000 UTC m=+0.338347717 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-qdrouterd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:47:35 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:47:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:47:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:47:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:47:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:47:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:47:39 np0005538515.localdomain podman[92456]: 2025-11-28 08:47:38.999424425 +0000 UTC m=+0.096721236 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 08:47:39 np0005538515.localdomain podman[92456]: 2025-11-28 08:47:39.02642297 +0000 UTC m=+0.123719761 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: tmp-crun.NenImy.mount: Deactivated successfully.
Nov 28 08:47:39 np0005538515.localdomain podman[92454]: 2025-11-28 08:47:39.047307278 +0000 UTC m=+0.148732435 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:47:39 np0005538515.localdomain podman[92453]: 2025-11-28 08:47:39.085823505 +0000 UTC m=+0.190445750 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:47:39 np0005538515.localdomain podman[92455]: 2025-11-28 08:47:39.100116271 +0000 UTC m=+0.200338761 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., container_name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z)
Nov 28 08:47:39 np0005538515.localdomain podman[92455]: 2025-11-28 08:47:39.137357929 +0000 UTC m=+0.237580419 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:47:39 np0005538515.localdomain podman[92459]: 2025-11-28 08:47:39.151583883 +0000 UTC m=+0.245897042 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:47:39 np0005538515.localdomain podman[92453]: 2025-11-28 08:47:39.172347608 +0000 UTC m=+0.276969833 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:47:39 np0005538515.localdomain podman[92454]: 2025-11-28 08:47:39.203559672 +0000 UTC m=+0.304984789 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:47:39 np0005538515.localdomain podman[92459]: 2025-11-28 08:47:39.481502403 +0000 UTC m=+0.575815512 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:47:39 np0005538515.localdomain systemd[1]: tmp-crun.0fUfPt.mount: Deactivated successfully.
Nov 28 08:47:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:47:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:47:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:47:42 np0005538515.localdomain systemd[1]: tmp-crun.2jSQxH.mount: Deactivated successfully.
Nov 28 08:47:42 np0005538515.localdomain podman[92566]: 2025-11-28 08:47:42.986876836 +0000 UTC m=+0.094881209 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller)
Nov 28 08:47:43 np0005538515.localdomain podman[92568]: 2025-11-28 08:47:43.032358946 +0000 UTC m=+0.131634253 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:47:43 np0005538515.localdomain podman[92566]: 2025-11-28 08:47:43.058429572 +0000 UTC m=+0.166433905 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, release=1761123044, config_id=tripleo_step4, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:47:43 np0005538515.localdomain podman[92566]: unhealthy
Nov 28 08:47:43 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:43 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:47:43 np0005538515.localdomain podman[92568]: 2025-11-28 08:47:43.102923902 +0000 UTC m=+0.202199209 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:47:43 np0005538515.localdomain podman[92568]: unhealthy
Nov 28 08:47:43 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:47:43 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:47:43 np0005538515.localdomain podman[92567]: 2025-11-28 08:47:43.191723955 +0000 UTC m=+0.294936523 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute)
Nov 28 08:47:43 np0005538515.localdomain podman[92567]: 2025-11-28 08:47:43.22103631 +0000 UTC m=+0.324248858 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:47:43 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:47:43 np0005538515.localdomain systemd[1]: tmp-crun.xr2iUb.mount: Deactivated successfully.
Nov 28 08:48:04 np0005538515.localdomain sudo[92632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:48:04 np0005538515.localdomain sudo[92632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:04 np0005538515.localdomain sudo[92632]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:04 np0005538515.localdomain sudo[92647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:48:04 np0005538515.localdomain sudo[92647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:05 np0005538515.localdomain sudo[92647]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:05 np0005538515.localdomain sudo[92695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:48:05 np0005538515.localdomain sudo[92695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:48:05 np0005538515.localdomain sudo[92695]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:48:05 np0005538515.localdomain sudo[92722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 08:48:05 np0005538515.localdomain sudo[92722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:05 np0005538515.localdomain systemd[1]: tmp-crun.AuNkhT.mount: Deactivated successfully.
Nov 28 08:48:05 np0005538515.localdomain systemd[1]: tmp-crun.mMIwIK.mount: Deactivated successfully.
Nov 28 08:48:05 np0005538515.localdomain podman[92711]: 2025-11-28 08:48:05.86219018 +0000 UTC m=+0.157808102 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
build-date=2025-11-18T22:51:28Z, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd)
Nov 28 08:48:05 np0005538515.localdomain podman[92711]: 2025-11-28 08:48:05.874370363 +0000 UTC m=+0.169988225 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 08:48:05 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:48:05 np0005538515.localdomain podman[92709]: 2025-11-28 08:48:05.815301698 +0000 UTC m=+0.110852018 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, release=1761123044, version=17.1.12, maintainer=OpenStack 
TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:48:06 np0005538515.localdomain podman[92709]: 2025-11-28 08:48:06.013252885 +0000 UTC m=+0.308803185 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr)
Nov 28 08:48:06 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:48:06 np0005538515.localdomain sudo[92722]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:09 np0005538515.localdomain sudo[92793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:48:09 np0005538515.localdomain sudo[92793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:48:09 np0005538515.localdomain sudo[92793]: pam_unix(sudo:session): session closed for user root
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:48:09 np0005538515.localdomain podman[92808]: 2025-11-28 08:48:09.627765002 +0000 UTC m=+0.094596551 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:48:09 np0005538515.localdomain podman[92808]: 2025-11-28 08:48:09.640958645 +0000 UTC m=+0.107790204 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:48:09 np0005538515.localdomain podman[92810]: 2025-11-28 08:48:09.643839093 +0000 UTC m=+0.104875195 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vendor=Red Hat, Inc., container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:48:09 np0005538515.localdomain podman[92816]: 2025-11-28 08:48:09.730216442 +0000 UTC m=+0.187525020 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Nov 28 08:48:09 np0005538515.localdomain podman[92810]: 2025-11-28 08:48:09.780409236 +0000 UTC m=+0.241445278 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com, release=1761123044, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:48:09 np0005538515.localdomain podman[92809]: 2025-11-28 08:48:09.83490019 +0000 UTC m=+0.298362406 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, version=17.1.12)
Nov 28 08:48:09 np0005538515.localdomain podman[92811]: 2025-11-28 08:48:09.783782958 +0000 UTC m=+0.243966074 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:48:09 np0005538515.localdomain podman[92809]: 2025-11-28 08:48:09.864434073 +0000 UTC m=+0.327896299 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:48:09 np0005538515.localdomain podman[92811]: 2025-11-28 08:48:09.918423052 +0000 UTC m=+0.378606168 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:48:09 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:48:10 np0005538515.localdomain podman[92816]: 2025-11-28 08:48:10.09969325 +0000 UTC m=+0.557001838 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:48:10 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:48:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:48:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:48:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:48:13 np0005538515.localdomain podman[92924]: 2025-11-28 08:48:13.973014814 +0000 UTC m=+0.079978404 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com)
Nov 28 08:48:14 np0005538515.localdomain podman[92926]: 2025-11-28 08:48:14.034827183 +0000 UTC m=+0.136144040 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:48:14 np0005538515.localdomain podman[92924]: 2025-11-28 08:48:14.06551261 +0000 UTC m=+0.172476180 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:48:14 np0005538515.localdomain podman[92924]: unhealthy
Nov 28 08:48:14 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:14 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:48:14 np0005538515.localdomain podman[92926]: 2025-11-28 08:48:14.097390695 +0000 UTC m=+0.198707602 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, container_name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Nov 28 08:48:14 np0005538515.localdomain podman[92926]: unhealthy
Nov 28 08:48:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:48:14 np0005538515.localdomain podman[92925]: 2025-11-28 08:48:14.069262075 +0000 UTC m=+0.173600274 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4)
Nov 28 08:48:14 np0005538515.localdomain podman[92925]: 2025-11-28 08:48:14.148285489 +0000 UTC m=+0.252623718 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step5, vendor=Red Hat, Inc., container_name=nova_compute, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12)
Nov 28 08:48:14 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:48:33 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:48:33 np0005538515.localdomain recover_tripleo_nova_virtqemud[92994]: 62642
Nov 28 08:48:33 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:48:33 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:48:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:48:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:48:36 np0005538515.localdomain systemd[1]: tmp-crun.TeAzvV.mount: Deactivated successfully.
Nov 28 08:48:36 np0005538515.localdomain podman[92995]: 2025-11-28 08:48:36.985508059 +0000 UTC m=+0.090195107 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, 
name=rhosp17/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1)
Nov 28 08:48:37 np0005538515.localdomain podman[92996]: 2025-11-28 08:48:37.093312083 +0000 UTC m=+0.195681810 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3)
Nov 28 08:48:37 np0005538515.localdomain podman[92996]: 2025-11-28 08:48:37.10535388 +0000 UTC m=+0.207723587 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:48:37 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:48:37 np0005538515.localdomain podman[92995]: 2025-11-28 08:48:37.20255586 +0000 UTC m=+0.307242848 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public)
Nov 28 08:48:37 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:48:37 np0005538515.localdomain systemd[1]: tmp-crun.So191M.mount: Deactivated successfully.
Nov 28 08:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:48:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:48:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:48:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:48:39 np0005538515.localdomain podman[93046]: 2025-11-28 08:48:39.982340406 +0000 UTC m=+0.086939567 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, container_name=iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:48:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:48:39 np0005538515.localdomain podman[93046]: 2025-11-28 08:48:39.995788297 +0000 UTC m=+0.100387438 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: tmp-crun.dk6ykH.mount: Deactivated successfully.
Nov 28 08:48:40 np0005538515.localdomain podman[93047]: 2025-11-28 08:48:40.091121669 +0000 UTC m=+0.191794731 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:48:40 np0005538515.localdomain podman[93045]: 2025-11-28 08:48:40.140942181 +0000 UTC m=+0.246333106 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:48:40 np0005538515.localdomain podman[93047]: 2025-11-28 08:48:40.149617937 +0000 UTC m=+0.250291099 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4)
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:48:40 np0005538515.localdomain podman[93122]: 2025-11-28 08:48:40.232853689 +0000 UTC m=+0.081395718 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:48:40 np0005538515.localdomain podman[93045]: 2025-11-28 08:48:40.25349653 +0000 UTC m=+0.358887415 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible)
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:48:40 np0005538515.localdomain podman[93087]: 2025-11-28 08:48:40.331001757 +0000 UTC m=+0.328315021 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:48:40 np0005538515.localdomain podman[93087]: 2025-11-28 08:48:40.355486486 +0000 UTC m=+0.352799790 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:48:40 np0005538515.localdomain podman[93122]: 2025-11-28 08:48:40.675543903 +0000 UTC m=+0.524085972 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team)
Nov 28 08:48:40 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:48:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:48:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:48:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:48:44 np0005538515.localdomain podman[93160]: 2025-11-28 08:48:44.978933995 +0000 UTC m=+0.083698017 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public)
Nov 28 08:48:45 np0005538515.localdomain podman[93160]: 2025-11-28 08:48:45.019745073 +0000 UTC m=+0.124509065 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, release=1761123044, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 28 08:48:45 np0005538515.localdomain podman[93160]: unhealthy
Nov 28 08:48:45 np0005538515.localdomain podman[93161]: 2025-11-28 08:48:45.035264057 +0000 UTC m=+0.136726718 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:48:45 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:45 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:48:45 np0005538515.localdomain systemd[1]: tmp-crun.y2Z7d3.mount: Deactivated successfully.
Nov 28 08:48:45 np0005538515.localdomain podman[93161]: 2025-11-28 08:48:45.09755852 +0000 UTC m=+0.199021191 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, release=1761123044)
Nov 28 08:48:45 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:48:45 np0005538515.localdomain podman[93162]: 2025-11-28 08:48:45.098300253 +0000 UTC m=+0.194317168 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team)
Nov 28 08:48:45 np0005538515.localdomain podman[93162]: 2025-11-28 08:48:45.185120945 +0000 UTC m=+0.281137890 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1761123044, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Nov 28 08:48:45 np0005538515.localdomain podman[93162]: unhealthy
Nov 28 08:48:45 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:48:45 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:49:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:49:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:49:07 np0005538515.localdomain systemd[1]: tmp-crun.IKmhMn.mount: Deactivated successfully.
Nov 28 08:49:07 np0005538515.localdomain podman[93225]: 2025-11-28 08:49:07.994023723 +0000 UTC m=+0.094622092 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Nov 28 08:49:08 np0005538515.localdomain systemd[1]: tmp-crun.41YAkR.mount: Deactivated successfully.
Nov 28 08:49:08 np0005538515.localdomain podman[93224]: 2025-11-28 08:49:08.033013124 +0000 UTC m=+0.136538883 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr)
Nov 28 08:49:08 np0005538515.localdomain podman[93225]: 2025-11-28 08:49:08.078590736 +0000 UTC m=+0.179189075 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z)
Nov 28 08:49:08 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:49:08 np0005538515.localdomain podman[93224]: 2025-11-28 08:49:08.29047519 +0000 UTC m=+0.394000949 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:49:08 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:49:09 np0005538515.localdomain sudo[93274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:49:09 np0005538515.localdomain sudo[93274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:09 np0005538515.localdomain sudo[93274]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:09 np0005538515.localdomain sudo[93289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:49:09 np0005538515.localdomain sudo[93289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:10 np0005538515.localdomain sudo[93289]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:10 np0005538515.localdomain sudo[93336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:49:10 np0005538515.localdomain sudo[93336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:10 np0005538515.localdomain sudo[93336]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:49:10 np0005538515.localdomain sudo[93351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 08:49:10 np0005538515.localdomain sudo[93351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:10 np0005538515.localdomain podman[93363]: 2025-11-28 08:49:10.68059673 +0000 UTC m=+0.087861925 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com)
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:49:10 np0005538515.localdomain podman[93363]: 2025-11-28 08:49:10.71953893 +0000 UTC m=+0.126804125 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, 
build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: tmp-crun.85ozvx.mount: Deactivated successfully.
Nov 28 08:49:10 np0005538515.localdomain podman[93357]: 2025-11-28 08:49:10.734628951 +0000 UTC m=+0.150505509 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:49:10 np0005538515.localdomain podman[93357]: 2025-11-28 08:49:10.747386261 +0000 UTC m=+0.163262829 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, container_name=logrotate_crond, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12)
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:49:10 np0005538515.localdomain podman[93409]: 2025-11-28 08:49:10.799933446 +0000 UTC m=+0.088453223 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=nova_migration_target, vcs-type=git, name=rhosp17/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:49:10 np0005538515.localdomain podman[93370]: 2025-11-28 08:49:10.841231487 +0000 UTC m=+0.246316446 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, 
batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:49:10 np0005538515.localdomain podman[93360]: 2025-11-28 08:49:10.889849343 +0000 UTC m=+0.305653999 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:49:10 np0005538515.localdomain podman[93370]: 2025-11-28 08:49:10.900482788 +0000 UTC m=+0.305567737 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:49:10 np0005538515.localdomain podman[93360]: 2025-11-28 08:49:10.924500012 +0000 UTC m=+0.340304658 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true)
Nov 28 08:49:10 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:49:11 np0005538515.localdomain podman[93409]: 2025-11-28 08:49:11.224464966 +0000 UTC m=+0.512984733 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 2025-11-28 08:49:11.234829863 +0000 UTC m=+0.076019114 container create 1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_sutherland, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, name=rhceph, release=553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64)
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: Started libpod-conmon-1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f.scope.
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 2025-11-28 08:49:11.293383972 +0000 UTC m=+0.134573263 container init 1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_sutherland, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 2025-11-28 08:49:11.202859995 +0000 UTC m=+0.044049256 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 2025-11-28 08:49:11.306590375 +0000 UTC m=+0.147779656 container start 1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_sutherland, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, release=553, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 2025-11-28 08:49:11.308944326 +0000 UTC m=+0.150133617 container attach 1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_sutherland, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 28 08:49:11 np0005538515.localdomain angry_sutherland[93532]: 167 167
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: libpod-1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f.scope: Deactivated successfully.
Nov 28 08:49:11 np0005538515.localdomain podman[93517]: 2025-11-28 08:49:11.313804955 +0000 UTC m=+0.154994246 container died 1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_sutherland, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph)
Nov 28 08:49:11 np0005538515.localdomain podman[93538]: 2025-11-28 08:49:11.418184414 +0000 UTC m=+0.089168665 container remove 1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_sutherland, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, distribution-scope=public, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: libpod-conmon-1dd1df3bc840dfca20bb34d3b19a63d370c34aa64b9d2a0a4d6370f3e9b8a44f.scope: Deactivated successfully.
Nov 28 08:49:11 np0005538515.localdomain podman[93560]: 
Nov 28 08:49:11 np0005538515.localdomain podman[93560]: 2025-11-28 08:49:11.660175788 +0000 UTC m=+0.085994419 container create ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_perlman, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: Started libpod-conmon-ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6.scope.
Nov 28 08:49:11 np0005538515.localdomain podman[93560]: 2025-11-28 08:49:11.627710925 +0000 UTC m=+0.053529606 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 08:49:11 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c6b6c643f4130ffc3706e1d537273f2123af64f3aaeec13c9f99a6da6f1157/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 08:49:11 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c6b6c643f4130ffc3706e1d537273f2123af64f3aaeec13c9f99a6da6f1157/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 08:49:11 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c6b6c643f4130ffc3706e1d537273f2123af64f3aaeec13c9f99a6da6f1157/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 08:49:11 np0005538515.localdomain podman[93560]: 2025-11-28 08:49:11.7443595 +0000 UTC m=+0.170178141 container init ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_perlman, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Nov 28 08:49:11 np0005538515.localdomain systemd[1]: tmp-crun.z63Cm4.mount: Deactivated successfully.
Nov 28 08:49:11 np0005538515.localdomain podman[93560]: 2025-11-28 08:49:11.760157402 +0000 UTC m=+0.185975993 container start ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_perlman, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, version=7, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 08:49:11 np0005538515.localdomain podman[93560]: 2025-11-28 08:49:11.76039447 +0000 UTC m=+0.186213101 container attach ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_perlman, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]: [
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:     {
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "available": false,
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "ceph_device": false,
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "lsm_data": {},
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "lvs": [],
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "path": "/dev/sr0",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "rejected_reasons": [
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "Insufficient space (<5GB)",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "Has a FileSystem"
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         ],
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         "sys_api": {
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "actuators": null,
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "device_nodes": "sr0",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "human_readable_size": "482.00 KB",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "id_bus": "ata",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "model": "QEMU DVD-ROM",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "nr_requests": "2",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "partitions": {},
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "path": "/dev/sr0",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "removable": "1",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "rev": "2.5+",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "ro": "0",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "rotational": "1",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "sas_address": "",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "sas_device_handle": "",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "scheduler_mode": "mq-deadline",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "sectors": 0,
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "sectorsize": "2048",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "size": 493568.0,
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "support_discard": "0",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "type": "disk",
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:             "vendor": "QEMU"
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:         }
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]:     }
Nov 28 08:49:12 np0005538515.localdomain wizardly_perlman[93576]: ]
Nov 28 08:49:12 np0005538515.localdomain systemd[1]: libpod-ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6.scope: Deactivated successfully.
Nov 28 08:49:12 np0005538515.localdomain systemd[1]: libpod-ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6.scope: Consumed 1.048s CPU time.
Nov 28 08:49:12 np0005538515.localdomain podman[93560]: 2025-11-28 08:49:12.757155681 +0000 UTC m=+1.182974332 container died ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_perlman, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=553, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 08:49:12 np0005538515.localdomain systemd[1]: tmp-crun.a12E0H.mount: Deactivated successfully.
Nov 28 08:49:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e2c6b6c643f4130ffc3706e1d537273f2123af64f3aaeec13c9f99a6da6f1157-merged.mount: Deactivated successfully.
Nov 28 08:49:12 np0005538515.localdomain podman[95609]: 2025-11-28 08:49:12.866816101 +0000 UTC m=+0.096915011 container remove ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_perlman, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, ceph=True, release=553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_BRANCH=main, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 28 08:49:12 np0005538515.localdomain systemd[1]: libpod-conmon-ef7ad5e4773c66a693d67323a7d8be57322232f40de8e0c51e7534c66989eca6.scope: Deactivated successfully.
Nov 28 08:49:12 np0005538515.localdomain sudo[93351]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:13 np0005538515.localdomain sudo[95624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:49:13 np0005538515.localdomain sudo[95624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:49:13 np0005538515.localdomain sudo[95624]: pam_unix(sudo:session): session closed for user root
Nov 28 08:49:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:49:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:49:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:49:15 np0005538515.localdomain podman[95639]: 2025-11-28 08:49:15.980701604 +0000 UTC m=+0.083190733 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:49:16 np0005538515.localdomain podman[95640]: 2025-11-28 08:49:16.02803114 +0000 UTC m=+0.132001283 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute)
Nov 28 08:49:16 np0005538515.localdomain podman[95639]: 2025-11-28 08:49:16.053037584 +0000 UTC m=+0.155526703 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 08:49:16 np0005538515.localdomain podman[95640]: 2025-11-28 08:49:16.080028118 +0000 UTC m=+0.183998341 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp17/openstack-nova-compute, release=1761123044, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64)
Nov 28 08:49:16 np0005538515.localdomain podman[95641]: 2025-11-28 08:49:16.092190261 +0000 UTC m=+0.194899446 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Nov 28 08:49:16 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:49:16 np0005538515.localdomain podman[95641]: 2025-11-28 08:49:16.104125945 +0000 UTC m=+0.206835180 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:49:16 np0005538515.localdomain podman[95639]: unhealthy
Nov 28 08:49:16 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:16 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:49:16 np0005538515.localdomain podman[95641]: unhealthy
Nov 28 08:49:16 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:16 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:49:16 np0005538515.localdomain systemd[1]: tmp-crun.d2ri3Q.mount: Deactivated successfully.
Nov 28 08:49:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:49:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:49:38 np0005538515.localdomain podman[95701]: 2025-11-28 08:49:38.978186445 +0000 UTC m=+0.079750514 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:49:39 np0005538515.localdomain podman[95701]: 2025-11-28 08:49:39.009675804 +0000 UTC m=+0.111239813 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, container_name=collectd, tcib_managed=true)
Nov 28 08:49:39 np0005538515.localdomain podman[95700]: 2025-11-28 08:49:39.023420366 +0000 UTC m=+0.127575975 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container)
Nov 28 08:49:39 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:49:39 np0005538515.localdomain podman[95700]: 2025-11-28 08:49:39.204037262 +0000 UTC m=+0.308192821 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Nov 28 08:49:39 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:49:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:49:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:49:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:49:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:49:40 np0005538515.localdomain podman[95753]: 2025-11-28 08:49:40.991653384 +0000 UTC m=+0.097044146 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Nov 28 08:49:41 np0005538515.localdomain podman[95753]: 2025-11-28 08:49:41.024155214 +0000 UTC m=+0.129545966 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:49:41 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:49:41 np0005538515.localdomain podman[95752]: 2025-11-28 08:49:41.070993304 +0000 UTC m=+0.177891172 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, tcib_managed=true, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1)
Nov 28 08:49:41 np0005538515.localdomain podman[95781]: 2025-11-28 08:49:41.12741892 +0000 UTC m=+0.129658849 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:49:41 np0005538515.localdomain podman[95781]: 2025-11-28 08:49:41.186671203 +0000 UTC m=+0.188911142 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:49:41 np0005538515.localdomain podman[95783]: 2025-11-28 08:49:41.188604701 +0000 UTC m=+0.188974633 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc.)
Nov 28 08:49:41 np0005538515.localdomain podman[95752]: 2025-11-28 08:49:41.208675979 +0000 UTC m=+0.315573847 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, vcs-type=git, container_name=logrotate_crond)
Nov 28 08:49:41 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:49:41 np0005538515.localdomain podman[95783]: 2025-11-28 08:49:41.223229287 +0000 UTC m=+0.223599189 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Nov 28 08:49:41 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:49:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:49:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:49:41 np0005538515.localdomain podman[95840]: 2025-11-28 08:49:41.338990957 +0000 UTC m=+0.078326920 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, architecture=x86_64, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 08:49:41 np0005538515.localdomain podman[95840]: 2025-11-28 08:49:41.714516527 +0000 UTC m=+0.453852550 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:49:41 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:49:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:49:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:49:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:49:46 np0005538515.localdomain podman[95865]: 2025-11-28 08:49:46.985633074 +0000 UTC m=+0.084608624 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 08:49:47 np0005538515.localdomain podman[95865]: 2025-11-28 08:49:47.026020476 +0000 UTC m=+0.124996016 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true)
Nov 28 08:49:47 np0005538515.localdomain podman[95865]: unhealthy
Nov 28 08:49:47 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:47 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:49:47 np0005538515.localdomain systemd[1]: tmp-crun.w1QFGd.mount: Deactivated successfully.
Nov 28 08:49:47 np0005538515.localdomain podman[95866]: 2025-11-28 08:49:47.051661164 +0000 UTC m=+0.147300660 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, release=1761123044, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1)
Nov 28 08:49:47 np0005538515.localdomain podman[95866]: 2025-11-28 08:49:47.075517049 +0000 UTC m=+0.171156545 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Nov 28 08:49:47 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:49:47 np0005538515.localdomain podman[95867]: 2025-11-28 08:49:47.092962185 +0000 UTC m=+0.184852826 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 08:49:47 np0005538515.localdomain podman[95867]: 2025-11-28 08:49:47.10937873 +0000 UTC m=+0.201269371 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 08:49:47 np0005538515.localdomain podman[95867]: unhealthy
Nov 28 08:49:47 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:49:47 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:50:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:50:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:50:09 np0005538515.localdomain systemd[1]: tmp-crun.wLofg3.mount: Deactivated successfully.
Nov 28 08:50:09 np0005538515.localdomain podman[95930]: 2025-11-28 08:50:09.97627566 +0000 UTC m=+0.084244413 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:50:09 np0005538515.localdomain podman[95930]: 2025-11-28 08:50:09.987390782 +0000 UTC m=+0.095359595 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:50:09 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:50:10 np0005538515.localdomain podman[95929]: 2025-11-28 08:50:10.083398525 +0000 UTC m=+0.192230254 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:50:10 np0005538515.localdomain podman[95929]: 2025-11-28 08:50:10.296462208 +0000 UTC m=+0.405293917 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:50:10 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:50:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:50:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:50:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:50:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:50:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:50:11 np0005538515.localdomain podman[95985]: 2025-11-28 08:50:11.99854889 +0000 UTC m=+0.094159047 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:50:12 np0005538515.localdomain podman[95978]: 2025-11-28 08:50:12.046761313 +0000 UTC m=+0.149155079 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, version=17.1.12)
Nov 28 08:50:12 np0005538515.localdomain podman[95978]: 2025-11-28 08:50:12.058415362 +0000 UTC m=+0.160809198 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, 
version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid)
Nov 28 08:50:12 np0005538515.localdomain systemd[1]: tmp-crun.Wd0nEc.mount: Deactivated successfully.
Nov 28 08:50:12 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:50:12 np0005538515.localdomain podman[95976]: 2025-11-28 08:50:12.087933919 +0000 UTC m=+0.197428483 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Nov 28 08:50:12 np0005538515.localdomain podman[95977]: 2025-11-28 08:50:12.104424306 +0000 UTC m=+0.207663967 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 28 08:50:12 np0005538515.localdomain podman[95976]: 2025-11-28 08:50:12.118481769 +0000 UTC m=+0.227976353 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible)
Nov 28 08:50:12 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:50:12 np0005538515.localdomain podman[95977]: 2025-11-28 08:50:12.139392113 +0000 UTC m=+0.242631824 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044)
Nov 28 08:50:12 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:50:12 np0005538515.localdomain podman[95984]: 2025-11-28 08:50:12.163521414 +0000 UTC m=+0.260897855 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:50:12 np0005538515.localdomain podman[95984]: 2025-11-28 08:50:12.215834863 +0000 UTC m=+0.313211394 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 08:50:12 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:50:12 np0005538515.localdomain podman[95985]: 2025-11-28 08:50:12.385446861 +0000 UTC m=+0.481057028 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, 
architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target)
Nov 28 08:50:12 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:50:13 np0005538515.localdomain sudo[96091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:50:13 np0005538515.localdomain sudo[96091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:13 np0005538515.localdomain sudo[96091]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:13 np0005538515.localdomain sudo[96106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 08:50:13 np0005538515.localdomain sudo[96106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:14 np0005538515.localdomain sudo[96106]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:14 np0005538515.localdomain sudo[96142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:50:14 np0005538515.localdomain sudo[96142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:14 np0005538515.localdomain sudo[96142]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:14 np0005538515.localdomain sudo[96157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:50:14 np0005538515.localdomain sudo[96157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:15 np0005538515.localdomain sudo[96157]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:15 np0005538515.localdomain sudo[96204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:50:15 np0005538515.localdomain sudo[96204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:50:15 np0005538515.localdomain sudo[96204]: pam_unix(sudo:session): session closed for user root
Nov 28 08:50:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:50:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:50:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:50:17 np0005538515.localdomain systemd[1]: tmp-crun.mQfvhH.mount: Deactivated successfully.
Nov 28 08:50:17 np0005538515.localdomain podman[96219]: 2025-11-28 08:50:17.973884097 +0000 UTC m=+0.081892939 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:50:18 np0005538515.localdomain podman[96220]: 2025-11-28 08:50:18.018458898 +0000 UTC m=+0.124748107 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5)
Nov 28 08:50:18 np0005538515.localdomain podman[96219]: 2025-11-28 08:50:18.041194107 +0000 UTC m=+0.149202959 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, batch=17.1_20251118.1, release=1761123044)
Nov 28 08:50:18 np0005538515.localdomain podman[96219]: unhealthy
Nov 28 08:50:18 np0005538515.localdomain podman[96220]: 2025-11-28 08:50:18.04941124 +0000 UTC m=+0.155700439 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:50:18 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:18 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:50:18 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:50:18 np0005538515.localdomain podman[96221]: 2025-11-28 08:50:18.129584187 +0000 UTC m=+0.229651415 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, tcib_managed=true, architecture=x86_64, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:50:18 np0005538515.localdomain podman[96221]: 2025-11-28 08:50:18.172643641 +0000 UTC m=+0.272710869 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:50:18 np0005538515.localdomain podman[96221]: unhealthy
Nov 28 08:50:18 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:18 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:50:33 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:50:33 np0005538515.localdomain recover_tripleo_nova_virtqemud[96283]: 62642
Nov 28 08:50:33 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:50:33 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:50:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:50:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:50:41 np0005538515.localdomain podman[96284]: 2025-11-28 08:50:41.000219771 +0000 UTC m=+0.103614427 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12)
Nov 28 08:50:41 np0005538515.localdomain podman[96285]: 2025-11-28 08:50:41.094547093 +0000 UTC m=+0.194154323 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, url=https://www.redhat.com)
Nov 28 08:50:41 np0005538515.localdomain podman[96285]: 2025-11-28 08:50:41.110257055 +0000 UTC m=+0.209864275 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:50:41 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:50:41 np0005538515.localdomain podman[96284]: 2025-11-28 08:50:41.226011386 +0000 UTC m=+0.329406042 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 08:50:41 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:50:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:50:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:50:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:50:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:50:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:50:42 np0005538515.localdomain podman[96335]: 2025-11-28 08:50:42.993684176 +0000 UTC m=+0.088589046 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Nov 28 08:50:43 np0005538515.localdomain podman[96334]: 2025-11-28 08:50:43.047958505 +0000 UTC m=+0.146655741 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:50:43 np0005538515.localdomain podman[96334]: 2025-11-28 08:50:43.086513041 +0000 UTC m=+0.185210257 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:50:43 np0005538515.localdomain podman[96336]: 2025-11-28 08:50:43.095172077 +0000 UTC m=+0.188567750 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Nov 28 08:50:43 np0005538515.localdomain podman[96335]: 2025-11-28 08:50:43.098561481 +0000 UTC m=+0.193466361 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:50:43 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:50:43 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:50:43 np0005538515.localdomain podman[96332]: 2025-11-28 08:50:43.143357749 +0000 UTC m=+0.245377008 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 08:50:43 np0005538515.localdomain podman[96333]: 2025-11-28 08:50:43.201955401 +0000 UTC m=+0.303924239 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:50:43 np0005538515.localdomain podman[96333]: 2025-11-28 08:50:43.224556426 +0000 UTC m=+0.326525284 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:50:43 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:50:43 np0005538515.localdomain podman[96332]: 2025-11-28 08:50:43.278229407 +0000 UTC m=+0.380248666 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1)
Nov 28 08:50:43 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:50:43 np0005538515.localdomain podman[96336]: 2025-11-28 08:50:43.438362692 +0000 UTC m=+0.531758345 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64)
Nov 28 08:50:43 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:50:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:50:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:50:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:50:48 np0005538515.localdomain systemd[1]: tmp-crun.a7PicO.mount: Deactivated successfully.
Nov 28 08:50:49 np0005538515.localdomain podman[96448]: 2025-11-28 08:50:48.99953727 +0000 UTC m=+0.102986848 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 08:50:49 np0005538515.localdomain systemd[1]: tmp-crun.7EPVlV.mount: Deactivated successfully.
Nov 28 08:50:49 np0005538515.localdomain podman[96450]: 2025-11-28 08:50:49.045859585 +0000 UTC m=+0.141362168 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Nov 28 08:50:49 np0005538515.localdomain podman[96449]: 2025-11-28 08:50:49.094807 +0000 UTC m=+0.193274496 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Nov 28 08:50:49 np0005538515.localdomain podman[96450]: 2025-11-28 08:50:49.115148007 +0000 UTC m=+0.210650590 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:50:49 np0005538515.localdomain podman[96450]: unhealthy
Nov 28 08:50:49 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:49 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:50:49 np0005538515.localdomain podman[96449]: 2025-11-28 08:50:49.150788532 +0000 UTC m=+0.249255938 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:50:49 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:50:49 np0005538515.localdomain podman[96448]: 2025-11-28 08:50:49.170980374 +0000 UTC m=+0.274429942 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc.)
Nov 28 08:50:49 np0005538515.localdomain podman[96448]: unhealthy
Nov 28 08:50:49 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:50:49 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:51:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:51:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:51:11 np0005538515.localdomain podman[96513]: 2025-11-28 08:51:11.979185457 +0000 UTC m=+0.090277957 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:51:12 np0005538515.localdomain systemd[1]: tmp-crun.XF58oF.mount: Deactivated successfully.
Nov 28 08:51:12 np0005538515.localdomain podman[96514]: 2025-11-28 08:51:12.029889888 +0000 UTC m=+0.139049359 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc.)
Nov 28 08:51:12 np0005538515.localdomain podman[96514]: 2025-11-28 08:51:12.039006338 +0000 UTC m=+0.148165829 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12)
Nov 28 08:51:12 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:51:12 np0005538515.localdomain podman[96513]: 2025-11-28 08:51:12.18732458 +0000 UTC m=+0.298417080 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:51:12 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:51:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:51:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:51:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:51:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:51:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:51:13 np0005538515.localdomain podman[96563]: 2025-11-28 08:51:13.972853248 +0000 UTC m=+0.082540509 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true)
Nov 28 08:51:14 np0005538515.localdomain podman[96564]: 2025-11-28 08:51:14.026866949 +0000 UTC m=+0.134892920 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:51:14 np0005538515.localdomain podman[96564]: 2025-11-28 08:51:14.083481481 +0000 UTC m=+0.191507522 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:51:14 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:51:14 np0005538515.localdomain podman[96572]: 2025-11-28 08:51:14.138567635 +0000 UTC m=+0.238167707 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:51:14 np0005538515.localdomain podman[96566]: 2025-11-28 08:51:14.088268257 +0000 UTC m=+0.189806597 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Nov 28 08:51:14 np0005538515.localdomain podman[96565]: 2025-11-28 08:51:14.18848771 +0000 UTC m=+0.290769934 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc.)
Nov 28 08:51:14 np0005538515.localdomain podman[96563]: 2025-11-28 08:51:14.215921264 +0000 UTC m=+0.325608525 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, tcib_managed=true)
Nov 28 08:51:14 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:51:14 np0005538515.localdomain podman[96565]: 2025-11-28 08:51:14.2275002 +0000 UTC m=+0.329782404 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:51:14 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:51:14 np0005538515.localdomain podman[96566]: 2025-11-28 08:51:14.273554056 +0000 UTC m=+0.375092416 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:11:48Z)
Nov 28 08:51:14 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:51:14 np0005538515.localdomain podman[96572]: 2025-11-28 08:51:14.536547455 +0000 UTC m=+0.636147577 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:51:14 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:51:15 np0005538515.localdomain sudo[96675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:51:15 np0005538515.localdomain sudo[96675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:51:15 np0005538515.localdomain sudo[96675]: pam_unix(sudo:session): session closed for user root
Nov 28 08:51:15 np0005538515.localdomain sudo[96690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:51:15 np0005538515.localdomain sudo[96690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:51:16 np0005538515.localdomain sudo[96690]: pam_unix(sudo:session): session closed for user root
Nov 28 08:51:17 np0005538515.localdomain sudo[96736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:51:17 np0005538515.localdomain sudo[96736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:51:17 np0005538515.localdomain sudo[96736]: pam_unix(sudo:session): session closed for user root
Nov 28 08:51:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:51:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:51:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:51:19 np0005538515.localdomain podman[96753]: 2025-11-28 08:51:19.98740558 +0000 UTC m=+0.082830129 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, distribution-scope=public, url=https://www.redhat.com)
Nov 28 08:51:20 np0005538515.localdomain podman[96751]: 2025-11-28 08:51:20.033381624 +0000 UTC m=+0.131985970 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
name=rhosp17/openstack-ovn-controller, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:51:20 np0005538515.localdomain podman[96752]: 2025-11-28 08:51:20.082090213 +0000 UTC m=+0.181539185 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 28 08:51:20 np0005538515.localdomain podman[96751]: 2025-11-28 08:51:20.103052727 +0000 UTC m=+0.201657133 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team)
Nov 28 08:51:20 np0005538515.localdomain podman[96753]: 2025-11-28 08:51:20.10377319 +0000 UTC m=+0.199197739 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, container_name=ovn_metadata_agent, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:51:20 np0005538515.localdomain podman[96753]: unhealthy
Nov 28 08:51:20 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:20 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:51:20 np0005538515.localdomain podman[96751]: unhealthy
Nov 28 08:51:20 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:20 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:51:20 np0005538515.localdomain podman[96752]: 2025-11-28 08:51:20.211434561 +0000 UTC m=+0.310883563 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute)
Nov 28 08:51:20 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:51:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:51:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:51:42 np0005538515.localdomain podman[96817]: 2025-11-28 08:51:42.984977058 +0000 UTC m=+0.086770840 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:51:42 np0005538515.localdomain podman[96817]: 2025-11-28 08:51:42.993007975 +0000 UTC m=+0.094801717 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:51:43 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:51:43 np0005538515.localdomain systemd[1]: tmp-crun.xyESSM.mount: Deactivated successfully.
Nov 28 08:51:43 np0005538515.localdomain podman[96816]: 2025-11-28 08:51:43.036928497 +0000 UTC m=+0.141032229 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:51:43 np0005538515.localdomain podman[96816]: 2025-11-28 08:51:43.207984908 +0000 UTC m=+0.312088590 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:51:43 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:51:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:51:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:51:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:51:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:51:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:51:44 np0005538515.localdomain podman[96866]: 2025-11-28 08:51:44.987540183 +0000 UTC m=+0.090650199 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:51:45 np0005538515.localdomain podman[96867]: 2025-11-28 08:51:45.037783898 +0000 UTC m=+0.138907082 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:51:45 np0005538515.localdomain podman[96866]: 2025-11-28 08:51:45.04661647 +0000 UTC m=+0.149726526 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:51:45 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:51:45 np0005538515.localdomain podman[96865]: 2025-11-28 08:51:45.091631924 +0000 UTC m=+0.197354600 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1)
Nov 28 08:51:45 np0005538515.localdomain podman[96865]: 2025-11-28 08:51:45.100402085 +0000 UTC m=+0.206124761 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=logrotate_crond, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:51:45 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:51:45 np0005538515.localdomain podman[96868]: 2025-11-28 08:51:45.149866965 +0000 UTC m=+0.248150012 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:51:45 np0005538515.localdomain podman[96868]: 2025-11-28 08:51:45.184424869 +0000 UTC m=+0.282707946 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:51:45 np0005538515.localdomain podman[96874]: 2025-11-28 08:51:45.210658235 +0000 UTC m=+0.301983778 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Nov 28 08:51:45 np0005538515.localdomain podman[96867]: 2025-11-28 08:51:45.22381181 +0000 UTC m=+0.324934974 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:51:45 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:51:45 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:51:45 np0005538515.localdomain podman[96874]: 2025-11-28 08:51:45.58894623 +0000 UTC m=+0.680271823 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:51:45 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:51:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:51:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:51:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:51:50 np0005538515.localdomain podman[96981]: 2025-11-28 08:51:50.988164217 +0000 UTC m=+0.089592677 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:51:51 np0005538515.localdomain podman[96981]: 2025-11-28 08:51:51.031718587 +0000 UTC m=+0.133147047 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:51:51 np0005538515.localdomain podman[96981]: unhealthy
Nov 28 08:51:51 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:51 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:51:51 np0005538515.localdomain podman[96982]: 2025-11-28 08:51:51.052214747 +0000 UTC m=+0.151543892 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Nov 28 08:51:51 np0005538515.localdomain podman[96983]: 2025-11-28 08:51:51.103546535 +0000 UTC m=+0.197033981 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, release=1761123044, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 08:51:51 np0005538515.localdomain podman[96982]: 2025-11-28 08:51:51.133644001 +0000 UTC m=+0.232973156 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, container_name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:51:51 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:51:51 np0005538515.localdomain podman[96983]: 2025-11-28 08:51:51.150598633 +0000 UTC m=+0.244085969 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, distribution-scope=public, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-type=git)
Nov 28 08:51:51 np0005538515.localdomain podman[96983]: unhealthy
Nov 28 08:51:51 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:51:51 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:52:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:52:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:52:13 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:52:13 np0005538515.localdomain recover_tripleo_nova_virtqemud[97055]: 62642
Nov 28 08:52:13 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:52:13 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:52:14 np0005538515.localdomain podman[97048]: 2025-11-28 08:52:14.003216244 +0000 UTC m=+0.089969758 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, vcs-type=git)
Nov 28 08:52:14 np0005538515.localdomain systemd[1]: tmp-crun.yHVDz7.mount: Deactivated successfully.
Nov 28 08:52:14 np0005538515.localdomain podman[97047]: 2025-11-28 08:52:14.057173673 +0000 UTC m=+0.148247020 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, release=1761123044, container_name=metrics_qdr, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 08:52:14 np0005538515.localdomain podman[97048]: 2025-11-28 08:52:14.071844624 +0000 UTC m=+0.158598198 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack 
Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=collectd, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:52:14 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:52:14 np0005538515.localdomain podman[97047]: 2025-11-28 08:52:14.270528935 +0000 UTC m=+0.361602292 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, 
container_name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd)
Nov 28 08:52:14 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:52:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:52:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:52:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:52:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:52:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:52:15 np0005538515.localdomain podman[97113]: 2025-11-28 08:52:15.991307101 +0000 UTC m=+0.083084235 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 28 08:52:16 np0005538515.localdomain podman[97099]: 2025-11-28 08:52:15.967141889 +0000 UTC m=+0.074884215 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:52:16 np0005538515.localdomain systemd[1]: tmp-crun.R2bbVm.mount: Deactivated successfully.
Nov 28 08:52:16 np0005538515.localdomain podman[97099]: 2025-11-28 08:52:16.047653405 +0000 UTC m=+0.155395711 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, version=17.1.12)
Nov 28 08:52:16 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:52:16 np0005538515.localdomain podman[97112]: 2025-11-28 08:52:16.139695775 +0000 UTC m=+0.237807244 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:52:16 np0005538515.localdomain podman[97100]: 2025-11-28 08:52:16.05498583 +0000 UTC m=+0.154803292 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, release=1761123044, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Nov 28 08:52:16 np0005538515.localdomain podman[97100]: 2025-11-28 08:52:16.184146393 +0000 UTC m=+0.283963875 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container)
Nov 28 08:52:16 np0005538515.localdomain podman[97112]: 2025-11-28 08:52:16.193894243 +0000 UTC m=+0.292005652 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack 
TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:52:16 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:52:16 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:52:16 np0005538515.localdomain podman[97098]: 2025-11-28 08:52:16.249799793 +0000 UTC m=+0.357925610 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:52:16 np0005538515.localdomain podman[97098]: 2025-11-28 08:52:16.284427727 +0000 UTC m=+0.392553474 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Nov 28 08:52:16 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:52:16 np0005538515.localdomain podman[97113]: 2025-11-28 08:52:16.342427872 +0000 UTC m=+0.434205016 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:52:16 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:52:17 np0005538515.localdomain sudo[97208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:52:17 np0005538515.localdomain sudo[97208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:17 np0005538515.localdomain sudo[97208]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:17 np0005538515.localdomain sudo[97223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 08:52:17 np0005538515.localdomain sudo[97223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:18 np0005538515.localdomain systemd[1]: tmp-crun.iS5MDP.mount: Deactivated successfully.
Nov 28 08:52:18 np0005538515.localdomain podman[97308]: 2025-11-28 08:52:18.411349156 +0000 UTC m=+0.102885235 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True, io.buildah.version=1.33.12, name=rhceph, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph)
Nov 28 08:52:18 np0005538515.localdomain podman[97308]: 2025-11-28 08:52:18.515458829 +0000 UTC m=+0.206994898 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.buildah.version=1.33.12, vcs-type=git, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 08:52:18 np0005538515.localdomain sudo[97223]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:18 np0005538515.localdomain sudo[97378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:52:18 np0005538515.localdomain sudo[97378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:18 np0005538515.localdomain sudo[97378]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:19 np0005538515.localdomain sudo[97393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:52:19 np0005538515.localdomain sudo[97393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:19 np0005538515.localdomain sudo[97393]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:20 np0005538515.localdomain sudo[97439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:52:20 np0005538515.localdomain sudo[97439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:52:20 np0005538515.localdomain sudo[97439]: pam_unix(sudo:session): session closed for user root
Nov 28 08:52:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:52:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:52:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:52:21 np0005538515.localdomain podman[97455]: 2025-11-28 08:52:21.985392315 +0000 UTC m=+0.084721016 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044)
Nov 28 08:52:22 np0005538515.localdomain podman[97454]: 2025-11-28 08:52:22.041256513 +0000 UTC m=+0.141847994 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible)
Nov 28 08:52:22 np0005538515.localdomain podman[97455]: 2025-11-28 08:52:22.093934333 +0000 UTC m=+0.193263004 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:36:58Z, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., container_name=nova_compute)
Nov 28 08:52:22 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:52:22 np0005538515.localdomain podman[97454]: 2025-11-28 08:52:22.110199924 +0000 UTC m=+0.210791395 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:52:22 np0005538515.localdomain podman[97454]: unhealthy
Nov 28 08:52:22 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:22 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:52:22 np0005538515.localdomain podman[97456]: 2025-11-28 08:52:22.12144185 +0000 UTC m=+0.218645416 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, io.openshift.expose-services=)
Nov 28 08:52:22 np0005538515.localdomain podman[97456]: 2025-11-28 08:52:22.212579312 +0000 UTC m=+0.309782828 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:52:22 np0005538515.localdomain podman[97456]: unhealthy
Nov 28 08:52:22 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:22 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:52:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:52:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:52:44 np0005538515.localdomain podman[97519]: 2025-11-28 08:52:44.988033121 +0000 UTC m=+0.095051134 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:52:45 np0005538515.localdomain systemd[1]: tmp-crun.xeJn9w.mount: Deactivated successfully.
Nov 28 08:52:45 np0005538515.localdomain podman[97520]: 2025-11-28 08:52:45.029050383 +0000 UTC m=+0.133581279 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z)
Nov 28 08:52:45 np0005538515.localdomain podman[97520]: 2025-11-28 08:52:45.040443013 +0000 UTC m=+0.144973979 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:52:45 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:52:45 np0005538515.localdomain podman[97519]: 2025-11-28 08:52:45.195837852 +0000 UTC m=+0.302855855 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Nov 28 08:52:45 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:52:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:52:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:52:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:52:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:52:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:52:46 np0005538515.localdomain podman[97577]: 2025-11-28 08:52:46.989034807 +0000 UTC m=+0.077595528 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:52:47 np0005538515.localdomain podman[97569]: 2025-11-28 08:52:47.045414721 +0000 UTC m=+0.143480754 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:52:47 np0005538515.localdomain podman[97571]: 2025-11-28 08:52:47.104982693 +0000 UTC m=+0.197074822 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, vcs-type=git)
Nov 28 08:52:47 np0005538515.localdomain podman[97569]: 2025-11-28 08:52:47.106474249 +0000 UTC m=+0.204540292 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:52:47 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:52:47 np0005538515.localdomain podman[97571]: 2025-11-28 08:52:47.189545663 +0000 UTC m=+0.281637772 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:52:47 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:52:47 np0005538515.localdomain podman[97570]: 2025-11-28 08:52:47.203948546 +0000 UTC m=+0.298024626 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3)
Nov 28 08:52:47 np0005538515.localdomain podman[97568]: 2025-11-28 08:52:47.159286972 +0000 UTC m=+0.260425800 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, container_name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Nov 28 08:52:47 np0005538515.localdomain podman[97568]: 2025-11-28 08:52:47.240280783 +0000 UTC m=+0.341419591 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:52:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:52:47 np0005538515.localdomain podman[97570]: 2025-11-28 08:52:47.290585681 +0000 UTC m=+0.384661751 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, 
tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044)
Nov 28 08:52:47 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:52:47 np0005538515.localdomain podman[97577]: 2025-11-28 08:52:47.353554438 +0000 UTC m=+0.442115189 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:52:47 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:52:47 np0005538515.localdomain systemd[1]: tmp-crun.5Sx5a0.mount: Deactivated successfully.
Nov 28 08:52:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:52:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:52:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:52:52 np0005538515.localdomain podman[97678]: 2025-11-28 08:52:52.997304525 +0000 UTC m=+0.100006566 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Nov 28 08:52:53 np0005538515.localdomain podman[97678]: 2025-11-28 08:52:53.037854383 +0000 UTC m=+0.140556374 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:52:53 np0005538515.localdomain podman[97680]: 2025-11-28 08:52:53.041826014 +0000 UTC m=+0.141475631 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64)
Nov 28 08:52:53 np0005538515.localdomain podman[97678]: unhealthy
Nov 28 08:52:53 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:53 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:52:53 np0005538515.localdomain systemd[1]: tmp-crun.T6iAaa.mount: Deactivated successfully.
Nov 28 08:52:53 np0005538515.localdomain podman[97680]: 2025-11-28 08:52:53.128579703 +0000 UTC m=+0.228229330 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 08:52:53 np0005538515.localdomain podman[97680]: unhealthy
Nov 28 08:52:53 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:52:53 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:52:53 np0005538515.localdomain podman[97679]: 2025-11-28 08:52:53.096298611 +0000 UTC m=+0.198246949 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:52:53 np0005538515.localdomain podman[97679]: 2025-11-28 08:52:53.17627345 +0000 UTC m=+0.278221798 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 28 08:52:53 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:53:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:53:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:53:15 np0005538515.localdomain podman[97743]: 2025-11-28 08:53:15.986617701 +0000 UTC m=+0.092256978 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, container_name=metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:53:16 np0005538515.localdomain podman[97744]: 2025-11-28 08:53:16.03538668 +0000 UTC m=+0.138073707 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Nov 28 08:53:16 np0005538515.localdomain podman[97744]: 2025-11-28 08:53:16.071867372 +0000 UTC m=+0.174554419 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 28 08:53:16 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:53:16 np0005538515.localdomain podman[97743]: 2025-11-28 08:53:16.20152015 +0000 UTC m=+0.307159477 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:53:16 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:53:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:53:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:53:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:53:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:53:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:53:18 np0005538515.localdomain systemd[1]: tmp-crun.UNLLMX.mount: Deactivated successfully.
Nov 28 08:53:18 np0005538515.localdomain podman[97798]: 2025-11-28 08:53:18.060933311 +0000 UTC m=+0.151160180 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, release=1761123044, distribution-scope=public)
Nov 28 08:53:18 np0005538515.localdomain podman[97800]: 2025-11-28 08:53:18.018730773 +0000 UTC m=+0.103169714 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:53:18 np0005538515.localdomain podman[97798]: 2025-11-28 08:53:18.097430113 +0000 UTC m=+0.187657032 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.12)
Nov 28 08:53:18 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:53:18 np0005538515.localdomain podman[97790]: 2025-11-28 08:53:18.113271861 +0000 UTC m=+0.215888571 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Nov 28 08:53:18 np0005538515.localdomain podman[97790]: 2025-11-28 08:53:18.150708882 +0000 UTC m=+0.253325582 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, tcib_managed=true, name=rhosp17/openstack-cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:53:18 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:53:18 np0005538515.localdomain podman[97792]: 2025-11-28 08:53:18.195940133 +0000 UTC m=+0.290141554 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:53:18 np0005538515.localdomain podman[97792]: 2025-11-28 08:53:18.208701786 +0000 UTC m=+0.302903257 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:53:18 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:53:18 np0005538515.localdomain podman[97791]: 2025-11-28 08:53:18.260340814 +0000 UTC m=+0.355004299 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:53:18 np0005538515.localdomain podman[97791]: 2025-11-28 08:53:18.31742657 +0000 UTC m=+0.412090085 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 08:53:18 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:53:18 np0005538515.localdomain podman[97800]: 2025-11-28 08:53:18.375547367 +0000 UTC m=+0.459986358 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_migration_target, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Nov 28 08:53:18 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:53:20 np0005538515.localdomain sudo[97902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:53:20 np0005538515.localdomain sudo[97902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:53:20 np0005538515.localdomain sudo[97902]: pam_unix(sudo:session): session closed for user root
Nov 28 08:53:20 np0005538515.localdomain sudo[97917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:53:20 np0005538515.localdomain sudo[97917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:53:21 np0005538515.localdomain sudo[97917]: pam_unix(sudo:session): session closed for user root
Nov 28 08:53:21 np0005538515.localdomain sudo[97964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:53:21 np0005538515.localdomain sudo[97964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:53:21 np0005538515.localdomain sudo[97964]: pam_unix(sudo:session): session closed for user root
Nov 28 08:53:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:53:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:53:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:53:23 np0005538515.localdomain podman[97979]: 2025-11-28 08:53:23.993950758 +0000 UTC m=+0.096568301 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T23:34:05Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Nov 28 08:53:24 np0005538515.localdomain systemd[1]: tmp-crun.iqQYFR.mount: Deactivated successfully.
Nov 28 08:53:24 np0005538515.localdomain podman[97980]: 2025-11-28 08:53:24.042127779 +0000 UTC m=+0.140554804 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 08:53:24 np0005538515.localdomain podman[97981]: 2025-11-28 08:53:24.100464483 +0000 UTC m=+0.195782072 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
version=17.1.12, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:53:24 np0005538515.localdomain podman[97979]: 2025-11-28 08:53:24.113610188 +0000 UTC m=+0.216227711 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, 
io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64)
Nov 28 08:53:24 np0005538515.localdomain podman[97979]: unhealthy
Nov 28 08:53:24 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:24 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:53:24 np0005538515.localdomain podman[97980]: 2025-11-28 08:53:24.16894058 +0000 UTC m=+0.267367625 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:53:24 np0005538515.localdomain podman[97981]: 2025-11-28 08:53:24.169304841 +0000 UTC m=+0.264622440 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Nov 28 08:53:24 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:53:24 np0005538515.localdomain podman[97981]: unhealthy
Nov 28 08:53:24 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:24 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:53:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:53:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:53:46 np0005538515.localdomain podman[98041]: 2025-11-28 08:53:46.970136847 +0000 UTC m=+0.077643109 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-collectd-container)
Nov 28 08:53:46 np0005538515.localdomain podman[98041]: 2025-11-28 08:53:46.984416386 +0000 UTC m=+0.091922608 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Nov 28 08:53:46 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:53:47 np0005538515.localdomain systemd[1]: tmp-crun.KFTIO3.mount: Deactivated successfully.
Nov 28 08:53:47 np0005538515.localdomain podman[98040]: 2025-11-28 08:53:47.07948915 +0000 UTC m=+0.189948863 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, architecture=x86_64)
Nov 28 08:53:47 np0005538515.localdomain podman[98040]: 2025-11-28 08:53:47.267243104 +0000 UTC m=+0.377702857 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.)
Nov 28 08:53:47 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:53:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:53:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:53:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:53:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:53:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:53:48 np0005538515.localdomain podman[98090]: 2025-11-28 08:53:48.974176735 +0000 UTC m=+0.083394396 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Nov 28 08:53:48 np0005538515.localdomain podman[98090]: 2025-11-28 08:53:48.982098369 +0000 UTC m=+0.091315950 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:53:48 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:53:49 np0005538515.localdomain systemd[1]: tmp-crun.QxwICa.mount: Deactivated successfully.
Nov 28 08:53:49 np0005538515.localdomain podman[98091]: 2025-11-28 08:53:49.038706769 +0000 UTC m=+0.143472793 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Nov 28 08:53:49 np0005538515.localdomain podman[98092]: 2025-11-28 08:53:49.078338488 +0000 UTC m=+0.180621125 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Nov 28 08:53:49 np0005538515.localdomain podman[98092]: 2025-11-28 08:53:49.086477499 +0000 UTC m=+0.188760206 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Nov 28 08:53:49 np0005538515.localdomain podman[98091]: 2025-11-28 08:53:49.093435013 +0000 UTC m=+0.198200967 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1)
Nov 28 08:53:49 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:53:49 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:53:49 np0005538515.localdomain podman[98102]: 2025-11-28 08:53:49.136108475 +0000 UTC m=+0.230423758 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Nov 28 08:53:49 np0005538515.localdomain podman[98098]: 2025-11-28 08:53:49.19575389 +0000 UTC m=+0.294622233 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:53:49 np0005538515.localdomain podman[98098]: 2025-11-28 08:53:49.221029057 +0000 UTC m=+0.319897380 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1)
Nov 28 08:53:49 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:53:49 np0005538515.localdomain podman[98102]: 2025-11-28 08:53:49.558454976 +0000 UTC m=+0.652770278 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 08:53:49 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:53:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:53:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:53:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:53:54 np0005538515.localdomain podman[98206]: 2025-11-28 08:53:54.987282844 +0000 UTC m=+0.090866427 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute)
Nov 28 08:53:55 np0005538515.localdomain systemd[1]: tmp-crun.mcD4Ua.mount: Deactivated successfully.
Nov 28 08:53:55 np0005538515.localdomain podman[98206]: 2025-11-28 08:53:55.04572594 +0000 UTC m=+0.149309523 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, container_name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:53:55 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:53:55 np0005538515.localdomain podman[98207]: 2025-11-28 08:53:55.098210985 +0000 UTC m=+0.200506618 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z)
Nov 28 08:53:55 np0005538515.localdomain podman[98205]: 2025-11-28 08:53:55.051340933 +0000 UTC m=+0.155842404 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4)
Nov 28 08:53:55 np0005538515.localdomain podman[98205]: 2025-11-28 08:53:55.136050109 +0000 UTC m=+0.240551570 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Nov 28 08:53:55 np0005538515.localdomain podman[98205]: unhealthy
Nov 28 08:53:55 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:55 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:53:55 np0005538515.localdomain podman[98207]: 2025-11-28 08:53:55.189971047 +0000 UTC m=+0.292266680 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent)
Nov 28 08:53:55 np0005538515.localdomain podman[98207]: unhealthy
Nov 28 08:53:55 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:53:55 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:54:13 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:54:13 np0005538515.localdomain recover_tripleo_nova_virtqemud[98274]: 62642
Nov 28 08:54:13 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:54:13 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:54:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:54:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:54:18 np0005538515.localdomain podman[98276]: 2025-11-28 08:54:18.009101058 +0000 UTC m=+0.116928387 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:54:18 np0005538515.localdomain podman[98276]: 2025-11-28 08:54:18.017558269 +0000 UTC m=+0.125385628 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, io.openshift.expose-services=)
Nov 28 08:54:18 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:54:18 np0005538515.localdomain podman[98275]: 2025-11-28 08:54:18.063503831 +0000 UTC m=+0.171105563 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Nov 28 08:54:18 np0005538515.localdomain podman[98275]: 2025-11-28 08:54:18.27968547 +0000 UTC m=+0.387287182 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team)
Nov 28 08:54:18 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:54:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:54:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:54:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:54:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:54:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:54:19 np0005538515.localdomain systemd[1]: tmp-crun.mQTKrY.mount: Deactivated successfully.
Nov 28 08:54:19 np0005538515.localdomain podman[98325]: 2025-11-28 08:54:19.986299241 +0000 UTC m=+0.085420367 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4)
Nov 28 08:54:20 np0005538515.localdomain podman[98325]: 2025-11-28 08:54:20.030312935 +0000 UTC m=+0.129434071 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi)
Nov 28 08:54:20 np0005538515.localdomain podman[98324]: 2025-11-28 08:54:20.03795217 +0000 UTC m=+0.143291807 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:54:20 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:54:20 np0005538515.localdomain podman[98332]: 2025-11-28 08:54:20.046221005 +0000 UTC m=+0.135942453 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:54:20 np0005538515.localdomain podman[98334]: 2025-11-28 08:54:19.968141183 +0000 UTC m=+0.061581065 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:54:20 np0005538515.localdomain podman[98326]: 2025-11-28 08:54:20.093272352 +0000 UTC m=+0.190045426 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vcs-type=git, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:54:20 np0005538515.localdomain podman[98324]: 2025-11-28 08:54:20.101957009 +0000 UTC m=+0.207296626 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, container_name=logrotate_crond, tcib_managed=true, name=rhosp17/openstack-cron, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 28 08:54:20 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:54:20 np0005538515.localdomain podman[98326]: 2025-11-28 08:54:20.128578277 +0000 UTC m=+0.225351301 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, container_name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:54:20 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:54:20 np0005538515.localdomain podman[98332]: 2025-11-28 08:54:20.17971876 +0000 UTC m=+0.269440198 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, vcs-type=git, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team)
Nov 28 08:54:20 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:54:20 np0005538515.localdomain podman[98334]: 2025-11-28 08:54:20.301536818 +0000 UTC m=+0.394976760 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute)
Nov 28 08:54:20 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:54:22 np0005538515.localdomain sudo[98434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:54:22 np0005538515.localdomain sudo[98434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:54:22 np0005538515.localdomain sudo[98434]: pam_unix(sudo:session): session closed for user root
Nov 28 08:54:22 np0005538515.localdomain sudo[98449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:54:22 np0005538515.localdomain sudo[98449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:54:22 np0005538515.localdomain sudo[98449]: pam_unix(sudo:session): session closed for user root
Nov 28 08:54:23 np0005538515.localdomain sudo[98496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:54:23 np0005538515.localdomain sudo[98496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:54:23 np0005538515.localdomain sudo[98496]: pam_unix(sudo:session): session closed for user root
Nov 28 08:54:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:54:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:54:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:54:25 np0005538515.localdomain systemd[1]: tmp-crun.tXS3NH.mount: Deactivated successfully.
Nov 28 08:54:26 np0005538515.localdomain podman[98513]: 2025-11-28 08:54:26.016336861 +0000 UTC m=+0.108322413 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, batch=17.1_20251118.1)
Nov 28 08:54:26 np0005538515.localdomain podman[98511]: 2025-11-28 08:54:26.032819198 +0000 UTC m=+0.133334272 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 08:54:26 np0005538515.localdomain podman[98512]: 2025-11-28 08:54:25.985981787 +0000 UTC m=+0.087456770 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:54:26 np0005538515.localdomain podman[98511]: 2025-11-28 08:54:26.04360176 +0000 UTC m=+0.144116904 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Nov 28 08:54:26 np0005538515.localdomain podman[98511]: unhealthy
Nov 28 08:54:26 np0005538515.localdomain podman[98513]: 2025-11-28 08:54:26.055212107 +0000 UTC m=+0.147197649 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, vcs-type=git, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z)
Nov 28 08:54:26 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:26 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:54:26 np0005538515.localdomain podman[98513]: unhealthy
Nov 28 08:54:26 np0005538515.localdomain podman[98512]: 2025-11-28 08:54:26.066142463 +0000 UTC m=+0.167617476 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64)
Nov 28 08:54:26 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:26 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:54:26 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:54:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:54:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:54:48 np0005538515.localdomain podman[98574]: 2025-11-28 08:54:48.969024999 +0000 UTC m=+0.073177431 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, version=17.1.12)
Nov 28 08:54:48 np0005538515.localdomain podman[98574]: 2025-11-28 08:54:48.978629655 +0000 UTC m=+0.082782107 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:54:48 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:54:49 np0005538515.localdomain podman[98573]: 2025-11-28 08:54:49.071807481 +0000 UTC m=+0.183283839 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd)
Nov 28 08:54:49 np0005538515.localdomain podman[98573]: 2025-11-28 08:54:49.265955212 +0000 UTC m=+0.377431610 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:54:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:54:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:54:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:54:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:54:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:54:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:54:50 np0005538515.localdomain podman[98625]: 2025-11-28 08:54:50.987990418 +0000 UTC m=+0.089054711 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:54:51 np0005538515.localdomain podman[98626]: 2025-11-28 08:54:51.042103662 +0000 UTC m=+0.139721499 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 08:54:51 np0005538515.localdomain podman[98625]: 2025-11-28 08:54:51.04759069 +0000 UTC m=+0.148654953 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:54:51 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:54:51 np0005538515.localdomain podman[98622]: 2025-11-28 08:54:51.100390445 +0000 UTC m=+0.202191100 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 08:54:51 np0005538515.localdomain podman[98622]: 2025-11-28 08:54:51.141572682 +0000 UTC m=+0.243373357 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, 
config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:54:51 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:54:51 np0005538515.localdomain podman[98624]: 2025-11-28 08:54:51.202498405 +0000 UTC m=+0.304818126 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:54:51 np0005538515.localdomain podman[98623]: 2025-11-28 08:54:51.173437151 +0000 UTC m=+0.279377024 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:54:51 np0005538515.localdomain podman[98624]: 2025-11-28 08:54:51.243555338 +0000 UTC m=+0.345875019 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
release=1761123044, tcib_managed=true, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:54:51 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:54:51 np0005538515.localdomain podman[98623]: 2025-11-28 08:54:51.256399993 +0000 UTC m=+0.362339836 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=)
Nov 28 08:54:51 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:54:51 np0005538515.localdomain podman[98626]: 2025-11-28 08:54:51.443666423 +0000 UTC m=+0.541284260 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true)
Nov 28 08:54:51 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:54:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:54:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:54:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:54:56 np0005538515.localdomain systemd[1]: tmp-crun.nX0E7M.mount: Deactivated successfully.
Nov 28 08:54:56 np0005538515.localdomain podman[98732]: 2025-11-28 08:54:56.992606795 +0000 UTC m=+0.091308979 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:54:57 np0005538515.localdomain podman[98732]: 2025-11-28 08:54:57.03957735 +0000 UTC m=+0.138279524 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:54:57 np0005538515.localdomain podman[98732]: unhealthy
Nov 28 08:54:57 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:57 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:54:57 np0005538515.localdomain podman[98734]: 2025-11-28 08:54:57.056963484 +0000 UTC m=+0.151014325 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:54:57 np0005538515.localdomain podman[98734]: 2025-11-28 08:54:57.107007114 +0000 UTC m=+0.201057925 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 08:54:57 np0005538515.localdomain podman[98734]: unhealthy
Nov 28 08:54:57 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:54:57 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:54:57 np0005538515.localdomain podman[98733]: 2025-11-28 08:54:57.111164282 +0000 UTC m=+0.206359358 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:54:57 np0005538515.localdomain podman[98733]: 2025-11-28 08:54:57.19564096 +0000 UTC m=+0.290836056 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Nov 28 08:54:57 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:55:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:55:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:55:19 np0005538515.localdomain podman[98798]: 2025-11-28 08:55:19.990627657 +0000 UTC m=+0.077195456 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, vcs-type=git, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 08:55:20 np0005538515.localdomain podman[98799]: 2025-11-28 08:55:20.04924399 +0000 UTC m=+0.132923750 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:55:20 np0005538515.localdomain podman[98799]: 2025-11-28 08:55:20.059373511 +0000 UTC m=+0.143053241 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:55:20 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:55:20 np0005538515.localdomain podman[98798]: 2025-11-28 08:55:20.208232079 +0000 UTC m=+0.294799938 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, 
name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Nov 28 08:55:20 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:55:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:55:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:55:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:55:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:55:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:55:21 np0005538515.localdomain podman[98848]: 2025-11-28 08:55:21.981454339 +0000 UTC m=+0.087326367 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: tmp-crun.rkGrsR.mount: Deactivated successfully.
Nov 28 08:55:22 np0005538515.localdomain podman[98849]: 2025-11-28 08:55:22.02665906 +0000 UTC m=+0.128842103 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Nov 28 08:55:22 np0005538515.localdomain podman[98850]: 2025-11-28 08:55:22.053015171 +0000 UTC m=+0.150191441 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, tcib_managed=true, io.openshift.expose-services=)
Nov 28 08:55:22 np0005538515.localdomain podman[98849]: 2025-11-28 08:55:22.084686875 +0000 UTC m=+0.186869928 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git)
Nov 28 08:55:22 np0005538515.localdomain podman[98850]: 2025-11-28 08:55:22.085288834 +0000 UTC m=+0.182465044 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-iscsid-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, tcib_managed=true)
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:55:22 np0005538515.localdomain podman[98851]: 2025-11-28 08:55:22.096188288 +0000 UTC m=+0.189610582 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-type=git)
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:55:22 np0005538515.localdomain podman[98857]: 2025-11-28 08:55:22.15183878 +0000 UTC m=+0.243830031 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com)
Nov 28 08:55:22 np0005538515.localdomain podman[98851]: 2025-11-28 08:55:22.159528427 +0000 UTC m=+0.252950771 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:55:22 np0005538515.localdomain podman[98848]: 2025-11-28 08:55:22.166446349 +0000 UTC m=+0.272318397 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, distribution-scope=public, release=1761123044, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64)
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:55:22 np0005538515.localdomain podman[98857]: 2025-11-28 08:55:22.522391958 +0000 UTC m=+0.614383119 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com)
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:55:22 np0005538515.localdomain systemd[1]: tmp-crun.5xsfxU.mount: Deactivated successfully.
Nov 28 08:55:23 np0005538515.localdomain sudo[98958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:55:23 np0005538515.localdomain sudo[98958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:55:23 np0005538515.localdomain sudo[98958]: pam_unix(sudo:session): session closed for user root
Nov 28 08:55:23 np0005538515.localdomain sudo[98973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:55:23 np0005538515.localdomain sudo[98973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:55:24 np0005538515.localdomain sudo[98973]: pam_unix(sudo:session): session closed for user root
Nov 28 08:55:25 np0005538515.localdomain sudo[99020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:55:25 np0005538515.localdomain sudo[99020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:55:25 np0005538515.localdomain sudo[99020]: pam_unix(sudo:session): session closed for user root
Nov 28 08:55:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:55:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:55:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:55:27 np0005538515.localdomain podman[99036]: 2025-11-28 08:55:27.983718664 +0000 UTC m=+0.086696637 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, container_name=nova_compute, release=1761123044, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5)
Nov 28 08:55:28 np0005538515.localdomain podman[99035]: 2025-11-28 08:55:28.035705744 +0000 UTC m=+0.139497862 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:55:28 np0005538515.localdomain podman[99036]: 2025-11-28 08:55:28.046671101 +0000 UTC m=+0.149649044 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12)
Nov 28 08:55:28 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:55:28 np0005538515.localdomain podman[99035]: 2025-11-28 08:55:28.079937643 +0000 UTC m=+0.183729741 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64)
Nov 28 08:55:28 np0005538515.localdomain podman[99035]: unhealthy
Nov 28 08:55:28 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:55:28 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:55:28 np0005538515.localdomain podman[99037]: 2025-11-28 08:55:28.096381689 +0000 UTC m=+0.194185303 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4)
Nov 28 08:55:28 np0005538515.localdomain podman[99037]: 2025-11-28 08:55:28.114421695 +0000 UTC m=+0.212225259 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Nov 28 08:55:28 np0005538515.localdomain podman[99037]: unhealthy
Nov 28 08:55:28 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:55:28 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:55:28 np0005538515.localdomain systemd[1]: tmp-crun.ZIoVEc.mount: Deactivated successfully.
Nov 28 08:55:30 np0005538515.localdomain sshd[99098]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:55:30 np0005538515.localdomain sshd[99098]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 08:55:30 np0005538515.localdomain sshd[99098]: Connection closed by 134.209.154.70 port 44080
Nov 28 08:55:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:55:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:55:50 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:55:50 np0005538515.localdomain recover_tripleo_nova_virtqemud[99102]: 62642
Nov 28 08:55:50 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:55:50 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:55:50 np0005538515.localdomain podman[99099]: 2025-11-28 08:55:50.993709115 +0000 UTC m=+0.089557836 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:55:51 np0005538515.localdomain systemd[1]: tmp-crun.vrZF4I.mount: Deactivated successfully.
Nov 28 08:55:51 np0005538515.localdomain podman[99100]: 2025-11-28 08:55:51.048261342 +0000 UTC m=+0.145266039 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, version=17.1.12, io.openshift.expose-services=, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Nov 28 08:55:51 np0005538515.localdomain podman[99100]: 2025-11-28 08:55:51.083412274 +0000 UTC m=+0.180417021 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, vcs-type=git, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:55:51 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:55:51 np0005538515.localdomain podman[99099]: 2025-11-28 08:55:51.187728082 +0000 UTC m=+0.283576833 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:55:51 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:55:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:55:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:55:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:55:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:55:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:55:52 np0005538515.localdomain systemd[1]: tmp-crun.tQrJmJ.mount: Deactivated successfully.
Nov 28 08:55:53 np0005538515.localdomain podman[99152]: 2025-11-28 08:55:52.999175968 +0000 UTC m=+0.096617173 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, 
name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Nov 28 08:55:53 np0005538515.localdomain podman[99152]: 2025-11-28 08:55:53.034400451 +0000 UTC m=+0.131841476 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4)
Nov 28 08:55:53 np0005538515.localdomain systemd[1]: tmp-crun.LJfw7Z.mount: Deactivated successfully.
Nov 28 08:55:53 np0005538515.localdomain podman[99158]: 2025-11-28 08:55:53.04930378 +0000 UTC m=+0.142259027 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=)
Nov 28 08:55:53 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:55:53 np0005538515.localdomain podman[99151]: 2025-11-28 08:55:53.091461687 +0000 UTC m=+0.189043427 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 08:55:53 np0005538515.localdomain podman[99151]: 2025-11-28 08:55:53.104388584 +0000 UTC m=+0.201970314 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid)
Nov 28 08:55:53 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:55:53 np0005538515.localdomain podman[99150]: 2025-11-28 08:55:53.133215481 +0000 UTC m=+0.234690110 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:55:53 np0005538515.localdomain podman[99149]: 2025-11-28 08:55:53.179607207 +0000 UTC m=+0.284590954 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 08:55:53 np0005538515.localdomain podman[99150]: 2025-11-28 08:55:53.207791304 +0000 UTC m=+0.309265963 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 08:55:53 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:55:53 np0005538515.localdomain podman[99149]: 2025-11-28 08:55:53.263105116 +0000 UTC m=+0.368088923 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Nov 28 08:55:53 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:55:53 np0005538515.localdomain podman[99158]: 2025-11-28 08:55:53.365561186 +0000 UTC m=+0.458516423 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 08:55:53 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:55:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:55:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:55:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:55:58 np0005538515.localdomain podman[99260]: 2025-11-28 08:55:58.977659103 +0000 UTC m=+0.085185562 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:55:58 np0005538515.localdomain podman[99260]: 2025-11-28 08:55:58.99642201 +0000 UTC m=+0.103948429 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:55:59 np0005538515.localdomain podman[99260]: unhealthy
Nov 28 08:55:59 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:55:59 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:55:59 np0005538515.localdomain podman[99262]: 2025-11-28 08:55:59.073515131 +0000 UTC m=+0.174517518 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:55:59 np0005538515.localdomain podman[99262]: 2025-11-28 08:55:59.087559223 +0000 UTC m=+0.188561670 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, tcib_managed=true)
Nov 28 08:55:59 np0005538515.localdomain podman[99262]: unhealthy
Nov 28 08:55:59 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:55:59 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:55:59 np0005538515.localdomain podman[99261]: 2025-11-28 08:55:59.137430647 +0000 UTC m=+0.241021734 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container)
Nov 28 08:55:59 np0005538515.localdomain podman[99261]: 2025-11-28 08:55:59.190054625 +0000 UTC m=+0.293645672 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 28 08:55:59 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:56:15 np0005538515.localdomain sshd[36778]: Received disconnect from 192.168.122.100 port 40252:11: disconnected by user
Nov 28 08:56:15 np0005538515.localdomain sshd[36778]: Disconnected from user tripleo-admin 192.168.122.100 port 40252
Nov 28 08:56:15 np0005538515.localdomain sshd[36760]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 28 08:56:15 np0005538515.localdomain systemd-logind[763]: Session 28 logged out. Waiting for processes to exit.
Nov 28 08:56:15 np0005538515.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Nov 28 08:56:15 np0005538515.localdomain systemd[1]: session-28.scope: Consumed 7min 12.106s CPU time.
Nov 28 08:56:15 np0005538515.localdomain systemd-logind[763]: Removed session 28.
Nov 28 08:56:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:56:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:56:21 np0005538515.localdomain podman[99326]: 2025-11-28 08:56:21.984867717 +0000 UTC m=+0.090897257 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:56:22 np0005538515.localdomain systemd[1]: tmp-crun.8uG734.mount: Deactivated successfully.
Nov 28 08:56:22 np0005538515.localdomain podman[99327]: 2025-11-28 08:56:22.037401822 +0000 UTC m=+0.139761220 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:56:22 np0005538515.localdomain podman[99327]: 2025-11-28 08:56:22.047323777 +0000 UTC m=+0.149683165 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 
17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4)
Nov 28 08:56:22 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:56:22 np0005538515.localdomain podman[99326]: 2025-11-28 08:56:22.224383763 +0000 UTC m=+0.330413373 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1)
Nov 28 08:56:22 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:56:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:56:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:56:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:56:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:56:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:56:23 np0005538515.localdomain podman[99376]: 2025-11-28 08:56:23.98159264 +0000 UTC m=+0.079838036 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12)
Nov 28 08:56:23 np0005538515.localdomain podman[99378]: 2025-11-28 08:56:23.998963425 +0000 UTC m=+0.087520533 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4)
Nov 28 08:56:24 np0005538515.localdomain podman[99376]: 2025-11-28 08:56:24.009392745 +0000 UTC m=+0.107638142 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 08:56:24 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:56:24 np0005538515.localdomain podman[99389]: 2025-11-28 08:56:24.111887508 +0000 UTC m=+0.195014359 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:56:24 np0005538515.localdomain podman[99378]: 2025-11-28 08:56:24.131145271 +0000 UTC m=+0.219702419 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute)
Nov 28 08:56:24 np0005538515.localdomain podman[99375]: 2025-11-28 08:56:24.079254084 +0000 UTC m=+0.178003725 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:56:24 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:56:24 np0005538515.localdomain podman[99377]: 2025-11-28 08:56:24.061139928 +0000 UTC m=+0.151603854 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, 
com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, url=https://www.redhat.com, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.expose-services=)
Nov 28 08:56:24 np0005538515.localdomain podman[99377]: 2025-11-28 08:56:24.195529631 +0000 UTC m=+0.285993557 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, version=17.1.12, com.redhat.component=openstack-iscsid-container, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z)
Nov 28 08:56:24 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:56:24 np0005538515.localdomain podman[99375]: 2025-11-28 08:56:24.212610516 +0000 UTC m=+0.311360157 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, tcib_managed=true, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, io.openshift.expose-services=, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Nov 28 08:56:24 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:56:24 np0005538515.localdomain podman[99389]: 2025-11-28 08:56:24.512579043 +0000 UTC m=+0.595705894 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com)
Nov 28 08:56:24 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:56:25 np0005538515.localdomain sudo[99488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:56:25 np0005538515.localdomain sudo[99488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:56:25 np0005538515.localdomain sudo[99488]: pam_unix(sudo:session): session closed for user root
Nov 28 08:56:25 np0005538515.localdomain sudo[99503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:56:25 np0005538515.localdomain sudo[99503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:56:25 np0005538515.localdomain sudo[99503]: pam_unix(sudo:session): session closed for user root
Nov 28 08:56:25 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Activating special unit Exit the Session...
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Removed slice User Background Tasks Slice.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped target Main User Target.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped target Basic System.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped target Paths.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped target Sockets.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped target Timers.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Closed D-Bus User Message Bus Socket.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Stopped Create User's Volatile Files and Directories.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Removed slice User Application Slice.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Reached target Shutdown.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Finished Exit the Session.
Nov 28 08:56:25 np0005538515.localdomain systemd[36764]: Reached target Exit the Session.
Nov 28 08:56:25 np0005538515.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 08:56:25 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 28 08:56:25 np0005538515.localdomain systemd[1]: user@1003.service: Consumed 4.603s CPU time, read 0B from disk, written 7.0K to disk.
Nov 28 08:56:26 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 08:56:26 np0005538515.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 28 08:56:26 np0005538515.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 28 08:56:26 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 28 08:56:26 np0005538515.localdomain sudo[99553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:56:26 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 28 08:56:26 np0005538515.localdomain systemd[1]: user-1003.slice: Consumed 7min 16.736s CPU time.
Nov 28 08:56:26 np0005538515.localdomain sudo[99553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:56:26 np0005538515.localdomain sudo[99553]: pam_unix(sudo:session): session closed for user root
Nov 28 08:56:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:56:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:56:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:56:29 np0005538515.localdomain podman[99568]: 2025-11-28 08:56:29.972667061 +0000 UTC m=+0.080089754 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 08:56:29 np0005538515.localdomain podman[99568]: 2025-11-28 08:56:29.99246961 +0000 UTC m=+0.099892313 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, container_name=ovn_controller)
Nov 28 08:56:29 np0005538515.localdomain podman[99568]: unhealthy
Nov 28 08:56:30 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:56:30 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:56:30 np0005538515.localdomain podman[99569]: 2025-11-28 08:56:30.082499129 +0000 UTC m=+0.186776446 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, tcib_managed=true)
Nov 28 08:56:30 np0005538515.localdomain podman[99569]: 2025-11-28 08:56:30.114477754 +0000 UTC m=+0.218755131 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step5, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:56:30 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:56:30 np0005538515.localdomain podman[99570]: 2025-11-28 08:56:30.13419981 +0000 UTC m=+0.234859875 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 08:56:30 np0005538515.localdomain podman[99570]: 2025-11-28 08:56:30.154383231 +0000 UTC m=+0.255043306 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Nov 28 08:56:30 np0005538515.localdomain podman[99570]: unhealthy
Nov 28 08:56:30 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:56:30 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:56:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:56:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:56:52 np0005538515.localdomain podman[99631]: 2025-11-28 08:56:52.974920744 +0000 UTC m=+0.082103226 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:56:53 np0005538515.localdomain podman[99632]: 2025-11-28 08:56:53.029708158 +0000 UTC m=+0.128806282 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:56:53 np0005538515.localdomain podman[99632]: 2025-11-28 08:56:53.043394009 +0000 UTC m=+0.142492113 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd)
Nov 28 08:56:53 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:56:53 np0005538515.localdomain podman[99631]: 2025-11-28 08:56:53.164494305 +0000 UTC m=+0.271676787 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:56:53 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:56:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:56:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:56:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:56:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:56:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:56:54 np0005538515.localdomain podman[99681]: 2025-11-28 08:56:54.983773422 +0000 UTC m=+0.084381906 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Nov 28 08:56:54 np0005538515.localdomain podman[99681]: 2025-11-28 08:56:54.995386069 +0000 UTC m=+0.095994613 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64)
Nov 28 08:56:55 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:56:55 np0005538515.localdomain podman[99685]: 2025-11-28 08:56:55.047620536 +0000 UTC m=+0.137661235 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:56:55 np0005538515.localdomain podman[99682]: 2025-11-28 08:56:55.094650823 +0000 UTC m=+0.188988544 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:56:55 np0005538515.localdomain podman[99685]: 2025-11-28 08:56:55.103414832 +0000 UTC m=+0.193455491 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:56:55 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:56:55 np0005538515.localdomain podman[99682]: 2025-11-28 08:56:55.14954097 +0000 UTC m=+0.243878761 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:56:55 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:56:55 np0005538515.localdomain podman[99690]: 2025-11-28 08:56:55.201780037 +0000 UTC m=+0.287944217 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public)
Nov 28 08:56:55 np0005538515.localdomain podman[99683]: 2025-11-28 08:56:55.155810123 +0000 UTC m=+0.247835643 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 08:56:55 np0005538515.localdomain podman[99683]: 2025-11-28 08:56:55.240734385 +0000 UTC m=+0.332759865 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=)
Nov 28 08:56:55 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:56:55 np0005538515.localdomain podman[99690]: 2025-11-28 08:56:55.578755891 +0000 UTC m=+0.664920091 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:56:55 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:56:57 np0005538515.localdomain sshd[99794]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 08:56:57 np0005538515.localdomain sshd[99794]: Connection closed by 142.93.252.174 port 57980 [preauth]
Nov 28 08:57:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:57:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:57:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:57:00 np0005538515.localdomain podman[99796]: 2025-11-28 08:57:00.975843034 +0000 UTC m=+0.082176628 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 08:57:00 np0005538515.localdomain podman[99796]: 2025-11-28 08:57:00.991369292 +0000 UTC m=+0.097702886 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:34:05Z, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:57:00 np0005538515.localdomain podman[99796]: unhealthy
Nov 28 08:57:01 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:01 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:57:01 np0005538515.localdomain podman[99797]: 2025-11-28 08:57:01.081171983 +0000 UTC m=+0.185648691 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:57:01 np0005538515.localdomain podman[99798]: 2025-11-28 08:57:01.136004119 +0000 UTC m=+0.236573976 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Nov 28 08:57:01 np0005538515.localdomain podman[99798]: 2025-11-28 08:57:01.155451447 +0000 UTC m=+0.256021274 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Nov 28 08:57:01 np0005538515.localdomain podman[99798]: unhealthy
Nov 28 08:57:01 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:01 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:57:01 np0005538515.localdomain podman[99797]: 2025-11-28 08:57:01.188509684 +0000 UTC m=+0.292986402 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:57:01 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:57:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:57:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:57:23 np0005538515.localdomain podman[99863]: 2025-11-28 08:57:23.996145588 +0000 UTC m=+0.096832928 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64)
Nov 28 08:57:24 np0005538515.localdomain podman[99864]: 2025-11-28 08:57:24.051114359 +0000 UTC m=+0.151824950 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z)
Nov 28 08:57:24 np0005538515.localdomain podman[99864]: 2025-11-28 08:57:24.061736966 +0000 UTC m=+0.162447547 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 08:57:24 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:57:24 np0005538515.localdomain podman[99863]: 2025-11-28 08:57:24.19940559 +0000 UTC m=+0.300092900 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:57:24 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:57:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:57:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:57:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:57:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:57:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:57:25 np0005538515.localdomain podman[99912]: 2025-11-28 08:57:25.988330904 +0000 UTC m=+0.090354110 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, tcib_managed=true)
Nov 28 08:57:26 np0005538515.localdomain podman[99913]: 2025-11-28 08:57:26.091706152 +0000 UTC m=+0.192848531 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, 
io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Nov 28 08:57:26 np0005538515.localdomain podman[99913]: 2025-11-28 08:57:26.098200943 +0000 UTC m=+0.199343342 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, 
vendor=Red Hat, Inc., container_name=iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12)
Nov 28 08:57:26 np0005538515.localdomain podman[99918]: 2025-11-28 08:57:26.058671997 +0000 UTC m=+0.153371898 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute)
Nov 28 08:57:26 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:57:26 np0005538515.localdomain podman[99912]: 2025-11-28 08:57:26.117432625 +0000 UTC m=+0.219455891 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 08:57:26 np0005538515.localdomain podman[99911]: 2025-11-28 08:57:26.026218259 +0000 UTC m=+0.135624142 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 08:57:26 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:57:26 np0005538515.localdomain podman[99918]: 2025-11-28 08:57:26.144422694 +0000 UTC m=+0.239122525 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:57:26 np0005538515.localdomain podman[99920]: 2025-11-28 08:57:26.101264967 +0000 UTC m=+0.192724828 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 08:57:26 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:57:26 np0005538515.localdomain podman[99911]: 2025-11-28 08:57:26.158019972 +0000 UTC m=+0.267425835 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible)
Nov 28 08:57:26 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:57:26 np0005538515.localdomain podman[99920]: 2025-11-28 08:57:26.449363833 +0000 UTC m=+0.540823714 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target)
Nov 28 08:57:26 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:57:26 np0005538515.localdomain sudo[100023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:57:26 np0005538515.localdomain sudo[100023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:57:26 np0005538515.localdomain sudo[100023]: pam_unix(sudo:session): session closed for user root
Nov 28 08:57:26 np0005538515.localdomain sudo[100038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:57:26 np0005538515.localdomain sudo[100038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:57:27 np0005538515.localdomain sudo[100038]: pam_unix(sudo:session): session closed for user root
Nov 28 08:57:28 np0005538515.localdomain sudo[100086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:57:28 np0005538515.localdomain sudo[100086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:57:28 np0005538515.localdomain sudo[100086]: pam_unix(sudo:session): session closed for user root
Nov 28 08:57:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:57:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:57:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:57:31 np0005538515.localdomain podman[100102]: 2025-11-28 08:57:31.973822322 +0000 UTC m=+0.074969167 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step5, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z)
Nov 28 08:57:32 np0005538515.localdomain podman[100103]: 2025-11-28 08:57:32.033812968 +0000 UTC m=+0.134192479 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64)
Nov 28 08:57:32 np0005538515.localdomain podman[100102]: 2025-11-28 08:57:32.050266163 +0000 UTC m=+0.151413008 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 08:57:32 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:57:32 np0005538515.localdomain podman[100103]: 2025-11-28 08:57:32.070273149 +0000 UTC m=+0.170652670 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent)
Nov 28 08:57:32 np0005538515.localdomain podman[100103]: unhealthy
Nov 28 08:57:32 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:32 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:57:32 np0005538515.localdomain podman[100101]: 2025-11-28 08:57:31.954753095 +0000 UTC m=+0.062190243 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, 
container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:57:32 np0005538515.localdomain podman[100101]: 2025-11-28 08:57:32.135598998 +0000 UTC m=+0.243036166 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z)
Nov 28 08:57:32 np0005538515.localdomain podman[100101]: unhealthy
Nov 28 08:57:32 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:57:32 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:57:53 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:57:53 np0005538515.localdomain recover_tripleo_nova_virtqemud[100166]: 62642
Nov 28 08:57:53 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:57:53 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:57:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:57:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:57:54 np0005538515.localdomain podman[100167]: 2025-11-28 08:57:54.980999047 +0000 UTC m=+0.087588435 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, 
com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 08:57:55 np0005538515.localdomain podman[100168]: 2025-11-28 08:57:55.039544027 +0000 UTC m=+0.143679549 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, url=https://www.redhat.com)
Nov 28 08:57:55 np0005538515.localdomain podman[100168]: 2025-11-28 08:57:55.049344028 +0000 UTC m=+0.153479570 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd)
Nov 28 08:57:55 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:57:55 np0005538515.localdomain podman[100167]: 2025-11-28 08:57:55.184504616 +0000 UTC m=+0.291093984 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, container_name=metrics_qdr, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:57:55 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:57:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:57:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:57:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:57:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:57:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:57:57 np0005538515.localdomain podman[100218]: 2025-11-28 08:57:57.022995744 +0000 UTC m=+0.123874761 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron, url=https://www.redhat.com, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12)
Nov 28 08:57:57 np0005538515.localdomain systemd[1]: tmp-crun.RKopZg.mount: Deactivated successfully.
Nov 28 08:57:57 np0005538515.localdomain podman[100220]: 2025-11-28 08:57:57.040502702 +0000 UTC m=+0.134923391 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Nov 28 08:57:57 np0005538515.localdomain podman[100218]: 2025-11-28 08:57:57.061738465 +0000 UTC m=+0.162617442 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:57:57 np0005538515.localdomain podman[100227]: 2025-11-28 08:57:57.108157273 +0000 UTC m=+0.195875986 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:57:57 np0005538515.localdomain podman[100219]: 2025-11-28 08:57:57.144924233 +0000 UTC m=+0.242703505 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible)
Nov 28 08:57:57 np0005538515.localdomain podman[100220]: 2025-11-28 08:57:57.176412213 +0000 UTC m=+0.270832922 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 
17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git)
Nov 28 08:57:57 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:57:57 np0005538515.localdomain podman[100219]: 2025-11-28 08:57:57.196248083 +0000 UTC m=+0.294027355 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1)
Nov 28 08:57:57 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:57:57 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:57:57 np0005538515.localdomain podman[100221]: 2025-11-28 08:57:57.25698988 +0000 UTC m=+0.345777886 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, release=1761123044, batch=17.1_20251118.1, vcs-type=git)
Nov 28 08:57:57 np0005538515.localdomain podman[100221]: 2025-11-28 08:57:57.314044686 +0000 UTC m=+0.402832692 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:57:57 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:57:57 np0005538515.localdomain podman[100227]: 2025-11-28 08:57:57.467506406 +0000 UTC m=+0.555225129 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:57:57 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:58:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:58:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:58:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:58:02 np0005538515.localdomain podman[100328]: 2025-11-28 08:58:02.984403992 +0000 UTC m=+0.091354501 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 08:58:03 np0005538515.localdomain podman[100329]: 2025-11-28 08:58:03.036904907 +0000 UTC m=+0.141077000 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:58:03 np0005538515.localdomain systemd[1]: tmp-crun.mPQFVt.mount: Deactivated successfully.
Nov 28 08:58:03 np0005538515.localdomain podman[100330]: 2025-11-28 08:58:03.090781004 +0000 UTC m=+0.190720117 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:58:03 np0005538515.localdomain podman[100329]: 2025-11-28 08:58:03.097318905 +0000 UTC m=+0.201490958 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 28 08:58:03 np0005538515.localdomain podman[100328]: 2025-11-28 08:58:03.107778467 +0000 UTC m=+0.214728976 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 
ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4)
Nov 28 08:58:03 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:58:03 np0005538515.localdomain podman[100328]: unhealthy
Nov 28 08:58:03 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:03 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:58:03 np0005538515.localdomain podman[100330]: 2025-11-28 08:58:03.159054174 +0000 UTC m=+0.258993277 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 08:58:03 np0005538515.localdomain podman[100330]: unhealthy
Nov 28 08:58:03 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:03 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:58:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:58:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:58:25 np0005538515.localdomain systemd[1]: tmp-crun.ZdkgZn.mount: Deactivated successfully.
Nov 28 08:58:25 np0005538515.localdomain podman[100394]: 2025-11-28 08:58:25.971631604 +0000 UTC m=+0.083477501 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2025-11-18T22:49:46Z)
Nov 28 08:58:26 np0005538515.localdomain podman[100406]: 2025-11-28 08:58:26.020191658 +0000 UTC m=+0.092539669 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 08:58:26 np0005538515.localdomain podman[100406]: 2025-11-28 08:58:26.055776744 +0000 UTC m=+0.128124775 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.openshift.expose-services=, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:58:26 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:58:26 np0005538515.localdomain podman[100394]: 2025-11-28 08:58:26.157897357 +0000 UTC m=+0.269743254 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Nov 28 08:58:26 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:58:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:58:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:58:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:58:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:58:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:58:27 np0005538515.localdomain podman[100448]: 2025-11-28 08:58:27.995429001 +0000 UTC m=+0.090536558 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Nov 28 08:58:28 np0005538515.localdomain podman[100442]: 2025-11-28 08:58:28.03244926 +0000 UTC m=+0.135924024 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, vcs-type=git, 
vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:58:28 np0005538515.localdomain podman[100443]: 2025-11-28 08:58:28.047631908 +0000 UTC m=+0.145896093 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:58:28 np0005538515.localdomain podman[100442]: 2025-11-28 08:58:28.068352996 +0000 UTC m=+0.171827750 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, managed_by=tripleo_ansible)
Nov 28 08:58:28 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:58:28 np0005538515.localdomain podman[100448]: 2025-11-28 08:58:28.101345492 +0000 UTC m=+0.196452979 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-type=git, container_name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 08:58:28 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:58:28 np0005538515.localdomain podman[100451]: 2025-11-28 08:58:28.150500554 +0000 UTC m=+0.236713087 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, architecture=x86_64, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Nov 28 08:58:28 np0005538515.localdomain podman[100443]: 2025-11-28 08:58:28.174987218 +0000 UTC m=+0.273251413 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git)
Nov 28 08:58:28 np0005538515.localdomain podman[100444]: 2025-11-28 08:58:28.186961227 +0000 UTC m=+0.283164658 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1761123044, vcs-type=git)
Nov 28 08:58:28 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:58:28 np0005538515.localdomain podman[100444]: 2025-11-28 08:58:28.197042797 +0000 UTC m=+0.293246238 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible)
Nov 28 08:58:28 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:58:28 np0005538515.localdomain podman[100451]: 2025-11-28 08:58:28.52945014 +0000 UTC m=+0.615662713 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_id=tripleo_step4, 
url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:58:28 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:58:28 np0005538515.localdomain sudo[100555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:58:28 np0005538515.localdomain sudo[100555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:58:28 np0005538515.localdomain sudo[100555]: pam_unix(sudo:session): session closed for user root
Nov 28 08:58:28 np0005538515.localdomain sudo[100570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:58:28 np0005538515.localdomain sudo[100570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:58:29 np0005538515.localdomain sudo[100570]: pam_unix(sudo:session): session closed for user root
Nov 28 08:58:30 np0005538515.localdomain sudo[100616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:58:30 np0005538515.localdomain sudo[100616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:58:30 np0005538515.localdomain sudo[100616]: pam_unix(sudo:session): session closed for user root
Nov 28 08:58:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:58:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:58:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:58:33 np0005538515.localdomain podman[100633]: 2025-11-28 08:58:33.976383242 +0000 UTC m=+0.075430263 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true)
Nov 28 08:58:34 np0005538515.localdomain podman[100631]: 2025-11-28 08:58:34.021055158 +0000 UTC m=+0.125611758 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4)
Nov 28 08:58:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:58:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:58:34 np0005538515.localdomain podman[100631]: 2025-11-28 08:58:34.036277426 +0000 UTC m=+0.140834046 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, distribution-scope=public, container_name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Nov 28 08:58:34 np0005538515.localdomain podman[100631]: unhealthy
Nov 28 08:58:34 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:34 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:58:34 np0005538515.localdomain podman[100633]: 2025-11-28 08:58:34.072494851 +0000 UTC m=+0.171541922 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 08:58:34 np0005538515.localdomain podman[100633]: unhealthy
Nov 28 08:58:34 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:58:34 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:58:34 np0005538515.localdomain podman[100632]: 2025-11-28 08:58:34.095309873 +0000 UTC m=+0.195446497 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Nov 28 08:58:34 np0005538515.localdomain podman[100632]: 2025-11-28 08:58:34.126479312 +0000 UTC m=+0.226615906 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Nov 28 08:58:34 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:58:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 08:58:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 08:58:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:58:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:58:56 np0005538515.localdomain systemd[1]: tmp-crun.WbSk6m.mount: Deactivated successfully.
Nov 28 08:58:56 np0005538515.localdomain podman[100697]: 2025-11-28 08:58:56.983652269 +0000 UTC m=+0.093517850 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, container_name=collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 08:58:57 np0005538515.localdomain podman[100697]: 2025-11-28 08:58:57.018710798 +0000 UTC m=+0.128576409 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-collectd, architecture=x86_64, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git)
Nov 28 08:58:57 np0005538515.localdomain podman[100696]: 2025-11-28 08:58:57.029381747 +0000 UTC m=+0.140069403 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:58:57 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:58:57 np0005538515.localdomain podman[100696]: 2025-11-28 08:58:57.222315606 +0000 UTC m=+0.333003232 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, version=17.1.12)
Nov 28 08:58:57 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:58:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:58:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:58:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:58:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:58:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:58:58 np0005538515.localdomain podman[100747]: 2025-11-28 08:58:58.988965948 +0000 UTC m=+0.094961604 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi)
Nov 28 08:58:59 np0005538515.localdomain podman[100747]: 2025-11-28 08:58:59.04102878 +0000 UTC m=+0.147024466 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc.)
Nov 28 08:58:59 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:58:59 np0005538515.localdomain podman[100746]: 2025-11-28 08:58:59.093775604 +0000 UTC m=+0.201741751 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:58:59 np0005538515.localdomain podman[100749]: 2025-11-28 08:58:59.045127236 +0000 UTC m=+0.144322704 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 08:58:59 np0005538515.localdomain podman[100748]: 2025-11-28 08:58:59.149584812 +0000 UTC m=+0.248776609 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 08:58:59 np0005538515.localdomain podman[100749]: 2025-11-28 08:58:59.227580193 +0000 UTC m=+0.326775691 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044)
Nov 28 08:58:59 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:58:59 np0005538515.localdomain podman[100758]: 2025-11-28 08:58:59.203941455 +0000 UTC m=+0.299681896 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_migration_target)
Nov 28 08:58:59 np0005538515.localdomain podman[100746]: 2025-11-28 08:58:59.280632506 +0000 UTC m=+0.388598683 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public)
Nov 28 08:58:59 np0005538515.localdomain podman[100748]: 2025-11-28 08:58:59.281129971 +0000 UTC m=+0.380321768 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.expose-services=)
Nov 28 08:58:59 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:58:59 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:58:59 np0005538515.localdomain podman[100758]: 2025-11-28 08:58:59.60987956 +0000 UTC m=+0.705619971 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute)
Nov 28 08:58:59 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:59:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:59:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:59:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:59:04 np0005538515.localdomain systemd[1]: tmp-crun.SMsxxg.mount: Deactivated successfully.
Nov 28 08:59:04 np0005538515.localdomain podman[100857]: 2025-11-28 08:59:04.991354687 +0000 UTC m=+0.096465860 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute)
Nov 28 08:59:05 np0005538515.localdomain podman[100857]: 2025-11-28 08:59:05.017931666 +0000 UTC m=+0.123042849 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute)
Nov 28 08:59:05 np0005538515.localdomain podman[100858]: 2025-11-28 08:59:05.029563904 +0000 UTC m=+0.129339883 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:59:05 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:59:05 np0005538515.localdomain podman[100858]: 2025-11-28 08:59:05.043768181 +0000 UTC m=+0.143544160 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Nov 28 08:59:05 np0005538515.localdomain podman[100858]: unhealthy
Nov 28 08:59:05 np0005538515.localdomain podman[100856]: 2025-11-28 08:59:04.954518613 +0000 UTC m=+0.063005899 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc.)
Nov 28 08:59:05 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:05 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:59:05 np0005538515.localdomain podman[100856]: 2025-11-28 08:59:05.087469286 +0000 UTC m=+0.195956572 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ovn_controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 08:59:05 np0005538515.localdomain podman[100856]: unhealthy
Nov 28 08:59:05 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:05 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:59:23 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 08:59:23 np0005538515.localdomain recover_tripleo_nova_virtqemud[100924]: 62642
Nov 28 08:59:23 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 08:59:23 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 08:59:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:59:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:59:27 np0005538515.localdomain systemd[1]: tmp-crun.gSOhJ6.mount: Deactivated successfully.
Nov 28 08:59:27 np0005538515.localdomain podman[100925]: 2025-11-28 08:59:27.983913988 +0000 UTC m=+0.094038776 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 08:59:28 np0005538515.localdomain podman[100926]: 2025-11-28 08:59:28.034904768 +0000 UTC m=+0.139523587 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Nov 28 08:59:28 np0005538515.localdomain podman[100926]: 2025-11-28 08:59:28.046563156 +0000 UTC m=+0.151181925 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:59:28 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:59:28 np0005538515.localdomain podman[100925]: 2025-11-28 08:59:28.193035385 +0000 UTC m=+0.303160163 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12)
Nov 28 08:59:28 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 08:59:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 08:59:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 08:59:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 08:59:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 08:59:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 08:59:29 np0005538515.localdomain podman[100974]: 2025-11-28 08:59:29.996924804 +0000 UTC m=+0.103018711 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat 
OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Nov 28 08:59:30 np0005538515.localdomain podman[100974]: 2025-11-28 08:59:30.005104526 +0000 UTC m=+0.111198453 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp17/openstack-cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:59:30 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 08:59:30 np0005538515.localdomain systemd[1]: tmp-crun.ht9chI.mount: Deactivated successfully.
Nov 28 08:59:30 np0005538515.localdomain podman[100988]: 2025-11-28 08:59:30.062223065 +0000 UTC m=+0.154206598 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:59:30 np0005538515.localdomain podman[100981]: 2025-11-28 08:59:30.113248415 +0000 UTC m=+0.206015123 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 08:59:30 np0005538515.localdomain sudo[101046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 08:59:30 np0005538515.localdomain podman[100975]: 2025-11-28 08:59:30.153555526 +0000 UTC m=+0.256110435 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, release=1761123044)
Nov 28 08:59:30 np0005538515.localdomain sudo[101046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:59:30 np0005538515.localdomain sudo[101046]: pam_unix(sudo:session): session closed for user root
Nov 28 08:59:30 np0005538515.localdomain podman[100975]: 2025-11-28 08:59:30.185606483 +0000 UTC m=+0.288161402 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Nov 28 08:59:30 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 08:59:30 np0005538515.localdomain podman[100976]: 2025-11-28 08:59:30.202002787 +0000 UTC m=+0.298269952 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1)
Nov 28 08:59:30 np0005538515.localdomain sudo[101090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 08:59:30 np0005538515.localdomain sudo[101090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:59:30 np0005538515.localdomain podman[100976]: 2025-11-28 08:59:30.235596981 +0000 UTC m=+0.331864196 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 08:59:30 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 08:59:30 np0005538515.localdomain podman[100981]: 2025-11-28 08:59:30.256556786 +0000 UTC m=+0.349323524 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z)
Nov 28 08:59:30 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 08:59:30 np0005538515.localdomain podman[100988]: 2025-11-28 08:59:30.382888955 +0000 UTC m=+0.474872508 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20251118.1, container_name=nova_migration_target, managed_by=tripleo_ansible, version=17.1.12, 
release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 08:59:30 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 08:59:30 np0005538515.localdomain sudo[101090]: pam_unix(sudo:session): session closed for user root
Nov 28 08:59:34 np0005538515.localdomain sudo[101153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 08:59:34 np0005538515.localdomain sudo[101153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 08:59:34 np0005538515.localdomain sudo[101153]: pam_unix(sudo:session): session closed for user root
Nov 28 08:59:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 08:59:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 08:59:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 08:59:35 np0005538515.localdomain podman[101168]: 2025-11-28 08:59:35.979016698 +0000 UTC m=+0.084554794 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4)
Nov 28 08:59:36 np0005538515.localdomain podman[101168]: 2025-11-28 08:59:36.02264077 +0000 UTC m=+0.128178876 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true)
Nov 28 08:59:36 np0005538515.localdomain podman[101168]: unhealthy
Nov 28 08:59:36 np0005538515.localdomain podman[101169]: 2025-11-28 08:59:36.030788031 +0000 UTC m=+0.134630825 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, architecture=x86_64, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, config_id=tripleo_step5, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 08:59:36 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:36 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 08:59:36 np0005538515.localdomain podman[101169]: 2025-11-28 08:59:36.08435865 +0000 UTC m=+0.188201414 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step5)
Nov 28 08:59:36 np0005538515.localdomain systemd[1]: tmp-crun.P1kJJm.mount: Deactivated successfully.
Nov 28 08:59:36 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 08:59:36 np0005538515.localdomain podman[101170]: 2025-11-28 08:59:36.089710555 +0000 UTC m=+0.190297359 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 08:59:36 np0005538515.localdomain podman[101170]: 2025-11-28 08:59:36.169558843 +0000 UTC m=+0.270145617 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, managed_by=tripleo_ansible)
Nov 28 08:59:36 np0005538515.localdomain podman[101170]: unhealthy
Nov 28 08:59:36 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 08:59:36 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 08:59:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 08:59:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 08:59:58 np0005538515.localdomain podman[101234]: 2025-11-28 08:59:58.980118781 +0000 UTC m=+0.086709960 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 28 08:59:59 np0005538515.localdomain systemd[1]: tmp-crun.GeFQyq.mount: Deactivated successfully.
Nov 28 08:59:59 np0005538515.localdomain podman[101235]: 2025-11-28 08:59:59.042805051 +0000 UTC m=+0.147108430 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Nov 28 08:59:59 np0005538515.localdomain podman[101235]: 2025-11-28 08:59:59.054443479 +0000 UTC m=+0.158746828 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 08:59:59 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 08:59:59 np0005538515.localdomain podman[101234]: 2025-11-28 08:59:59.173978488 +0000 UTC m=+0.280569687 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Nov 28 08:59:59 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:00:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:00:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:00:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:00:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:00:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:00:00 np0005538515.localdomain podman[101286]: 2025-11-28 09:00:00.990176006 +0000 UTC m=+0.086445623 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Nov 28 09:00:01 np0005538515.localdomain podman[101284]: 2025-11-28 09:00:01.03516324 +0000 UTC m=+0.137002938 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 09:00:01 np0005538515.localdomain podman[101286]: 2025-11-28 09:00:01.050675387 +0000 UTC m=+0.146945014 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public)
Nov 28 09:00:01 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:00:01 np0005538515.localdomain podman[101284]: 2025-11-28 09:00:01.093533047 +0000 UTC m=+0.195372735 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc.)
Nov 28 09:00:01 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 09:00:01 np0005538515.localdomain podman[101285]: 2025-11-28 09:00:01.138389188 +0000 UTC m=+0.238591245 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vendor=Red Hat, Inc., container_name=iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 09:00:01 np0005538515.localdomain podman[101285]: 2025-11-28 09:00:01.152340328 +0000 UTC m=+0.252542405 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:00:01 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:00:01 np0005538515.localdomain podman[101292]: 2025-11-28 09:00:01.247103705 +0000 UTC m=+0.339729950 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:00:01 np0005538515.localdomain podman[101283]: 2025-11-28 09:00:01.278083468 +0000 UTC m=+0.383482505 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 09:00:01 np0005538515.localdomain podman[101283]: 2025-11-28 09:00:01.285701202 +0000 UTC m=+0.391100209 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, maintainer=OpenStack 
TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Nov 28 09:00:01 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:00:01 np0005538515.localdomain podman[101292]: 2025-11-28 09:00:01.654484385 +0000 UTC m=+0.747110630 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:00:01 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:00:01 np0005538515.localdomain CROND[101400]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Nov 28 09:00:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:00:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:00:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:00:06 np0005538515.localdomain podman[101403]: 2025-11-28 09:00:06.975212981 +0000 UTC m=+0.081303143 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:00:06 np0005538515.localdomain systemd[1]: tmp-crun.tYRBLl.mount: Deactivated successfully.
Nov 28 09:00:06 np0005538515.localdomain podman[101403]: 2025-11-28 09:00:06.997368533 +0000 UTC m=+0.103458735 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., 
build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:00:07 np0005538515.localdomain podman[101403]: unhealthy
Nov 28 09:00:07 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:07 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:00:07 np0005538515.localdomain podman[101410]: 2025-11-28 09:00:06.995655581 +0000 UTC m=+0.087330830 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:00:07 np0005538515.localdomain podman[101410]: 2025-11-28 09:00:07.082839575 +0000 UTC m=+0.174514814 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, container_name=ovn_metadata_agent, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4)
Nov 28 09:00:07 np0005538515.localdomain podman[101410]: unhealthy
Nov 28 09:00:07 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:07 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:00:07 np0005538515.localdomain podman[101404]: 2025-11-28 09:00:07.139390316 +0000 UTC m=+0.238199894 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:00:07 np0005538515.localdomain podman[101404]: 2025-11-28 09:00:07.169871524 +0000 UTC m=+0.268681052 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:00:07 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 09:00:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:00:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:00:29 np0005538515.localdomain podman[101469]: 2025-11-28 09:00:29.978550064 +0000 UTC m=+0.084693359 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:00:30 np0005538515.localdomain podman[101470]: 2025-11-28 09:00:30.031116781 +0000 UTC m=+0.131325212 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:00:30 np0005538515.localdomain podman[101470]: 2025-11-28 09:00:30.069576075 +0000 UTC m=+0.169784526 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:00:30 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:00:30 np0005538515.localdomain podman[101469]: 2025-11-28 09:00:30.172857955 +0000 UTC m=+0.279001300 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Nov 28 09:00:30 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:00:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:00:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:00:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:00:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:00:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:00:31 np0005538515.localdomain podman[101519]: 2025-11-28 09:00:31.979238671 +0000 UTC m=+0.085909076 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, container_name=logrotate_crond, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Nov 28 09:00:32 np0005538515.localdomain systemd[1]: tmp-crun.Ww5AVT.mount: Deactivated successfully.
Nov 28 09:00:32 np0005538515.localdomain podman[101521]: 2025-11-28 09:00:32.038220136 +0000 UTC m=+0.140710982 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, 
url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 09:00:32 np0005538515.localdomain podman[101521]: 2025-11-28 09:00:32.050463983 +0000 UTC m=+0.152954899 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12)
Nov 28 09:00:32 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:00:32 np0005538515.localdomain podman[101520]: 2025-11-28 09:00:32.090234357 +0000 UTC m=+0.196787478 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Nov 28 09:00:32 np0005538515.localdomain podman[101523]: 2025-11-28 09:00:32.148553662 +0000 UTC m=+0.246114487 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Nov 28 09:00:32 np0005538515.localdomain podman[101519]: 2025-11-28 09:00:32.167101954 +0000 UTC m=+0.273772359 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:00:32 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:00:32 np0005538515.localdomain podman[101522]: 2025-11-28 09:00:31.965552149 +0000 UTC m=+0.070356736 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc.)
Nov 28 09:00:32 np0005538515.localdomain podman[101522]: 2025-11-28 09:00:32.251513412 +0000 UTC m=+0.356317959 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 09:00:32 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:00:32 np0005538515.localdomain podman[101520]: 2025-11-28 09:00:32.26868196 +0000 UTC m=+0.375235071 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Nov 28 09:00:32 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 09:00:32 np0005538515.localdomain podman[101523]: 2025-11-28 09:00:32.513283119 +0000 UTC m=+0.610843884 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:00:32 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:00:34 np0005538515.localdomain sudo[101631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:00:34 np0005538515.localdomain sudo[101631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:34 np0005538515.localdomain sudo[101631]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:34 np0005538515.localdomain sudo[101646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:00:34 np0005538515.localdomain sudo[101646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:34 np0005538515.localdomain sudo[101646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:34 np0005538515.localdomain sudo[101682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:00:34 np0005538515.localdomain sudo[101682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:34 np0005538515.localdomain sudo[101682]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:34 np0005538515.localdomain sudo[101697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:00:34 np0005538515.localdomain sudo[101697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:35 np0005538515.localdomain sudo[101697]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:37 np0005538515.localdomain sudo[101743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:00:37 np0005538515.localdomain sudo[101743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:00:37 np0005538515.localdomain sudo[101743]: pam_unix(sudo:session): session closed for user root
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:00:37 np0005538515.localdomain podman[101758]: 2025-11-28 09:00:37.369092494 +0000 UTC m=+0.088358901 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller)
Nov 28 09:00:37 np0005538515.localdomain podman[101760]: 2025-11-28 09:00:37.419686202 +0000 UTC m=+0.132551321 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public)
Nov 28 09:00:37 np0005538515.localdomain podman[101758]: 2025-11-28 09:00:37.440248124 +0000 UTC m=+0.159514561 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, vcs-type=git, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Nov 28 09:00:37 np0005538515.localdomain podman[101758]: unhealthy
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:00:37 np0005538515.localdomain podman[101760]: 2025-11-28 09:00:37.491398129 +0000 UTC m=+0.204263168 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public)
Nov 28 09:00:37 np0005538515.localdomain podman[101760]: unhealthy
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:00:37 np0005538515.localdomain podman[101759]: 2025-11-28 09:00:37.581478212 +0000 UTC m=+0.295885669 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:00:37 np0005538515.localdomain podman[101759]: 2025-11-28 09:00:37.613504438 +0000 UTC m=+0.327911885 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:00:37 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 09:01:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:01:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:01:01 np0005538515.localdomain podman[101826]: 2025-11-28 09:01:01.006862944 +0000 UTC m=+0.112344539 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, version=17.1.12)
Nov 28 09:01:01 np0005538515.localdomain podman[101826]: 2025-11-28 09:01:01.01452311 +0000 UTC m=+0.120004705 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:01:01 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:01:01 np0005538515.localdomain podman[101825]: 2025-11-28 09:01:01.064793317 +0000 UTC m=+0.172775709 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 09:01:01 np0005538515.localdomain podman[101825]: 2025-11-28 09:01:01.26140946 +0000 UTC m=+0.369391842 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 09:01:01 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:01:01 np0005538515.localdomain CROND[101873]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 09:01:01 np0005538515.localdomain run-parts[101876]: (/etc/cron.hourly) starting 0anacron
Nov 28 09:01:01 np0005538515.localdomain run-parts[101882]: (/etc/cron.hourly) finished 0anacron
Nov 28 09:01:01 np0005538515.localdomain CROND[101872]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 09:01:01 np0005538515.localdomain CROND[101884]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 09:01:01 np0005538515.localdomain run-parts[101887]: (/etc/cron.hourly) starting 0anacron
Nov 28 09:01:01 np0005538515.localdomain anacron[101895]: Anacron started on 2025-11-28
Nov 28 09:01:01 np0005538515.localdomain anacron[101895]: Will run job `cron.daily' in 40 min.
Nov 28 09:01:01 np0005538515.localdomain anacron[101895]: Will run job `cron.weekly' in 60 min.
Nov 28 09:01:01 np0005538515.localdomain anacron[101895]: Will run job `cron.monthly' in 80 min.
Nov 28 09:01:01 np0005538515.localdomain anacron[101895]: Jobs will be executed sequentially
Nov 28 09:01:01 np0005538515.localdomain run-parts[101897]: (/etc/cron.hourly) finished 0anacron
Nov 28 09:01:01 np0005538515.localdomain CROND[101883]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 09:01:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:01:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:01:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:01:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:01:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:01:02 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:01:03 np0005538515.localdomain recover_tripleo_nova_virtqemud[101931]: 62642
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:01:03 np0005538515.localdomain podman[101899]: 2025-11-28 09:01:03.068342602 +0000 UTC m=+0.081920183 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:01:03 np0005538515.localdomain podman[101907]: 2025-11-28 09:01:03.13876907 +0000 UTC m=+0.145960284 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:36:58Z, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:01:03 np0005538515.localdomain podman[101901]: 2025-11-28 09:01:03.116237886 +0000 UTC m=+0.128208987 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4)
Nov 28 09:01:03 np0005538515.localdomain podman[101900]: 2025-11-28 09:01:03.184862258 +0000 UTC m=+0.197948934 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.expose-services=, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container)
Nov 28 09:01:03 np0005538515.localdomain podman[101899]: 2025-11-28 09:01:03.203467961 +0000 UTC m=+0.217045592 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi)
Nov 28 09:01:03 np0005538515.localdomain podman[101900]: 2025-11-28 09:01:03.219951369 +0000 UTC m=+0.233038095 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1)
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:01:03 np0005538515.localdomain podman[101901]: 2025-11-28 09:01:03.251122668 +0000 UTC m=+0.263093769 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:01:03 np0005538515.localdomain podman[101898]: 2025-11-28 09:01:03.225955484 +0000 UTC m=+0.245147398 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 09:01:03 np0005538515.localdomain podman[101898]: 2025-11-28 09:01:03.359811074 +0000 UTC m=+0.379002988 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible)
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:01:03 np0005538515.localdomain podman[101907]: 2025-11-28 09:01:03.50654032 +0000 UTC m=+0.513731514 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12)
Nov 28 09:01:03 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:01:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:01:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:01:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:01:07 np0005538515.localdomain systemd[1]: tmp-crun.BEqCoO.mount: Deactivated successfully.
Nov 28 09:01:07 np0005538515.localdomain podman[102015]: 2025-11-28 09:01:07.988650121 +0000 UTC m=+0.087300548 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 09:01:08 np0005538515.localdomain podman[102015]: 2025-11-28 09:01:08.028360553 +0000 UTC m=+0.127010890 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Nov 28 09:01:08 np0005538515.localdomain podman[102015]: unhealthy
Nov 28 09:01:08 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:08 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:01:08 np0005538515.localdomain podman[102013]: 2025-11-28 09:01:08.038377312 +0000 UTC m=+0.136847854 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1)
Nov 28 09:01:08 np0005538515.localdomain podman[102014]: 2025-11-28 09:01:08.103088904 +0000 UTC m=+0.200983988 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1)
Nov 28 09:01:08 np0005538515.localdomain podman[102013]: 2025-11-28 09:01:08.124541983 +0000 UTC m=+0.223012515 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=ovn_controller, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 09:01:08 np0005538515.localdomain podman[102013]: unhealthy
Nov 28 09:01:08 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:08 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:01:08 np0005538515.localdomain podman[102014]: 2025-11-28 09:01:08.181993583 +0000 UTC m=+0.279888727 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_compute)
Nov 28 09:01:08 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 09:01:29 np0005538515.localdomain CROND[101399]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Nov 28 09:01:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:01:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:01:31 np0005538515.localdomain systemd[1]: tmp-crun.RI0WfF.mount: Deactivated successfully.
Nov 28 09:01:31 np0005538515.localdomain podman[102083]: 2025-11-28 09:01:31.99245234 +0000 UTC m=+0.091791695 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 09:01:32 np0005538515.localdomain podman[102082]: 2025-11-28 09:01:32.037141826 +0000 UTC m=+0.140417282 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:01:32 np0005538515.localdomain podman[102083]: 2025-11-28 09:01:32.052417157 +0000 UTC m=+0.151756432 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, container_name=collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, architecture=x86_64)
Nov 28 09:01:32 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:01:32 np0005538515.localdomain podman[102082]: 2025-11-28 09:01:32.267552909 +0000 UTC m=+0.370828335 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, architecture=x86_64, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:01:32 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:01:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:01:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:01:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:01:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:01:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:01:33 np0005538515.localdomain podman[102146]: 2025-11-28 09:01:33.984452049 +0000 UTC m=+0.071331226 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, container_name=nova_migration_target, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:01:34 np0005538515.localdomain podman[102139]: 2025-11-28 09:01:34.016315431 +0000 UTC m=+0.105492309 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z)
Nov 28 09:01:34 np0005538515.localdomain podman[102135]: 2025-11-28 09:01:34.046216361 +0000 UTC m=+0.137524825 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, version=17.1.12, build-date=2025-11-18T23:44:13Z, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 09:01:34 np0005538515.localdomain podman[102135]: 2025-11-28 09:01:34.091892077 +0000 UTC m=+0.183200571 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, 
vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, distribution-scope=public, release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible)
Nov 28 09:01:34 np0005538515.localdomain podman[102134]: 2025-11-28 09:01:34.099289574 +0000 UTC m=+0.198777649 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4)
Nov 28 09:01:34 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:01:34 np0005538515.localdomain podman[102134]: 2025-11-28 09:01:34.134631943 +0000 UTC m=+0.234120068 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 09:01:34 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 09:01:34 np0005538515.localdomain podman[102139]: 2025-11-28 09:01:34.170409244 +0000 UTC m=+0.259586122 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, tcib_managed=true)
Nov 28 09:01:34 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:01:34 np0005538515.localdomain podman[102133]: 2025-11-28 09:01:34.143239357 +0000 UTC m=+0.244717853 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, vcs-type=git, 
managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:01:34 np0005538515.localdomain podman[102133]: 2025-11-28 09:01:34.226394578 +0000 UTC m=+0.327873024 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:49:32Z)
Nov 28 09:01:34 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:01:34 np0005538515.localdomain podman[102146]: 2025-11-28 09:01:34.293650928 +0000 UTC m=+0.380530185 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, 
vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:01:34 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:01:34 np0005538515.localdomain systemd[1]: tmp-crun.1mRdfs.mount: Deactivated successfully.
Nov 28 09:01:37 np0005538515.localdomain sudo[102247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:01:37 np0005538515.localdomain sudo[102247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:01:37 np0005538515.localdomain sudo[102247]: pam_unix(sudo:session): session closed for user root
Nov 28 09:01:37 np0005538515.localdomain sudo[102262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:01:37 np0005538515.localdomain sudo[102262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:01:38 np0005538515.localdomain sudo[102262]: pam_unix(sudo:session): session closed for user root
Nov 28 09:01:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:01:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:01:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:01:38 np0005538515.localdomain sudo[102309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:01:38 np0005538515.localdomain sudo[102309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:01:38 np0005538515.localdomain sudo[102309]: pam_unix(sudo:session): session closed for user root
Nov 28 09:01:38 np0005538515.localdomain podman[102325]: 2025-11-28 09:01:38.988369613 +0000 UTC m=+0.077234759 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 09:01:39 np0005538515.localdomain podman[102325]: 2025-11-28 09:01:39.036541466 +0000 UTC m=+0.125406622 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, release=1761123044, batch=17.1_20251118.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true)
Nov 28 09:01:39 np0005538515.localdomain systemd[1]: tmp-crun.Ggho6R.mount: Deactivated successfully.
Nov 28 09:01:39 np0005538515.localdomain podman[102325]: unhealthy
Nov 28 09:01:39 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:39 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:01:39 np0005538515.localdomain podman[102324]: 2025-11-28 09:01:39.050781404 +0000 UTC m=+0.142811477 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1761123044)
Nov 28 09:01:39 np0005538515.localdomain podman[102323]: 2025-11-28 09:01:39.090994652 +0000 UTC m=+0.183406036 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 28 09:01:39 np0005538515.localdomain podman[102324]: 2025-11-28 09:01:39.135304086 +0000 UTC m=+0.227334189 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:01:39 np0005538515.localdomain podman[102323]: 2025-11-28 09:01:39.131114967 +0000 UTC m=+0.223526421 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, url=https://www.redhat.com)
Nov 28 09:01:39 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 09:01:39 np0005538515.localdomain podman[102323]: unhealthy
Nov 28 09:01:39 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:01:39 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:02:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:02:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:02:02 np0005538515.localdomain podman[102390]: 2025-11-28 09:02:02.987787278 +0000 UTC m=+0.088790735 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4)
Nov 28 09:02:03 np0005538515.localdomain podman[102391]: 2025-11-28 09:02:03.044517474 +0000 UTC m=+0.140479415 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:02:03 np0005538515.localdomain podman[102391]: 2025-11-28 09:02:03.053333195 +0000 UTC m=+0.149295106 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, version=17.1.12, release=1761123044, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 09:02:03 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:02:03 np0005538515.localdomain podman[102390]: 2025-11-28 09:02:03.175996941 +0000 UTC m=+0.277000378 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, distribution-scope=public)
Nov 28 09:02:03 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:02:03 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:02:03 np0005538515.localdomain recover_tripleo_nova_virtqemud[102438]: 62642
Nov 28 09:02:03 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:02:03 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:02:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:02:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:02:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:02:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:02:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:02:04 np0005538515.localdomain podman[102441]: 2025-11-28 09:02:04.996588124 +0000 UTC m=+0.092046475 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:02:05 np0005538515.localdomain podman[102441]: 2025-11-28 09:02:05.006685774 +0000 UTC m=+0.102144125 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc.)
Nov 28 09:02:05 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:02:05 np0005538515.localdomain systemd[1]: tmp-crun.ePap8R.mount: Deactivated successfully.
Nov 28 09:02:05 np0005538515.localdomain podman[102451]: 2025-11-28 09:02:05.089437742 +0000 UTC m=+0.181012093 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z)
Nov 28 09:02:05 np0005538515.localdomain podman[102442]: 2025-11-28 09:02:05.065692471 +0000 UTC m=+0.158169440 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 09:02:05 np0005538515.localdomain podman[102439]: 2025-11-28 09:02:05.137019177 +0000 UTC m=+0.241298369 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:02:05 np0005538515.localdomain podman[102439]: 2025-11-28 09:02:05.145481217 +0000 UTC m=+0.249760429 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=logrotate_crond)
Nov 28 09:02:05 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:02:05 np0005538515.localdomain podman[102440]: 2025-11-28 09:02:05.192536966 +0000 UTC m=+0.294379393 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044)
Nov 28 09:02:05 np0005538515.localdomain podman[102442]: 2025-11-28 09:02:05.202052808 +0000 UTC m=+0.294529737 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com)
Nov 28 09:02:05 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:02:05 np0005538515.localdomain podman[102440]: 2025-11-28 09:02:05.245324721 +0000 UTC m=+0.347167198 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z)
Nov 28 09:02:05 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Deactivated successfully.
Nov 28 09:02:05 np0005538515.localdomain podman[102451]: 2025-11-28 09:02:05.450684232 +0000 UTC m=+0.542258603 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=)
Nov 28 09:02:05 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:02:09 np0005538515.localdomain sshd[102552]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:02:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:02:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:02:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:02:09 np0005538515.localdomain podman[102554]: 2025-11-28 09:02:09.978442448 +0000 UTC m=+0.085084890 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller)
Nov 28 09:02:09 np0005538515.localdomain podman[102554]: 2025-11-28 09:02:09.997439933 +0000 UTC m=+0.104082365 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:02:10 np0005538515.localdomain podman[102554]: unhealthy
Nov 28 09:02:10 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:10 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:02:10 np0005538515.localdomain sshd[102552]: Invalid user  from 194.187.176.135 port 43600
Nov 28 09:02:10 np0005538515.localdomain podman[102555]: 2025-11-28 09:02:10.049479694 +0000 UTC m=+0.150576685 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, container_name=nova_compute, version=17.1.12, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 09:02:10 np0005538515.localdomain systemd[1]: tmp-crun.JFBaaU.mount: Deactivated successfully.
Nov 28 09:02:10 np0005538515.localdomain podman[102556]: 2025-11-28 09:02:10.093507829 +0000 UTC m=+0.193359553 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git)
Nov 28 09:02:10 np0005538515.localdomain podman[102556]: 2025-11-28 09:02:10.10749484 +0000 UTC m=+0.207346614 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:02:10 np0005538515.localdomain podman[102556]: unhealthy
Nov 28 09:02:10 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:10 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:02:10 np0005538515.localdomain sshd[102552]: Connection closed by invalid user  194.187.176.135 port 43600 [preauth]
Nov 28 09:02:10 np0005538515.localdomain podman[102555]: 2025-11-28 09:02:10.16076832 +0000 UTC m=+0.261865251 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Nov 28 09:02:10 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 09:02:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:02:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:02:33 np0005538515.localdomain podman[102617]: 2025-11-28 09:02:33.98536317 +0000 UTC m=+0.092377525 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Nov 28 09:02:34 np0005538515.localdomain podman[102618]: 2025-11-28 09:02:34.02923429 +0000 UTC m=+0.134937135 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:02:34 np0005538515.localdomain podman[102618]: 2025-11-28 09:02:34.067511848 +0000 UTC m=+0.173214673 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public)
Nov 28 09:02:34 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:02:34 np0005538515.localdomain podman[102617]: 2025-11-28 09:02:34.170500079 +0000 UTC m=+0.277514434 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.description=Red 
Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Nov 28 09:02:34 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:02:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:02:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:02:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:02:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:02:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:02:35 np0005538515.localdomain systemd[1]: tmp-crun.dh57NW.mount: Deactivated successfully.
Nov 28 09:02:35 np0005538515.localdomain podman[102666]: 2025-11-28 09:02:35.996041363 +0000 UTC m=+0.099930396 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Nov 28 09:02:36 np0005538515.localdomain podman[102666]: 2025-11-28 09:02:36.032351701 +0000 UTC m=+0.136240684 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z)
Nov 28 09:02:36 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:02:36 np0005538515.localdomain podman[102667]: 2025-11-28 09:02:36.035759367 +0000 UTC m=+0.136897715 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public)
Nov 28 09:02:36 np0005538515.localdomain podman[102669]: 2025-11-28 09:02:36.091199753 +0000 UTC m=+0.186913534 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute)
Nov 28 09:02:36 np0005538515.localdomain podman[102668]: 2025-11-28 09:02:36.149232189 +0000 UTC m=+0.248380957 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, release=1761123044, container_name=iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 28 09:02:36 np0005538515.localdomain podman[102668]: 2025-11-28 09:02:36.156776742 +0000 UTC m=+0.255925510 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Nov 28 09:02:36 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:02:36 np0005538515.localdomain podman[102669]: 2025-11-28 09:02:36.16840496 +0000 UTC m=+0.264118731 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4)
Nov 28 09:02:36 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:02:36 np0005538515.localdomain podman[102674]: 2025-11-28 09:02:36.247558336 +0000 UTC m=+0.340216483 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:02:36 np0005538515.localdomain podman[102667]: 2025-11-28 09:02:36.269375228 +0000 UTC m=+0.370513566 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi)
Nov 28 09:02:36 np0005538515.localdomain podman[102667]: unhealthy
Nov 28 09:02:36 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:36 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:02:36 np0005538515.localdomain podman[102674]: 2025-11-28 09:02:36.604405671 +0000 UTC m=+0.697063868 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4)
Nov 28 09:02:36 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:02:39 np0005538515.localdomain sudo[102783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:02:39 np0005538515.localdomain sudo[102783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:39 np0005538515.localdomain sudo[102783]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:39 np0005538515.localdomain sudo[102798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:02:39 np0005538515.localdomain sudo[102798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:02:40 np0005538515.localdomain podman[102885]: 2025-11-28 09:02:40.064292985 +0000 UTC m=+0.100628148 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=)
Nov 28 09:02:40 np0005538515.localdomain podman[102885]: 2025-11-28 09:02:40.162461767 +0000 UTC m=+0.198796970 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main)
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: tmp-crun.xJSeJv.mount: Deactivated successfully.
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:02:40 np0005538515.localdomain podman[102903]: 2025-11-28 09:02:40.163751446 +0000 UTC m=+0.098147962 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:02:40 np0005538515.localdomain podman[102903]: 2025-11-28 09:02:40.255353207 +0000 UTC m=+0.189749673 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, container_name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container)
Nov 28 09:02:40 np0005538515.localdomain podman[102903]: unhealthy
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:02:40 np0005538515.localdomain podman[102923]: 2025-11-28 09:02:40.259512624 +0000 UTC m=+0.085343638 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 09:02:40 np0005538515.localdomain podman[102923]: 2025-11-28 09:02:40.342729346 +0000 UTC m=+0.168560360 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:02:40 np0005538515.localdomain podman[102923]: unhealthy
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:02:40 np0005538515.localdomain podman[102949]: 2025-11-28 09:02:40.393080726 +0000 UTC m=+0.179253269 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Nov 28 09:02:40 np0005538515.localdomain podman[102949]: 2025-11-28 09:02:40.500912196 +0000 UTC m=+0.287084769 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, url=https://www.redhat.com, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:02:40 np0005538515.localdomain sudo[102798]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:40 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Deactivated successfully.
Nov 28 09:02:40 np0005538515.localdomain sudo[103015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:02:40 np0005538515.localdomain sudo[103015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:40 np0005538515.localdomain sudo[103015]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:40 np0005538515.localdomain sudo[103030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:02:40 np0005538515.localdomain sudo[103030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:41 np0005538515.localdomain sudo[103030]: pam_unix(sudo:session): session closed for user root
Nov 28 09:02:41 np0005538515.localdomain sudo[103078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:02:41 np0005538515.localdomain sudo[103078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:02:41 np0005538515.localdomain sudo[103078]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:03:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:03:04 np0005538515.localdomain podman[103093]: 2025-11-28 09:03:04.99586564 +0000 UTC m=+0.091142956 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Nov 28 09:03:05 np0005538515.localdomain systemd[1]: tmp-crun.yA6SyH.mount: Deactivated successfully.
Nov 28 09:03:05 np0005538515.localdomain podman[103094]: 2025-11-28 09:03:05.0569192 +0000 UTC m=+0.152324530 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12)
Nov 28 09:03:05 np0005538515.localdomain podman[103094]: 2025-11-28 09:03:05.072366596 +0000 UTC m=+0.167771906 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, 
batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:03:05 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:03:05 np0005538515.localdomain podman[103093]: 2025-11-28 09:03:05.210743115 +0000 UTC m=+0.306020461 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Nov 28 09:03:05 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:03:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:03:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:03:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:03:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:03:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:03:06 np0005538515.localdomain podman[103142]: 2025-11-28 09:03:06.975824909 +0000 UTC m=+0.079220969 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:03:06 np0005538515.localdomain podman[103142]: 2025-11-28 09:03:06.987479478 +0000 UTC m=+0.090875578 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: tmp-crun.eGsJU0.mount: Deactivated successfully.
Nov 28 09:03:07 np0005538515.localdomain podman[103157]: 2025-11-28 09:03:07.045214216 +0000 UTC m=+0.135241494 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:03:07 np0005538515.localdomain podman[103143]: 2025-11-28 09:03:07.086574229 +0000 UTC m=+0.189569607 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:03:07 np0005538515.localdomain podman[103145]: 2025-11-28 09:03:07.159619178 +0000 UTC m=+0.253283109 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:03:07 np0005538515.localdomain podman[103143]: 2025-11-28 09:03:07.174550817 +0000 UTC m=+0.277546225 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team)
Nov 28 09:03:07 np0005538515.localdomain podman[103143]: unhealthy
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:03:07 np0005538515.localdomain podman[103145]: 2025-11-28 09:03:07.199502955 +0000 UTC m=+0.293166926 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute)
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:03:07 np0005538515.localdomain podman[103144]: 2025-11-28 09:03:07.248459852 +0000 UTC m=+0.346465406 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 09:03:07 np0005538515.localdomain podman[103144]: 2025-11-28 09:03:07.262421491 +0000 UTC m=+0.360427015 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:03:07 np0005538515.localdomain podman[103157]: 2025-11-28 09:03:07.406380113 +0000 UTC m=+0.496407451 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Nov 28 09:03:07 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:03:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:03:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:03:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:03:10 np0005538515.localdomain podman[103252]: 2025-11-28 09:03:10.988510251 +0000 UTC m=+0.098770442 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public)
Nov 28 09:03:11 np0005538515.localdomain podman[103252]: 2025-11-28 09:03:11.023146106 +0000 UTC m=+0.133406357 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:03:11 np0005538515.localdomain podman[103252]: unhealthy
Nov 28 09:03:11 np0005538515.localdomain podman[103254]: 2025-11-28 09:03:11.036684173 +0000 UTC m=+0.138495084 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Nov 28 09:03:11 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:11 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:03:11 np0005538515.localdomain podman[103254]: 2025-11-28 09:03:11.050594592 +0000 UTC m=+0.152405503 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 09:03:11 np0005538515.localdomain podman[103254]: unhealthy
Nov 28 09:03:11 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:11 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:03:11 np0005538515.localdomain podman[103253]: 2025-11-28 09:03:11.106879194 +0000 UTC m=+0.212922315 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 09:03:11 np0005538515.localdomain podman[103253]: 2025-11-28 09:03:11.123900928 +0000 UTC m=+0.229944069 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64)
Nov 28 09:03:11 np0005538515.localdomain podman[103253]: unhealthy
Nov 28 09:03:11 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:11 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:03:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29737 DF PROTO=TCP SPT=37178 DPT=9100 SEQ=4062564550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB739710000000001030307) 
Nov 28 09:03:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59278 DF PROTO=TCP SPT=49178 DPT=9101 SEQ=3146939845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB73C400000000001030307) 
Nov 28 09:03:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29738 DF PROTO=TCP SPT=37178 DPT=9100 SEQ=4062564550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB73D7A0000000001030307) 
Nov 28 09:03:15 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59279 DF PROTO=TCP SPT=49178 DPT=9101 SEQ=3146939845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7403B0000000001030307) 
Nov 28 09:03:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29739 DF PROTO=TCP SPT=37178 DPT=9100 SEQ=4062564550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7457A0000000001030307) 
Nov 28 09:03:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59280 DF PROTO=TCP SPT=49178 DPT=9101 SEQ=3146939845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7483A0000000001030307) 
Nov 28 09:03:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29740 DF PROTO=TCP SPT=37178 DPT=9100 SEQ=4062564550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7553A0000000001030307) 
Nov 28 09:03:21 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59281 DF PROTO=TCP SPT=49178 DPT=9101 SEQ=3146939845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB757FA0000000001030307) 
Nov 28 09:03:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48822 DF PROTO=TCP SPT=55802 DPT=9105 SEQ=3116172900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB770720000000001030307) 
Nov 28 09:03:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48823 DF PROTO=TCP SPT=55802 DPT=9105 SEQ=3116172900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7747B0000000001030307) 
Nov 28 09:03:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29741 DF PROTO=TCP SPT=37178 DPT=9100 SEQ=4062564550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB774FB0000000001030307) 
Nov 28 09:03:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55894 DF PROTO=TCP SPT=33272 DPT=9882 SEQ=482753259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7759C0000000001030307) 
Nov 28 09:03:29 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59282 DF PROTO=TCP SPT=49178 DPT=9101 SEQ=3146939845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB778FA0000000001030307) 
Nov 28 09:03:29 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55895 DF PROTO=TCP SPT=33272 DPT=9882 SEQ=482753259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB779BA0000000001030307) 
Nov 28 09:03:30 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48824 DF PROTO=TCP SPT=55802 DPT=9105 SEQ=3116172900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB77C7A0000000001030307) 
Nov 28 09:03:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55896 DF PROTO=TCP SPT=33272 DPT=9882 SEQ=482753259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB781BB0000000001030307) 
Nov 28 09:03:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17877 DF PROTO=TCP SPT=40392 DPT=9102 SEQ=1719917992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB781E80000000001030307) 
Nov 28 09:03:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17878 DF PROTO=TCP SPT=40392 DPT=9102 SEQ=1719917992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB785FA0000000001030307) 
Nov 28 09:03:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48825 DF PROTO=TCP SPT=55802 DPT=9105 SEQ=3116172900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB78C3A0000000001030307) 
Nov 28 09:03:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17879 DF PROTO=TCP SPT=40392 DPT=9102 SEQ=1719917992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB78DFA0000000001030307) 
Nov 28 09:03:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:03:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:03:35 np0005538515.localdomain podman[103314]: 2025-11-28 09:03:35.973651234 +0000 UTC m=+0.083629844 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044)
Nov 28 09:03:36 np0005538515.localdomain systemd[1]: tmp-crun.H4wZZE.mount: Deactivated successfully.
Nov 28 09:03:36 np0005538515.localdomain podman[103315]: 2025-11-28 09:03:36.032422564 +0000 UTC m=+0.140376933 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 28 09:03:36 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55897 DF PROTO=TCP SPT=33272 DPT=9882 SEQ=482753259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7917A0000000001030307) 
Nov 28 09:03:36 np0005538515.localdomain podman[103315]: 2025-11-28 09:03:36.042164974 +0000 UTC m=+0.150119303 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:03:36 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:03:36 np0005538515.localdomain podman[103314]: 2025-11-28 09:03:36.170612428 +0000 UTC m=+0.280591068 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:03:36 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:03:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:03:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:03:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:03:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:03:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:03:37 np0005538515.localdomain podman[103367]: 2025-11-28 09:03:37.993748578 +0000 UTC m=+0.091138146 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Nov 28 09:03:38 np0005538515.localdomain podman[103367]: 2025-11-28 09:03:38.0214056 +0000 UTC m=+0.118795198 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Nov 28 09:03:38 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:03:38 np0005538515.localdomain podman[103370]: 2025-11-28 09:03:38.091458036 +0000 UTC m=+0.184512611 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Nov 28 09:03:38 np0005538515.localdomain podman[103364]: 2025-11-28 09:03:38.102167916 +0000 UTC m=+0.207606152 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, version=17.1.12, config_id=tripleo_step4, vcs-type=git)
Nov 28 09:03:38 np0005538515.localdomain podman[103364]: 2025-11-28 09:03:38.110783491 +0000 UTC m=+0.216221677 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, managed_by=tripleo_ansible)
Nov 28 09:03:38 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:03:38 np0005538515.localdomain podman[103365]: 2025-11-28 09:03:38.202498145 +0000 UTC m=+0.304436163 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1761123044)
Nov 28 09:03:38 np0005538515.localdomain podman[103365]: 2025-11-28 09:03:38.257297942 +0000 UTC m=+0.359236000 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, io.buildah.version=1.41.4)
Nov 28 09:03:38 np0005538515.localdomain podman[103365]: unhealthy
Nov 28 09:03:38 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:38 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:03:38 np0005538515.localdomain podman[103366]: 2025-11-28 09:03:38.258046065 +0000 UTC m=+0.356482515 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.12, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:03:38 np0005538515.localdomain podman[103366]: 2025-11-28 09:03:38.337999395 +0000 UTC m=+0.436435835 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Nov 28 09:03:38 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:03:38 np0005538515.localdomain podman[103370]: 2025-11-28 09:03:38.461674832 +0000 UTC m=+0.554729407 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, 
tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:03:38 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:03:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17880 DF PROTO=TCP SPT=40392 DPT=9102 SEQ=1719917992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB79DBB0000000001030307) 
Nov 28 09:03:39 np0005538515.localdomain sshd[103472]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:03:40 np0005538515.localdomain sshd[103472]: Accepted publickey for zuul from 192.168.122.31 port 37838 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:03:40 np0005538515.localdomain systemd-logind[763]: New session 35 of user zuul.
Nov 28 09:03:40 np0005538515.localdomain systemd[1]: Started Session 35 of User zuul.
Nov 28 09:03:40 np0005538515.localdomain sshd[103472]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:03:40 np0005538515.localdomain sudo[103565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxxybchqzjtjzqgxrbkhnsawcczxskaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320620.1637855-28-92696366606286/AnsiballZ_stat.py
Nov 28 09:03:40 np0005538515.localdomain sudo[103565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:40 np0005538515.localdomain python3.9[103567]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:03:40 np0005538515.localdomain sudo[103565]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:41 np0005538515.localdomain sudo[103659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eowyticbfslxmptnjdwiqrwlkoxypsrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320621.135052-64-61096555595998/AnsiballZ_command.py
Nov 28 09:03:41 np0005538515.localdomain sudo[103659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:03:41 np0005538515.localdomain podman[103662]: 2025-11-28 09:03:41.618058084 +0000 UTC m=+0.083223712 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Nov 28 09:03:41 np0005538515.localdomain podman[103662]: 2025-11-28 09:03:41.63644916 +0000 UTC m=+0.101614788 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=)
Nov 28 09:03:41 np0005538515.localdomain podman[103663]: 2025-11-28 09:03:41.666681801 +0000 UTC m=+0.128120965 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:03:41 np0005538515.localdomain podman[103662]: unhealthy
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:03:41 np0005538515.localdomain podman[103663]: 2025-11-28 09:03:41.714453742 +0000 UTC m=+0.175892906 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:03:41 np0005538515.localdomain python3.9[103661]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:41 np0005538515.localdomain podman[103663]: unhealthy
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:03:41 np0005538515.localdomain podman[103664]: 2025-11-28 09:03:41.779862315 +0000 UTC m=+0.239822164 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:03:41 np0005538515.localdomain podman[103664]: 2025-11-28 09:03:41.798422937 +0000 UTC m=+0.258382706 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 09:03:41 np0005538515.localdomain podman[103664]: unhealthy
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:03:41 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:03:41 np0005538515.localdomain sudo[103659]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538515.localdomain sudo[103740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:03:42 np0005538515.localdomain sudo[103740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:03:42 np0005538515.localdomain sudo[103740]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538515.localdomain sudo[103786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:03:42 np0005538515.localdomain sudo[103786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:03:42 np0005538515.localdomain sudo[103844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmodrcitvkvvqxrhthyfdoaihidqutws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320621.9953887-88-165487485015939/AnsiballZ_stat.py
Nov 28 09:03:42 np0005538515.localdomain sudo[103844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:42 np0005538515.localdomain python3.9[103846]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:03:42 np0005538515.localdomain sudo[103844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:42 np0005538515.localdomain sudo[103786]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:43 np0005538515.localdomain sudo[103970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdobksikktezbzdgogkpznkwzdrqnhrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320622.7226772-112-65354248690104/AnsiballZ_command.py
Nov 28 09:03:43 np0005538515.localdomain sudo[103970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48826 DF PROTO=TCP SPT=55802 DPT=9105 SEQ=3116172900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7ACFA0000000001030307) 
Nov 28 09:03:43 np0005538515.localdomain python3.9[103972]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:03:43 np0005538515.localdomain sudo[103970]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=106 DF PROTO=TCP SPT=37044 DPT=9100 SEQ=1663755763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7AEA20000000001030307) 
Nov 28 09:03:43 np0005538515.localdomain sudo[103989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:03:43 np0005538515.localdomain sudo[103989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:03:43 np0005538515.localdomain sudo[103989]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:43 np0005538515.localdomain sudo[104078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdwyidioaqvtovjnfmbukmpjzazmpqas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320623.5067651-139-133963974222181/AnsiballZ_command.py
Nov 28 09:03:43 np0005538515.localdomain sudo[104078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:03:43 np0005538515.localdomain python3.9[104080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:03:43 np0005538515.localdomain sudo[104078]: pam_unix(sudo:session): session closed for user root
Nov 28 09:03:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55898 DF PROTO=TCP SPT=33272 DPT=9882 SEQ=482753259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7B0FA0000000001030307) 
Nov 28 09:03:44 np0005538515.localdomain python3.9[104171]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 09:03:46 np0005538515.localdomain python3.9[104261]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:03:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=108 DF PROTO=TCP SPT=37044 DPT=9100 SEQ=1663755763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7BABB0000000001030307) 
Nov 28 09:03:46 np0005538515.localdomain python3.9[104353]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 09:03:48 np0005538515.localdomain python3.9[104443]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:03:48 np0005538515.localdomain python3.9[104491]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:03:49 np0005538515.localdomain sshd[103472]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:03:49 np0005538515.localdomain systemd[1]: session-35.scope: Deactivated successfully.
Nov 28 09:03:49 np0005538515.localdomain systemd[1]: session-35.scope: Consumed 4.989s CPU time.
Nov 28 09:03:49 np0005538515.localdomain systemd-logind[763]: Session 35 logged out. Waiting for processes to exit.
Nov 28 09:03:49 np0005538515.localdomain systemd-logind[763]: Removed session 35.
Nov 28 09:03:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=109 DF PROTO=TCP SPT=37044 DPT=9100 SEQ=1663755763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7CA7B0000000001030307) 
Nov 28 09:03:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34651 DF PROTO=TCP SPT=35246 DPT=9105 SEQ=2060818330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7E5A30000000001030307) 
Nov 28 09:03:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34652 DF PROTO=TCP SPT=35246 DPT=9105 SEQ=2060818330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7E9BA0000000001030307) 
Nov 28 09:03:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17769 DF PROTO=TCP SPT=46804 DPT=9882 SEQ=2702682633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7EACC0000000001030307) 
Nov 28 09:04:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17771 DF PROTO=TCP SPT=46804 DPT=9882 SEQ=2702682633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB7F6BA0000000001030307) 
Nov 28 09:04:03 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:04:03 np0005538515.localdomain recover_tripleo_nova_virtqemud[104508]: 62642
Nov 28 09:04:03 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:04:03 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:04:04 np0005538515.localdomain sshd[104509]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:04:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34654 DF PROTO=TCP SPT=35246 DPT=9105 SEQ=2060818330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8017A0000000001030307) 
Nov 28 09:04:04 np0005538515.localdomain sshd[104509]: Accepted publickey for zuul from 192.168.122.30 port 35138 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:04:04 np0005538515.localdomain systemd-logind[763]: New session 36 of user zuul.
Nov 28 09:04:04 np0005538515.localdomain systemd[1]: Started Session 36 of User zuul.
Nov 28 09:04:04 np0005538515.localdomain sshd[104509]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:04:05 np0005538515.localdomain sudo[104602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asazfweukorhmzfhnnrcelnpispgbkmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320644.9251072-25-164883106678975/AnsiballZ_systemd_service.py
Nov 28 09:04:05 np0005538515.localdomain sudo[104602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:04:05 np0005538515.localdomain python3.9[104604]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:04:05 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:04:06 np0005538515.localdomain systemd-sysv-generator[104634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:04:06 np0005538515.localdomain systemd-rc-local-generator[104629]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:04:06 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:04:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:04:06 np0005538515.localdomain sudo[104602]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:06 np0005538515.localdomain podman[104642]: 2025-11-28 09:04:06.372031876 +0000 UTC m=+0.089387333 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:04:06 np0005538515.localdomain podman[104642]: 2025-11-28 09:04:06.41345616 +0000 UTC m=+0.130811657 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:04:06 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:04:06 np0005538515.localdomain podman[104641]: 2025-11-28 09:04:06.434437537 +0000 UTC m=+0.152273809 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, release=1761123044, tcib_managed=true)
Nov 28 09:04:06 np0005538515.localdomain podman[104641]: 2025-11-28 09:04:06.67040034 +0000 UTC m=+0.388236612 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:04:06 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:04:07 np0005538515.localdomain python3.9[104778]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:04:07 np0005538515.localdomain network[104795]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:04:07 np0005538515.localdomain network[104796]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:04:07 np0005538515.localdomain network[104797]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:04:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:04:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:04:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:04:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:04:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:04:09 np0005538515.localdomain podman[104814]: 2025-11-28 09:04:09.01039255 +0000 UTC m=+0.094847340 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 09:04:09 np0005538515.localdomain podman[104810]: 2025-11-28 09:04:09.063455594 +0000 UTC m=+0.156824288 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4)
Nov 28 09:04:09 np0005538515.localdomain podman[104810]: 2025-11-28 09:04:09.097807262 +0000 UTC m=+0.191175956 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack 
Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., container_name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:04:09 np0005538515.localdomain podman[104812]: 2025-11-28 09:04:09.178253348 +0000 UTC m=+0.270225519 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-type=git, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Nov 28 09:04:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60307 DF PROTO=TCP SPT=51290 DPT=9102 SEQ=1197907661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB812FA0000000001030307) 
Nov 28 09:04:09 np0005538515.localdomain podman[104812]: 2025-11-28 09:04:09.217616199 +0000 UTC m=+0.309588350 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-iscsid)
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:04:09 np0005538515.localdomain podman[104813]: 2025-11-28 09:04:09.237040177 +0000 UTC m=+0.320215307 container health_status d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, 
tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:04:09 np0005538515.localdomain podman[104813]: 2025-11-28 09:04:09.270773516 +0000 UTC m=+0.353948636 container exec_died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Deactivated successfully.
Nov 28 09:04:09 np0005538515.localdomain podman[104811]: 2025-11-28 09:04:09.276364918 +0000 UTC m=+0.369909908 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:04:09 np0005538515.localdomain podman[104811]: 2025-11-28 09:04:09.359712594 +0000 UTC m=+0.453257584 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com)
Nov 28 09:04:09 np0005538515.localdomain podman[104811]: unhealthy
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:04:09 np0005538515.localdomain podman[104814]: 2025-11-28 09:04:09.381356 +0000 UTC m=+0.465810780 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_migration_target, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12)
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:04:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:11 np0005538515.localdomain python3.9[105108]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:04:11 np0005538515.localdomain network[105125]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:04:11 np0005538515.localdomain network[105126]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:04:11 np0005538515.localdomain network[105127]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:04:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:04:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:04:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:04:12 np0005538515.localdomain podman[105133]: 2025-11-28 09:04:12.000616858 +0000 UTC m=+0.098023429 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Nov 28 09:04:12 np0005538515.localdomain podman[105134]: 2025-11-28 09:04:12.108852519 +0000 UTC m=+0.202729971 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 09:04:12 np0005538515.localdomain podman[105132]: 2025-11-28 09:04:12.070846319 +0000 UTC m=+0.168740535 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc.)
Nov 28 09:04:12 np0005538515.localdomain podman[105132]: 2025-11-28 09:04:12.15112249 +0000 UTC m=+0.249016736 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, distribution-scope=public)
Nov 28 09:04:12 np0005538515.localdomain podman[105132]: unhealthy
Nov 28 09:04:12 np0005538515.localdomain podman[105134]: 2025-11-28 09:04:12.174243842 +0000 UTC m=+0.268121234 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.openshift.expose-services=)
Nov 28 09:04:12 np0005538515.localdomain podman[105134]: unhealthy
Nov 28 09:04:12 np0005538515.localdomain podman[105133]: 2025-11-28 09:04:12.226010565 +0000 UTC m=+0.323417166 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Nov 28 09:04:12 np0005538515.localdomain podman[105133]: unhealthy
Nov 28 09:04:12 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:12 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:04:12 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:12 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:04:12 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:12 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:04:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34655 DF PROTO=TCP SPT=35246 DPT=9105 SEQ=2060818330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB820FA0000000001030307) 
Nov 28 09:04:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31820 DF PROTO=TCP SPT=38202 DPT=9101 SEQ=3180037704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB826A00000000001030307) 
Nov 28 09:04:15 np0005538515.localdomain sudo[105389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vftigjopsflakcnedylppkzyqicafeqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320654.9032903-116-28473397262476/AnsiballZ_systemd_service.py
Nov 28 09:04:15 np0005538515.localdomain sudo[105389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:04:15 np0005538515.localdomain python3.9[105391]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:04:15 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:04:15 np0005538515.localdomain systemd-sysv-generator[105423]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:04:15 np0005538515.localdomain systemd-rc-local-generator[105417]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:04:15 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:15 np0005538515.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Nov 28 09:04:16 np0005538515.localdomain systemd[1]: tmp-crun.xYoGUS.mount: Deactivated successfully.
Nov 28 09:04:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8058 DF PROTO=TCP SPT=60568 DPT=9100 SEQ=491117686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB82FBA0000000001030307) 
Nov 28 09:04:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8059 DF PROTO=TCP SPT=60568 DPT=9100 SEQ=491117686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB83F7A0000000001030307) 
Nov 28 09:04:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9562 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=1153921280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB85AD30000000001030307) 
Nov 28 09:04:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8060 DF PROTO=TCP SPT=60568 DPT=9100 SEQ=491117686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB85EFA0000000001030307) 
Nov 28 09:04:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9563 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=1153921280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB85EFA0000000001030307) 
Nov 28 09:04:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48828 DF PROTO=TCP SPT=55802 DPT=9105 SEQ=3116172900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB86AFB0000000001030307) 
Nov 28 09:04:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9565 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=1153921280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB876BA0000000001030307) 
Nov 28 09:04:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:04:36 np0005538515.localdomain podman[105447]: 2025-11-28 09:04:36.745038496 +0000 UTC m=+0.091731276 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, release=1761123044, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Nov 28 09:04:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:04:36 np0005538515.localdomain podman[105447]: 2025-11-28 09:04:36.784691677 +0000 UTC m=+0.131384426 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12)
Nov 28 09:04:36 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:04:36 np0005538515.localdomain podman[105465]: 2025-11-28 09:04:36.839553555 +0000 UTC m=+0.082283943 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible)
Nov 28 09:04:37 np0005538515.localdomain podman[105465]: 2025-11-28 09:04:37.096822685 +0000 UTC m=+0.339553113 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_id=tripleo_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc.)
Nov 28 09:04:37 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:04:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30847 DF PROTO=TCP SPT=56696 DPT=9102 SEQ=1278786119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB887FB0000000001030307) 
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:04:39 np0005538515.localdomain podman[105503]: 2025-11-28 09:04:39.495245225 +0000 UTC m=+0.091809057 container health_status 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=)
Nov 28 09:04:39 np0005538515.localdomain podman[105496]: 2025-11-28 09:04:39.542589962 +0000 UTC m=+0.145977404 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_id=tripleo_step3, tcib_managed=true, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Nov 28 09:04:39 np0005538515.localdomain podman[105496]: 2025-11-28 09:04:39.551973722 +0000 UTC m=+0.155361204 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, 
Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com)
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:04:39 np0005538515.localdomain podman[105503]: 2025-11-28 09:04:39.575216607 +0000 UTC m=+0.171780439 container exec_died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 09:04:39 np0005538515.localdomain podman[105503]: unhealthy
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:04:39 np0005538515.localdomain podman[105495]: 2025-11-28 09:04:39.591494868 +0000 UTC m=+0.198701958 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, 
build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:04:39 np0005538515.localdomain podman[105495]: 2025-11-28 09:04:39.623617846 +0000 UTC m=+0.230824956 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Nov 28 09:04:39 np0005538515.localdomain podman[105543]: 2025-11-28 09:04:39.637506304 +0000 UTC m=+0.133711307 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, version=17.1.12, vcs-type=git)
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:04:39 np0005538515.localdomain podman[105497]: Error: container d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff is not running
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:04:39 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Failed with result 'exit-code'.
Nov 28 09:04:40 np0005538515.localdomain podman[105543]: 2025-11-28 09:04:40.013593091 +0000 UTC m=+0.509798094 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target)
Nov 28 09:04:40 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:04:42 np0005538515.localdomain podman[105596]: 2025-11-28 09:04:42.709694035 +0000 UTC m=+0.066123647 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, container_name=nova_compute, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Nov 28 09:04:42 np0005538515.localdomain podman[105596]: 2025-11-28 09:04:42.730489734 +0000 UTC m=+0.086919386 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12)
Nov 28 09:04:42 np0005538515.localdomain podman[105596]: unhealthy
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:04:42 np0005538515.localdomain podman[105597]: 2025-11-28 09:04:42.787584902 +0000 UTC m=+0.137608317 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1761123044, io.buildah.version=1.41.4)
Nov 28 09:04:42 np0005538515.localdomain podman[105595]: 2025-11-28 09:04:42.827166461 +0000 UTC m=+0.185659317 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:04:42 np0005538515.localdomain podman[105595]: 2025-11-28 09:04:42.844416151 +0000 UTC m=+0.202909047 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_controller, vendor=Red Hat, Inc., 
name=rhosp17/openstack-ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git)
Nov 28 09:04:42 np0005538515.localdomain podman[105595]: unhealthy
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:04:42 np0005538515.localdomain podman[105597]: 2025-11-28 09:04:42.870613218 +0000 UTC m=+0.220636593 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public)
Nov 28 09:04:42 np0005538515.localdomain podman[105597]: unhealthy
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:04:42 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:04:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9566 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=1153921280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB896FA0000000001030307) 
Nov 28 09:04:43 np0005538515.localdomain sudo[105655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:04:43 np0005538515.localdomain sudo[105655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:04:43 np0005538515.localdomain sudo[105655]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:43 np0005538515.localdomain systemd[1]: tmp-crun.pX1ygc.mount: Deactivated successfully.
Nov 28 09:04:43 np0005538515.localdomain sudo[105670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:04:43 np0005538515.localdomain sudo[105670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:04:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44103 DF PROTO=TCP SPT=48844 DPT=9101 SEQ=2307672429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB89BD00000000001030307) 
Nov 28 09:04:44 np0005538515.localdomain sudo[105670]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:45 np0005538515.localdomain sudo[105716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:04:45 np0005538515.localdomain sudo[105716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:04:45 np0005538515.localdomain sudo[105716]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41000 DF PROTO=TCP SPT=33542 DPT=9100 SEQ=1773268596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8A4FA0000000001030307) 
Nov 28 09:04:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41001 DF PROTO=TCP SPT=33542 DPT=9100 SEQ=1773268596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8B4BA0000000001030307) 
Nov 28 09:04:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59803 DF PROTO=TCP SPT=54994 DPT=9105 SEQ=3520360375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8D0030000000001030307) 
Nov 28 09:04:58 np0005538515.localdomain podman[105432]: time="2025-11-28T09:04:58Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: libpod-d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.scope: Deactivated successfully.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: libpod-d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.scope: Consumed 5.798s CPU time.
Nov 28 09:04:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:33:a3:19 MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45014 SEQ=27833604 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Nov 28 09:04:58 np0005538515.localdomain podman[105432]: 2025-11-28 09:04:58.099346667 +0000 UTC m=+42.111713380 container died d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, release=1761123044)
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.timer: Deactivated successfully.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Failed to open /run/systemd/transient/d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: No such file or directory
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff-userdata-shm.mount: Deactivated successfully.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d613e9ce43651b4a22ba11f5bcafcb4dcc9b302834037925dc9c415fac8e707f-merged.mount: Deactivated successfully.
Nov 28 09:04:58 np0005538515.localdomain podman[105432]: 2025-11-28 09:04:58.157623001 +0000 UTC m=+42.169989664 container cleanup d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Nov 28 09:04:58 np0005538515.localdomain podman[105432]: ceilometer_agent_compute
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.timer: Failed to open /run/systemd/transient/d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.timer: No such file or directory
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Failed to open /run/systemd/transient/d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: No such file or directory
Nov 28 09:04:58 np0005538515.localdomain podman[105732]: 2025-11-28 09:04:58.187828701 +0000 UTC m=+0.080535850 container cleanup d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=)
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: libpod-conmon-d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.scope: Deactivated successfully.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.timer: Failed to open /run/systemd/transient/d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.timer: No such file or directory
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: Failed to open /run/systemd/transient/d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff.service: No such file or directory
Nov 28 09:04:58 np0005538515.localdomain podman[105748]: 2025-11-28 09:04:58.317764571 +0000 UTC m=+0.071013477 container cleanup d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team)
Nov 28 09:04:58 np0005538515.localdomain podman[105748]: ceilometer_agent_compute
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Nov 28 09:04:58 np0005538515.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.121s CPU time, no IO.
Nov 28 09:04:58 np0005538515.localdomain sudo[105389]: pam_unix(sudo:session): session closed for user root
Nov 28 09:04:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:33:a3:19 MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45014 SEQ=27833604 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Nov 28 09:04:58 np0005538515.localdomain sudo[105850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djxwfgymcogakyopiwnotusgfemifcdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320698.5040448-116-267519854265993/AnsiballZ_systemd_service.py
Nov 28 09:04:58 np0005538515.localdomain sudo[105850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:04:59 np0005538515.localdomain python3.9[105852]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:04:59 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:04:59 np0005538515.localdomain systemd-rc-local-generator[105878]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:04:59 np0005538515.localdomain systemd-sysv-generator[105883]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:04:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:04:59 np0005538515.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Nov 28 09:05:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6015 DF PROTO=TCP SPT=55908 DPT=9882 SEQ=1613149440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8E13A0000000001030307) 
Nov 28 09:05:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59806 DF PROTO=TCP SPT=54994 DPT=9105 SEQ=3520360375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8EBBB0000000001030307) 
Nov 28 09:05:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:05:06 np0005538515.localdomain podman[105906]: 2025-11-28 09:05:06.995819413 +0000 UTC m=+0.093927301 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, name=rhosp17/openstack-collectd)
Nov 28 09:05:07 np0005538515.localdomain podman[105906]: 2025-11-28 09:05:07.007183843 +0000 UTC m=+0.105291711 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible)
Nov 28 09:05:07 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:05:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:05:07 np0005538515.localdomain podman[105926]: 2025-11-28 09:05:07.997267581 +0000 UTC m=+0.095078448 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:05:08 np0005538515.localdomain podman[105926]: 2025-11-28 09:05:08.190954723 +0000 UTC m=+0.288765610 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:05:08 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:05:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37429 DF PROTO=TCP SPT=45106 DPT=9102 SEQ=56186255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB8FD3A0000000001030307) 
Nov 28 09:05:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:05:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:05:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:05:09 np0005538515.localdomain podman[105954]: 2025-11-28 09:05:09.973019748 +0000 UTC m=+0.081919322 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:05:10 np0005538515.localdomain podman[105954]: 2025-11-28 09:05:10.013457712 +0000 UTC m=+0.122357236 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, architecture=x86_64, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, container_name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 09:05:10 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:05:10 np0005538515.localdomain podman[105956]: 2025-11-28 09:05:10.038778852 +0000 UTC m=+0.143678264 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4)
Nov 28 09:05:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:05:10 np0005538515.localdomain podman[105956]: 2025-11-28 09:05:10.085848491 +0000 UTC m=+0.190747883 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 09:05:10 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:05:10 np0005538515.localdomain podman[105955]: Error: container 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 is not running
Nov 28 09:05:10 np0005538515.localdomain podman[106004]: 2025-11-28 09:05:10.159944002 +0000 UTC m=+0.088000020 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:05:10 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:05:10 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:05:10 np0005538515.localdomain podman[106004]: 2025-11-28 09:05:10.55135345 +0000 UTC m=+0.479409498 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Nov 28 09:05:10 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:05:11 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:33:a3:19 MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45014 SEQ=27833604 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Nov 28 09:05:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:05:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:05:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:05:12 np0005538515.localdomain podman[106028]: 2025-11-28 09:05:12.987996558 +0000 UTC m=+0.094581173 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-type=git)
Nov 28 09:05:13 np0005538515.localdomain podman[106028]: 2025-11-28 09:05:13.029442624 +0000 UTC m=+0.136027259 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git)
Nov 28 09:05:13 np0005538515.localdomain podman[106028]: unhealthy
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:05:13 np0005538515.localdomain podman[106029]: 2025-11-28 09:05:13.049111079 +0000 UTC m=+0.149797362 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, tcib_managed=true, version=17.1.12)
Nov 28 09:05:13 np0005538515.localdomain podman[106057]: 2025-11-28 09:05:13.101807352 +0000 UTC m=+0.102761235 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:05:13 np0005538515.localdomain podman[106057]: 2025-11-28 09:05:13.117724731 +0000 UTC m=+0.118678534 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 09:05:13 np0005538515.localdomain podman[106057]: unhealthy
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:05:13 np0005538515.localdomain podman[106029]: 2025-11-28 09:05:13.170609569 +0000 UTC m=+0.271295832 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Nov 28 09:05:13 np0005538515.localdomain podman[106029]: unhealthy
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:05:13 np0005538515.localdomain systemd[1]: tmp-crun.p6LUXg.mount: Deactivated successfully.
Nov 28 09:05:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6017 DF PROTO=TCP SPT=55908 DPT=9882 SEQ=1613149440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB910FA0000000001030307) 
Nov 28 09:05:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62494 DF PROTO=TCP SPT=54982 DPT=9100 SEQ=3662909776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB91A3A0000000001030307) 
Nov 28 09:05:17 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:05:17 np0005538515.localdomain recover_tripleo_nova_virtqemud[106091]: 62642
Nov 28 09:05:17 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:05:17 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:05:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62495 DF PROTO=TCP SPT=54982 DPT=9100 SEQ=3662909776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB929FA0000000001030307) 
Nov 28 09:05:25 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:33:a3:19 MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45014 SEQ=27833604 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Nov 28 09:05:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13697 DF PROTO=TCP SPT=50388 DPT=9105 SEQ=3033705835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB945340000000001030307) 
Nov 28 09:05:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13698 DF PROTO=TCP SPT=50388 DPT=9105 SEQ=3033705835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9493A0000000001030307) 
Nov 28 09:05:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9568 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=1153921280 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB954FA0000000001030307) 
Nov 28 09:05:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13700 DF PROTO=TCP SPT=50388 DPT=9105 SEQ=3033705835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB960FB0000000001030307) 
Nov 28 09:05:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:05:37 np0005538515.localdomain podman[106092]: 2025-11-28 09:05:37.475822204 +0000 UTC m=+0.083374666 container health_status cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Nov 28 09:05:37 np0005538515.localdomain podman[106092]: 2025-11-28 09:05:37.483280724 +0000 UTC m=+0.090833186 container exec_died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z)
Nov 28 09:05:37 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Deactivated successfully.
Nov 28 09:05:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:05:38 np0005538515.localdomain podman[106113]: 2025-11-28 09:05:38.973950301 +0000 UTC m=+0.076790704 container health_status 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true)
Nov 28 09:05:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47407 DF PROTO=TCP SPT=59842 DPT=9102 SEQ=3243152348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9727A0000000001030307) 
Nov 28 09:05:39 np0005538515.localdomain podman[106113]: 2025-11-28 09:05:39.185409391 +0000 UTC m=+0.288249754 container exec_died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 09:05:39 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Deactivated successfully.
Nov 28 09:05:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:05:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:05:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:05:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:05:40 np0005538515.localdomain podman[106142]: 2025-11-28 09:05:40.988650749 +0000 UTC m=+0.091293051 container health_status 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, io.openshift.expose-services=, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:05:41 np0005538515.localdomain podman[106143]: Error: container 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 is not running
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed with result 'exit-code'.
Nov 28 09:05:41 np0005538515.localdomain podman[106148]: 2025-11-28 09:05:41.050942627 +0000 UTC m=+0.141942190 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:05:41 np0005538515.localdomain podman[106142]: 2025-11-28 09:05:41.074461651 +0000 UTC m=+0.177103963 container exec_died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain podman[106144]: 2025-11-28 09:05:41.091860316 +0000 UTC m=+0.186003456 container health_status 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 09:05:41 np0005538515.localdomain podman[106144]: 2025-11-28 09:05:41.10334687 +0000 UTC m=+0.197489960 container exec_died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=)
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain podman[106148]: 2025-11-28 09:05:41.43081274 +0000 UTC m=+0.521812263 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain podman[105893]: time="2025-11-28T09:05:41Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: libpod-7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.scope: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: libpod-7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.scope: Consumed 6.400s CPU time.
Nov 28 09:05:41 np0005538515.localdomain podman[105893]: 2025-11-28 09:05:41.7446318 +0000 UTC m=+42.109114386 container died 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.timer: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed to open /run/systemd/transient/7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: No such file or directory
Nov 28 09:05:41 np0005538515.localdomain podman[105893]: 2025-11-28 09:05:41.799035026 +0000 UTC m=+42.163517582 container cleanup 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi)
Nov 28 09:05:41 np0005538515.localdomain podman[105893]: ceilometer_agent_ipmi
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.timer: Failed to open /run/systemd/transient/7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.timer: No such file or directory
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed to open /run/systemd/transient/7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: No such file or directory
Nov 28 09:05:41 np0005538515.localdomain podman[106214]: 2025-11-28 09:05:41.845451154 +0000 UTC m=+0.085822892 container cleanup 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi)
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: libpod-conmon-7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.scope: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.timer: Failed to open /run/systemd/transient/7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.timer: No such file or directory
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: Failed to open /run/systemd/transient/7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6.service: No such file or directory
Nov 28 09:05:41 np0005538515.localdomain podman[106230]: 2025-11-28 09:05:41.926244651 +0000 UTC m=+0.044064398 container cleanup 7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 28 09:05:41 np0005538515.localdomain podman[106230]: ceilometer_agent_ipmi
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Nov 28 09:05:41 np0005538515.localdomain sudo[105850]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: tmp-crun.QykCaf.mount: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b2363e42c8cc93f560c242c278a1b76f810df60301763e880790aefc5b17b52f-merged.mount: Deactivated successfully.
Nov 28 09:05:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d61a684c799c99e7df2b110e7b08d8c3b10a949b562f2d88d54d1eb7d8503e6-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:42 np0005538515.localdomain sudo[106332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbbvrdowizwifpcagipydjulftufrhlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320742.105521-116-127460147099437/AnsiballZ_systemd_service.py
Nov 28 09:05:42 np0005538515.localdomain sudo[106332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:42 np0005538515.localdomain python3.9[106334]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:42 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:05:42 np0005538515.localdomain systemd-rc-local-generator[106353]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:42 np0005538515.localdomain systemd-sysv-generator[106360]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13701 DF PROTO=TCP SPT=50388 DPT=9105 SEQ=3033705835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB980FA0000000001030307) 
Nov 28 09:05:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: Stopping collectd container...
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:05:43 np0005538515.localdomain podman[106398]: 2025-11-28 09:05:43.269033765 +0000 UTC m=+0.102833786 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Nov 28 09:05:43 np0005538515.localdomain podman[106373]: 2025-11-28 09:05:43.228137527 +0000 UTC m=+0.147256074 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:05:43 np0005538515.localdomain podman[106398]: 2025-11-28 09:05:43.284476201 +0000 UTC m=+0.118276222 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Nov 28 09:05:43 np0005538515.localdomain podman[106398]: unhealthy
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:05:43 np0005538515.localdomain podman[106373]: 2025-11-28 09:05:43.309433659 +0000 UTC m=+0.228552276 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 
ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64)
Nov 28 09:05:43 np0005538515.localdomain podman[106373]: unhealthy
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:05:43 np0005538515.localdomain podman[106417]: 2025-11-28 09:05:43.373404369 +0000 UTC m=+0.136861294 container health_status ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Nov 28 09:05:43 np0005538515.localdomain podman[106417]: 2025-11-28 09:05:43.395431296 +0000 UTC m=+0.158888241 container exec_died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, container_name=nova_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:05:43 np0005538515.localdomain podman[106417]: unhealthy
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:43 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: tmp-crun.Egtyr6.mount: Deactivated successfully.
Nov 28 09:05:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28554 DF PROTO=TCP SPT=45652 DPT=9101 SEQ=3547471805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB986300000000001030307) 
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: libpod-cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.scope: Deactivated successfully.
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: libpod-cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.scope: Consumed 2.115s CPU time.
Nov 28 09:05:44 np0005538515.localdomain podman[106375]: 2025-11-28 09:05:44.558803028 +0000 UTC m=+1.474140338 container stop cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z)
Nov 28 09:05:44 np0005538515.localdomain podman[106375]: 2025-11-28 09:05:44.595293401 +0000 UTC m=+1.510630731 container died cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.timer: Deactivated successfully.
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Failed to open /run/systemd/transient/cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: No such file or directory
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cb78a9787fbfdee8df647dff935d3e6e34a25076546a1ccbc8a68d8c48f6925c-merged.mount: Deactivated successfully.
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:44 np0005538515.localdomain podman[106375]: 2025-11-28 09:05:44.689120099 +0000 UTC m=+1.604457419 container cleanup cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:05:44 np0005538515.localdomain podman[106375]: collectd
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.timer: Failed to open /run/systemd/transient/cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.timer: No such file or directory
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Failed to open /run/systemd/transient/cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: No such file or directory
Nov 28 09:05:44 np0005538515.localdomain podman[106450]: 2025-11-28 09:05:44.709867698 +0000 UTC m=+0.138712611 container cleanup cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, 
managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: libpod-conmon-cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.scope: Deactivated successfully.
Nov 28 09:05:44 np0005538515.localdomain podman[106476]: error opening file `/run/crun/cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277/status`: No such file or directory
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.timer: Failed to open /run/systemd/transient/cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.timer: No such file or directory
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: Failed to open /run/systemd/transient/cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277.service: No such file or directory
Nov 28 09:05:44 np0005538515.localdomain podman[106465]: 2025-11-28 09:05:44.807015628 +0000 UTC m=+0.067411226 container cleanup cb823bb3ef2bd83aa7ef957f3aca91a84b39b4427f978ef32db4305d4edb2277 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:05:44 np0005538515.localdomain podman[106465]: collectd
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Nov 28 09:05:44 np0005538515.localdomain systemd[1]: Stopped collectd container.
Nov 28 09:05:44 np0005538515.localdomain sudo[106332]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:45 np0005538515.localdomain sudo[106567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gucccyvafluaevxgfjbztvghbynfyfep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320745.092885-116-115488085567837/AnsiballZ_systemd_service.py
Nov 28 09:05:45 np0005538515.localdomain sudo[106567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:45 np0005538515.localdomain sudo[106570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:05:45 np0005538515.localdomain sudo[106570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:05:45 np0005538515.localdomain sudo[106570]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:45 np0005538515.localdomain python3.9[106569]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:45 np0005538515.localdomain sudo[106585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:05:45 np0005538515.localdomain sudo[106585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:05:45 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:05:45 np0005538515.localdomain systemd-rc-local-generator[106627]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:45 np0005538515.localdomain systemd-sysv-generator[106630]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:45 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: Stopping iscsid container...
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: libpod-9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.scope: Deactivated successfully.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: libpod-9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.scope: Consumed 1.070s CPU time.
Nov 28 09:05:46 np0005538515.localdomain podman[106655]: 2025-11-28 09:05:46.206954552 +0000 UTC m=+0.095090398 container died 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true)
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.timer: Deactivated successfully.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Failed to open /run/systemd/transient/9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: No such file or directory
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-6e24aa22dcc3c3aeaf326f993725e399e8f3215a32c5fb5c28a2698bed898907-merged.mount: Deactivated successfully.
Nov 28 09:05:46 np0005538515.localdomain podman[106655]: 2025-11-28 09:05:46.263639347 +0000 UTC m=+0.151775103 container cleanup 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public)
Nov 28 09:05:46 np0005538515.localdomain podman[106655]: iscsid
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.timer: Failed to open /run/systemd/transient/9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.timer: No such file or directory
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Failed to open /run/systemd/transient/9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: No such file or directory
Nov 28 09:05:46 np0005538515.localdomain podman[106672]: 2025-11-28 09:05:46.290939847 +0000 UTC m=+0.074214985 container cleanup 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4)
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: libpod-conmon-9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.scope: Deactivated successfully.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.timer: Failed to open /run/systemd/transient/9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.timer: No such file or directory
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: Failed to open /run/systemd/transient/9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360.service: No such file or directory
Nov 28 09:05:46 np0005538515.localdomain podman[106686]: 2025-11-28 09:05:46.384484227 +0000 UTC m=+0.063285299 container cleanup 9be6f10185ad3e52f6217d1c4d2aaa05051206eedd7df0d205999efaa04c4360 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Nov 28 09:05:46 np0005538515.localdomain podman[106686]: iscsid
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Nov 28 09:05:46 np0005538515.localdomain systemd[1]: Stopped iscsid container.
Nov 28 09:05:46 np0005538515.localdomain sudo[106585]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:46 np0005538515.localdomain sudo[106567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15319 DF PROTO=TCP SPT=34424 DPT=9100 SEQ=1906070116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB98F7A0000000001030307) 
Nov 28 09:05:46 np0005538515.localdomain sudo[106800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuyubcwugkpepirfnljcnukvqhlovnie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320746.544816-116-39401019485320/AnsiballZ_systemd_service.py
Nov 28 09:05:46 np0005538515.localdomain sudo[106800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:47 np0005538515.localdomain sudo[106803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:05:47 np0005538515.localdomain sudo[106803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:05:47 np0005538515.localdomain sudo[106803]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:47 np0005538515.localdomain python3.9[106802]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:05:47 np0005538515.localdomain systemd-rc-local-generator[106841]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:47 np0005538515.localdomain systemd-sysv-generator[106845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: Stopping logrotate_crond container...
Nov 28 09:05:47 np0005538515.localdomain crond[71311]: (CRON) INFO (Shutting down)
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: libpod-719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.scope: Deactivated successfully.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: libpod-719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.scope: Consumed 1.014s CPU time.
Nov 28 09:05:47 np0005538515.localdomain podman[106858]: 2025-11-28 09:05:47.680316426 +0000 UTC m=+0.078063674 container died 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git)
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: tmp-crun.N0lFj8.mount: Deactivated successfully.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.timer: Deactivated successfully.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Failed to open /run/systemd/transient/719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: No such file or directory
Nov 28 09:05:47 np0005538515.localdomain podman[106858]: 2025-11-28 09:05:47.802169338 +0000 UTC m=+0.199916586 container cleanup 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Nov 28 09:05:47 np0005538515.localdomain podman[106858]: logrotate_crond
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.timer: Failed to open /run/systemd/transient/719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.timer: No such file or directory
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Failed to open /run/systemd/transient/719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: No such file or directory
Nov 28 09:05:47 np0005538515.localdomain podman[106871]: 2025-11-28 09:05:47.827234548 +0000 UTC m=+0.131213889 container cleanup 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: libpod-conmon-719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.scope: Deactivated successfully.
Nov 28 09:05:47 np0005538515.localdomain podman[106900]: error opening file `/run/crun/719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae/status`: No such file or directory
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.timer: Failed to open /run/systemd/transient/719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.timer: No such file or directory
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: Failed to open /run/systemd/transient/719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae.service: No such file or directory
Nov 28 09:05:47 np0005538515.localdomain podman[106889]: 2025-11-28 09:05:47.945165509 +0000 UTC m=+0.079142218 container cleanup 719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron)
Nov 28 09:05:47 np0005538515.localdomain podman[106889]: logrotate_crond
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Nov 28 09:05:47 np0005538515.localdomain systemd[1]: Stopped logrotate_crond container.
Nov 28 09:05:47 np0005538515.localdomain sudo[106800]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:48 np0005538515.localdomain sudo[106993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozxbjqggwirnoarldifdhergaawowrkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320748.106312-116-113768665166558/AnsiballZ_systemd_service.py
Nov 28 09:05:48 np0005538515.localdomain sudo[106993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:48 np0005538515.localdomain systemd[1]: tmp-crun.4gzU5W.mount: Deactivated successfully.
Nov 28 09:05:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-93bf0314bbd4063198be021c760bb47b8172c6cfa3163da2b90a6f202605824f-merged.mount: Deactivated successfully.
Nov 28 09:05:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-719548fc87f254c80adab6f249c889f61ec2ef0efd715b6e26934904f5914eae-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:48 np0005538515.localdomain python3.9[106995]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:48 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:05:48 np0005538515.localdomain systemd-rc-local-generator[107018]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:48 np0005538515.localdomain systemd-sysv-generator[107025]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: Stopping metrics_qdr container...
Nov 28 09:05:49 np0005538515.localdomain kernel: qdrouterd[55332]: segfault at 0 ip 00007f678c8617cb sp 00007ffdcb0a71a0 error 4 in libc.so.6[7f678c7fe000+175000]
Nov 28 09:05:49 np0005538515.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: Started Process Core Dump (PID 107050/UID 0).
Nov 28 09:05:49 np0005538515.localdomain systemd-coredump[107051]: Resource limits disable core dumping for process 55332 (qdrouterd).
Nov 28 09:05:49 np0005538515.localdomain systemd-coredump[107051]: Process 55332 (qdrouterd) of user 42465 dumped core.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: systemd-coredump@0-107050-0.service: Deactivated successfully.
Nov 28 09:05:49 np0005538515.localdomain podman[107036]: 2025-11-28 09:05:49.455906673 +0000 UTC m=+0.233047815 container died 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64)
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: libpod-9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.scope: Deactivated successfully.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: libpod-9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.scope: Consumed 28.091s CPU time.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.timer: Deactivated successfully.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Failed to open /run/systemd/transient/9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: No such file or directory
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74-userdata-shm.mount: Deactivated successfully.
Nov 28 09:05:49 np0005538515.localdomain podman[107036]: 2025-11-28 09:05:49.510280847 +0000 UTC m=+0.287421979 container cleanup 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Nov 28 09:05:49 np0005538515.localdomain podman[107036]: metrics_qdr
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.timer: Failed to open /run/systemd/transient/9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.timer: No such file or directory
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Failed to open /run/systemd/transient/9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: No such file or directory
Nov 28 09:05:49 np0005538515.localdomain podman[107055]: 2025-11-28 09:05:49.534501132 +0000 UTC m=+0.063872177 container cleanup 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: libpod-conmon-9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.scope: Deactivated successfully.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.timer: Failed to open /run/systemd/transient/9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.timer: No such file or directory
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: Failed to open /run/systemd/transient/9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74.service: No such file or directory
Nov 28 09:05:49 np0005538515.localdomain podman[107067]: 2025-11-28 09:05:49.643592931 +0000 UTC m=+0.074038261 container cleanup 9ebc6c348d37137661c5094c88cfc7a2b1d54aa7c60a526cd6427459d35a7b74 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6e6d33b0e4909c73f2f7adca3bc870a0'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 09:05:49 np0005538515.localdomain podman[107067]: metrics_qdr
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: Stopped metrics_qdr container.
Nov 28 09:05:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-876187b8bc68a02fc79261d7a49dfade5cc37ca730d23b4f758fcf788c522d06-merged.mount: Deactivated successfully.
Nov 28 09:05:49 np0005538515.localdomain sudo[106993]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:50 np0005538515.localdomain sudo[107168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhifegzeoienuevrqebnahkzscsbvclk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320749.807751-116-107966887027853/AnsiballZ_systemd_service.py
Nov 28 09:05:50 np0005538515.localdomain sudo[107168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:50 np0005538515.localdomain python3.9[107170]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:50 np0005538515.localdomain sudo[107168]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15320 DF PROTO=TCP SPT=34424 DPT=9100 SEQ=1906070116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB99F3A0000000001030307) 
Nov 28 09:05:50 np0005538515.localdomain sudo[107261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nicodoeahiwmxbxxsjndgmzjoqtuqxrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320750.56432-116-247496272160999/AnsiballZ_systemd_service.py
Nov 28 09:05:50 np0005538515.localdomain sudo[107261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:51 np0005538515.localdomain python3.9[107263]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:51 np0005538515.localdomain sudo[107261]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:51 np0005538515.localdomain sudo[107354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjfpmwlxejbsnfgypgjsiwhdzhfnwmip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320751.2776244-116-98793313766275/AnsiballZ_systemd_service.py
Nov 28 09:05:51 np0005538515.localdomain sudo[107354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:51 np0005538515.localdomain python3.9[107356]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:51 np0005538515.localdomain sudo[107354]: pam_unix(sudo:session): session closed for user root
Nov 28 09:05:52 np0005538515.localdomain sudo[107447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwesljpqvbhkizviyslagccnzbmghcxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320752.0382304-116-143468021233573/AnsiballZ_systemd_service.py
Nov 28 09:05:52 np0005538515.localdomain sudo[107447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:05:52 np0005538515.localdomain python3.9[107449]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:05:52 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:05:52 np0005538515.localdomain systemd-rc-local-generator[107478]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:05:52 np0005538515.localdomain systemd-sysv-generator[107481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:05:52 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:05:53 np0005538515.localdomain systemd[1]: Stopping nova_compute container...
Nov 28 09:05:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24302 DF PROTO=TCP SPT=33784 DPT=9105 SEQ=2010536769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9BA630000000001030307) 
Nov 28 09:05:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24303 DF PROTO=TCP SPT=33784 DPT=9105 SEQ=2010536769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9BE7B0000000001030307) 
Nov 28 09:05:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15321 DF PROTO=TCP SPT=34424 DPT=9100 SEQ=1906070116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9BEFA0000000001030307) 
Nov 28 09:06:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59809 DF PROTO=TCP SPT=54994 DPT=9105 SEQ=3520360375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9CAFA0000000001030307) 
Nov 28 09:06:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24305 DF PROTO=TCP SPT=33784 DPT=9105 SEQ=2010536769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9D63B0000000001030307) 
Nov 28 09:06:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25369 DF PROTO=TCP SPT=47772 DPT=9102 SEQ=3617329224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9E7BA0000000001030307) 
Nov 28 09:06:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:06:11 np0005538515.localdomain systemd[1]: tmp-crun.WTCMYG.mount: Deactivated successfully.
Nov 28 09:06:11 np0005538515.localdomain podman[107504]: 2025-11-28 09:06:11.729757801 +0000 UTC m=+0.085881445 container health_status e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z)
Nov 28 09:06:12 np0005538515.localdomain podman[107504]: 2025-11-28 09:06:12.128382781 +0000 UTC m=+0.484506405 container exec_died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git)
Nov 28 09:06:12 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Deactivated successfully.
Nov 28 09:06:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24306 DF PROTO=TCP SPT=33784 DPT=9105 SEQ=2010536769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9F6FA0000000001030307) 
Nov 28 09:06:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:06:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:06:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:06:13 np0005538515.localdomain systemd[1]: tmp-crun.v6CapV.mount: Deactivated successfully.
Nov 28 09:06:13 np0005538515.localdomain podman[107528]: 2025-11-28 09:06:13.960943571 +0000 UTC m=+0.068698355 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true)
Nov 28 09:06:14 np0005538515.localdomain podman[107528]: 2025-11-28 09:06:14.000669854 +0000 UTC m=+0.108424708 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 09:06:14 np0005538515.localdomain podman[107528]: unhealthy
Nov 28 09:06:14 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:14 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:06:14 np0005538515.localdomain podman[107529]: Error: container ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 is not running
Nov 28 09:06:14 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Main process exited, code=exited, status=125/n/a
Nov 28 09:06:14 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed with result 'exit-code'.
Nov 28 09:06:14 np0005538515.localdomain podman[107530]: 2025-11-28 09:06:14.00440911 +0000 UTC m=+0.105710536 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, architecture=x86_64, release=1761123044, vcs-type=git, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public)
Nov 28 09:06:14 np0005538515.localdomain podman[107530]: 2025-11-28 09:06:14.08630607 +0000 UTC m=+0.187607436 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z)
Nov 28 09:06:14 np0005538515.localdomain podman[107530]: unhealthy
Nov 28 09:06:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:06:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58646 DF PROTO=TCP SPT=47690 DPT=9882 SEQ=3581178474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AB9FAFB0000000001030307) 
Nov 28 09:06:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48906 DF PROTO=TCP SPT=33522 DPT=9100 SEQ=460410310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA04BB0000000001030307) 
Nov 28 09:06:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48907 DF PROTO=TCP SPT=33522 DPT=9100 SEQ=460410310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA147A0000000001030307) 
Nov 28 09:06:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16889 DF PROTO=TCP SPT=59482 DPT=9105 SEQ=3206880083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA2F920000000001030307) 
Nov 28 09:06:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16890 DF PROTO=TCP SPT=59482 DPT=9105 SEQ=3206880083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA33BA0000000001030307) 
Nov 28 09:06:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15052 DF PROTO=TCP SPT=38194 DPT=9882 SEQ=2964310619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA34BC0000000001030307) 
Nov 28 09:06:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13703 DF PROTO=TCP SPT=50388 DPT=9105 SEQ=3033705835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA3EFA0000000001030307) 
Nov 28 09:06:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16892 DF PROTO=TCP SPT=59482 DPT=9105 SEQ=3206880083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA4B7A0000000001030307) 
Nov 28 09:06:35 np0005538515.localdomain podman[107490]: time="2025-11-28T09:06:35Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: libpod-ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.scope: Deactivated successfully.
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: libpod-ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.scope: Consumed 26.850s CPU time.
Nov 28 09:06:35 np0005538515.localdomain podman[107490]: 2025-11-28 09:06:35.177276797 +0000 UTC m=+42.110968467 container died ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vendor=Red Hat, Inc., container_name=nova_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.timer: Deactivated successfully.
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed to open /run/systemd/transient/ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: No such file or directory
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5442f5016f7f6fcccf64f4788496955298bf5e3ac09b77bd37d30b08717aca4a-merged.mount: Deactivated successfully.
Nov 28 09:06:35 np0005538515.localdomain podman[107490]: 2025-11-28 09:06:35.232656122 +0000 UTC m=+42.166347762 container cleanup ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Nov 28 09:06:35 np0005538515.localdomain podman[107490]: nova_compute
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.timer: Failed to open /run/systemd/transient/ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.timer: No such file or directory
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed to open /run/systemd/transient/ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: No such file or directory
Nov 28 09:06:35 np0005538515.localdomain podman[107580]: 2025-11-28 09:06:35.310546849 +0000 UTC m=+0.125903007 container cleanup ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=nova_compute)
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: libpod-conmon-ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.scope: Deactivated successfully.
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.timer: Failed to open /run/systemd/transient/ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.timer: No such file or directory
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: Failed to open /run/systemd/transient/ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0.service: No such file or directory
Nov 28 09:06:35 np0005538515.localdomain podman[107596]: 2025-11-28 09:06:35.416497511 +0000 UTC m=+0.071485091 container cleanup ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1)
Nov 28 09:06:35 np0005538515.localdomain podman[107596]: nova_compute
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: Stopped nova_compute container.
Nov 28 09:06:35 np0005538515.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.146s CPU time, no IO.
Nov 28 09:06:35 np0005538515.localdomain sudo[107447]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:36 np0005538515.localdomain sudo[107696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehewetetczpcoicnyznsmqryhkvpvilh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320795.5912247-116-54204963399634/AnsiballZ_systemd_service.py
Nov 28 09:06:36 np0005538515.localdomain sudo[107696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:06:36 np0005538515.localdomain python3.9[107698]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:06:36 np0005538515.localdomain systemd-sysv-generator[107731]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:06:36 np0005538515.localdomain systemd-rc-local-generator[107725]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: Starting dnf makecache...
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: Stopping nova_migration_target container...
Nov 28 09:06:36 np0005538515.localdomain recover_tripleo_nova_virtqemud[107741]: 62642
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:06:36 np0005538515.localdomain sshd[71626]: Received signal 15; terminating.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: libpod-e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.scope: Deactivated successfully.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: libpod-e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.scope: Consumed 32.824s CPU time.
Nov 28 09:06:36 np0005538515.localdomain podman[107742]: 2025-11-28 09:06:36.858281483 +0000 UTC m=+0.078710824 container died e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.timer: Deactivated successfully.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Failed to open /run/systemd/transient/e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: No such file or directory
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860-userdata-shm.mount: Deactivated successfully.
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3850d276a9594c52a78e85d7b58db016dc835caf89f3a263b0f9d37a3754a60d-merged.mount: Deactivated successfully.
Nov 28 09:06:36 np0005538515.localdomain podman[107742]: 2025-11-28 09:06:36.911217382 +0000 UTC m=+0.131646693 container cleanup e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=nova_migration_target, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:06:36 np0005538515.localdomain podman[107742]: nova_migration_target
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.timer: Failed to open /run/systemd/transient/e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.timer: No such file or directory
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Failed to open /run/systemd/transient/e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: No such file or directory
Nov 28 09:06:36 np0005538515.localdomain podman[107756]: 2025-11-28 09:06:36.936322285 +0000 UTC m=+0.074130583 container cleanup e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, 
build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 09:06:36 np0005538515.localdomain systemd[1]: libpod-conmon-e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.scope: Deactivated successfully.
Nov 28 09:06:37 np0005538515.localdomain dnf[107739]: Updating Subscription Management repositories.
Nov 28 09:06:37 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.timer: Failed to open /run/systemd/transient/e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.timer: No such file or directory
Nov 28 09:06:37 np0005538515.localdomain systemd[1]: e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: Failed to open /run/systemd/transient/e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860.service: No such file or directory
Nov 28 09:06:37 np0005538515.localdomain podman[107771]: 2025-11-28 09:06:37.032925538 +0000 UTC m=+0.068027664 container cleanup e9b02b0c931bf0c269a9a5374b3e037ce986d87a003b12282cbd56e201e75860 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4)
Nov 28 09:06:37 np0005538515.localdomain podman[107771]: nova_migration_target
Nov 28 09:06:37 np0005538515.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Nov 28 09:06:37 np0005538515.localdomain systemd[1]: Stopped nova_migration_target container.
Nov 28 09:06:37 np0005538515.localdomain sudo[107696]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:37 np0005538515.localdomain sudo[107872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoknhcdujvwnbmvgpfjbsfwoioqjxrpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320797.2167833-116-60391377037250/AnsiballZ_systemd_service.py
Nov 28 09:06:37 np0005538515.localdomain sudo[107872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:06:37 np0005538515.localdomain python3.9[107874]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:06:37 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:06:37 np0005538515.localdomain systemd-rc-local-generator[107901]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:06:37 np0005538515.localdomain systemd-sysv-generator[107906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: tmp-crun.2abe5z.mount: Deactivated successfully.
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: libpod-2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711.scope: Deactivated successfully.
Nov 28 09:06:38 np0005538515.localdomain podman[107915]: 2025-11-28 09:06:38.251800248 +0000 UTC m=+0.086930707 container died 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 09:06:38 np0005538515.localdomain podman[107915]: 2025-11-28 09:06:38.29083181 +0000 UTC m=+0.125962269 container cleanup 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:06:38 np0005538515.localdomain podman[107915]: nova_virtlogd_wrapper
Nov 28 09:06:38 np0005538515.localdomain podman[107928]: 2025-11-28 09:06:38.323215737 +0000 UTC m=+0.066574840 container cleanup 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4)
Nov 28 09:06:38 np0005538515.localdomain dnf[107739]: Metadata cache refreshed recently.
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: Finished dnf makecache.
Nov 28 09:06:38 np0005538515.localdomain systemd[1]: dnf-makecache.service: Consumed 1.973s CPU time.
Nov 28 09:06:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56580 DF PROTO=TCP SPT=47098 DPT=9102 SEQ=1094284258 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA5CBA0000000001030307) 
Nov 28 09:06:39 np0005538515.localdomain systemd[1]: tmp-crun.8JZiqw.mount: Deactivated successfully.
Nov 28 09:06:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037-merged.mount: Deactivated successfully.
Nov 28 09:06:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711-userdata-shm.mount: Deactivated successfully.
Nov 28 09:06:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16893 DF PROTO=TCP SPT=59482 DPT=9105 SEQ=3206880083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA6AFB0000000001030307) 
Nov 28 09:06:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64054 DF PROTO=TCP SPT=41116 DPT=9101 SEQ=214251831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA70920000000001030307) 
Nov 28 09:06:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:06:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:06:44 np0005538515.localdomain podman[107944]: 2025-11-28 09:06:44.480731451 +0000 UTC m=+0.086209875 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:06:44 np0005538515.localdomain podman[107943]: 2025-11-28 09:06:44.531244486 +0000 UTC m=+0.135792551 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Nov 28 09:06:44 np0005538515.localdomain podman[107944]: 2025-11-28 09:06:44.554306656 +0000 UTC m=+0.159785090 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_id=tripleo_step4)
Nov 28 09:06:44 np0005538515.localdomain podman[107944]: unhealthy
Nov 28 09:06:44 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:44 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:06:44 np0005538515.localdomain podman[107943]: 2025-11-28 09:06:44.570768063 +0000 UTC m=+0.175316138 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Nov 28 09:06:44 np0005538515.localdomain podman[107943]: unhealthy
Nov 28 09:06:44 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:06:44 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:06:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12746 DF PROTO=TCP SPT=45140 DPT=9100 SEQ=3843044834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA79BA0000000001030307) 
Nov 28 09:06:47 np0005538515.localdomain sudo[107985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:06:47 np0005538515.localdomain sudo[107985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:06:47 np0005538515.localdomain sudo[107985]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:47 np0005538515.localdomain sudo[108000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:06:47 np0005538515.localdomain sudo[108000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:06:48 np0005538515.localdomain sudo[108000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:48 np0005538515.localdomain sudo[108046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:06:48 np0005538515.localdomain sudo[108046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:06:48 np0005538515.localdomain sudo[108046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:06:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12747 DF PROTO=TCP SPT=45140 DPT=9100 SEQ=3843044834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABA897B0000000001030307) 
Nov 28 09:06:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14296 DF PROTO=TCP SPT=44144 DPT=9105 SEQ=3700151614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAA4C30000000001030307) 
Nov 28 09:06:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14297 DF PROTO=TCP SPT=44144 DPT=9105 SEQ=3700151614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAA8BA0000000001030307) 
Nov 28 09:06:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12748 DF PROTO=TCP SPT=45140 DPT=9100 SEQ=3843044834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAA8FA0000000001030307) 
Nov 28 09:07:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24308 DF PROTO=TCP SPT=33784 DPT=9105 SEQ=2010536769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAB4FB0000000001030307) 
Nov 28 09:07:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14299 DF PROTO=TCP SPT=44144 DPT=9105 SEQ=3700151614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAC07B0000000001030307) 
Nov 28 09:07:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56522 DF PROTO=TCP SPT=49498 DPT=9102 SEQ=237804585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAD1FA0000000001030307) 
Nov 28 09:07:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14300 DF PROTO=TCP SPT=44144 DPT=9105 SEQ=3700151614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAE0FA0000000001030307) 
Nov 28 09:07:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15402 DF PROTO=TCP SPT=55336 DPT=9101 SEQ=43871835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAE5C00000000001030307) 
Nov 28 09:07:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:07:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:07:14 np0005538515.localdomain podman[108062]: 2025-11-28 09:07:14.738174294 +0000 UTC m=+0.084407830 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:07:14 np0005538515.localdomain podman[108061]: 2025-11-28 09:07:14.786606885 +0000 UTC m=+0.134428149 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, vcs-type=git, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, release=1761123044)
Nov 28 09:07:14 np0005538515.localdomain podman[108062]: 2025-11-28 09:07:14.805655141 +0000 UTC m=+0.151888627 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 28 09:07:14 np0005538515.localdomain podman[108062]: unhealthy
Nov 28 09:07:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:07:14 np0005538515.localdomain podman[108061]: 2025-11-28 09:07:14.826941626 +0000 UTC m=+0.174762860 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_controller)
Nov 28 09:07:14 np0005538515.localdomain podman[108061]: unhealthy
Nov 28 09:07:14 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:14 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:07:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24417 DF PROTO=TCP SPT=39450 DPT=9100 SEQ=2313255533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAEEFA0000000001030307) 
Nov 28 09:07:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24418 DF PROTO=TCP SPT=39450 DPT=9100 SEQ=2313255533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABAFEBB0000000001030307) 
Nov 28 09:07:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6112 DF PROTO=TCP SPT=60306 DPT=9105 SEQ=3988060708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB19F30000000001030307) 
Nov 28 09:07:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6113 DF PROTO=TCP SPT=60306 DPT=9105 SEQ=3988060708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB1DFB0000000001030307) 
Nov 28 09:07:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24419 DF PROTO=TCP SPT=39450 DPT=9100 SEQ=2313255533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB1EFA0000000001030307) 
Nov 28 09:07:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22552 DF PROTO=TCP SPT=46434 DPT=9882 SEQ=2138006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB2B3A0000000001030307) 
Nov 28 09:07:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6115 DF PROTO=TCP SPT=60306 DPT=9105 SEQ=3988060708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB35BA0000000001030307) 
Nov 28 09:07:37 np0005538515.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Nov 28 09:07:37 np0005538515.localdomain recover_tripleo_nova_virtqemud[108101]: 62642
Nov 28 09:07:37 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Nov 28 09:07:37 np0005538515.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Nov 28 09:07:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65460 DF PROTO=TCP SPT=39772 DPT=9102 SEQ=941991457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB473A0000000001030307) 
Nov 28 09:07:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6116 DF PROTO=TCP SPT=60306 DPT=9105 SEQ=3988060708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB56FB0000000001030307) 
Nov 28 09:07:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29840 DF PROTO=TCP SPT=52336 DPT=9101 SEQ=2319000963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB5AF00000000001030307) 
Nov 28 09:07:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:07:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:07:44 np0005538515.localdomain podman[108102]: 2025-11-28 09:07:44.97040713 +0000 UTC m=+0.078785516 container health_status 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ovn_controller, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container)
Nov 28 09:07:44 np0005538515.localdomain podman[108102]: 2025-11-28 09:07:44.988447346 +0000 UTC m=+0.096825732 container exec_died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true)
Nov 28 09:07:45 np0005538515.localdomain podman[108103]: 2025-11-28 09:07:45.026773066 +0000 UTC m=+0.134283635 container health_status e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 28 09:07:45 np0005538515.localdomain podman[108102]: unhealthy
Nov 28 09:07:45 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:45 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed with result 'exit-code'.
Nov 28 09:07:45 np0005538515.localdomain podman[108103]: 2025-11-28 09:07:45.065680622 +0000 UTC m=+0.173191151 container exec_died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:07:45 np0005538515.localdomain podman[108103]: unhealthy
Nov 28 09:07:45 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:07:45 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed with result 'exit-code'.
Nov 28 09:07:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11948 DF PROTO=TCP SPT=34136 DPT=9100 SEQ=2955239952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB643B0000000001030307) 
Nov 28 09:07:48 np0005538515.localdomain sudo[108144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:07:48 np0005538515.localdomain sudo[108144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:07:48 np0005538515.localdomain sudo[108144]: pam_unix(sudo:session): session closed for user root
Nov 28 09:07:48 np0005538515.localdomain sudo[108159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:07:48 np0005538515.localdomain sudo[108159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:07:49 np0005538515.localdomain sudo[108159]: pam_unix(sudo:session): session closed for user root
Nov 28 09:07:50 np0005538515.localdomain sudo[108206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:07:50 np0005538515.localdomain sudo[108206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:07:50 np0005538515.localdomain sudo[108206]: pam_unix(sudo:session): session closed for user root
Nov 28 09:07:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11949 DF PROTO=TCP SPT=34136 DPT=9100 SEQ=2955239952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB73FB0000000001030307) 
Nov 28 09:07:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35544 DF PROTO=TCP SPT=46700 DPT=9105 SEQ=4195846488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB8F230000000001030307) 
Nov 28 09:07:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35545 DF PROTO=TCP SPT=46700 DPT=9105 SEQ=4195846488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB933A0000000001030307) 
Nov 28 09:07:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31862 DF PROTO=TCP SPT=41944 DPT=9882 SEQ=2607740531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB944D0000000001030307) 
Nov 28 09:08:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14302 DF PROTO=TCP SPT=44144 DPT=9105 SEQ=3700151614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABB9EFA0000000001030307) 
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61828 (conmon) with signal SIGKILL.
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: libpod-conmon-2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711.scope: Deactivated successfully.
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: tmp-crun.xt1OnO.mount: Deactivated successfully.
Nov 28 09:08:02 np0005538515.localdomain podman[108234]: error opening file `/run/crun/2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711/status`: No such file or directory
Nov 28 09:08:02 np0005538515.localdomain podman[108221]: 2025-11-28 09:08:02.468875289 +0000 UTC m=+0.073675159 container cleanup 2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-type=git, container_name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:08:02 np0005538515.localdomain podman[108221]: nova_virtlogd_wrapper
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Nov 28 09:08:02 np0005538515.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Nov 28 09:08:02 np0005538515.localdomain sudo[107872]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:02 np0005538515.localdomain sudo[108325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxvvctctvibqbvfyhcoqgwdsikyrfciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320882.6321936-116-131449327871821/AnsiballZ_systemd_service.py
Nov 28 09:08:02 np0005538515.localdomain sudo[108325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:03 np0005538515.localdomain python3.9[108327]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:03 np0005538515.localdomain systemd-sysv-generator[108360]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:03 np0005538515.localdomain systemd-rc-local-generator[108353]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: libpod-490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436.scope: Deactivated successfully.
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: libpod-490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436.scope: Consumed 1.310s CPU time.
Nov 28 09:08:03 np0005538515.localdomain podman[108368]: 2025-11-28 09:08:03.707893649 +0000 UTC m=+0.074440722 container died 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, container_name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044)
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: tmp-crun.s0WZ3V.mount: Deactivated successfully.
Nov 28 09:08:03 np0005538515.localdomain podman[108368]: 2025-11-28 09:08:03.752091289 +0000 UTC m=+0.118638362 container cleanup 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_virtnodedevd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:03 np0005538515.localdomain podman[108368]: nova_virtnodedevd
Nov 28 09:08:03 np0005538515.localdomain podman[108382]: 2025-11-28 09:08:03.792040789 +0000 UTC m=+0.073329358 container cleanup 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.openshift.expose-services=, container_name=nova_virtnodedevd, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, vcs-type=git, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: libpod-conmon-490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436.scope: Deactivated successfully.
Nov 28 09:08:03 np0005538515.localdomain podman[108411]: error opening file `/run/crun/490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436/status`: No such file or directory
Nov 28 09:08:03 np0005538515.localdomain podman[108399]: 2025-11-28 09:08:03.897255948 +0000 UTC m=+0.073276836 container cleanup 490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, container_name=nova_virtnodedevd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:03 np0005538515.localdomain podman[108399]: nova_virtnodedevd
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Nov 28 09:08:03 np0005538515.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Nov 28 09:08:03 np0005538515.localdomain sudo[108325]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:04 np0005538515.localdomain sudo[108502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chzhwhetzodubcosxodhhljrqwfbbrvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320884.0880618-116-237499375198805/AnsiballZ_systemd_service.py
Nov 28 09:08:04 np0005538515.localdomain sudo[108502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:04 np0005538515.localdomain python3.9[108504]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0-merged.mount: Deactivated successfully.
Nov 28 09:08:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-490141dc0beecfbdec2cf756928e0dd5b717de05c10e967326da43f7b52be436-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:04 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35547 DF PROTO=TCP SPT=46700 DPT=9105 SEQ=4195846488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABBAAFB0000000001030307) 
Nov 28 09:08:04 np0005538515.localdomain systemd-sysv-generator[108536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:04 np0005538515.localdomain systemd-rc-local-generator[108531]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:04 np0005538515.localdomain systemd[1]: Stopping nova_virtproxyd container...
Nov 28 09:08:05 np0005538515.localdomain systemd[1]: libpod-7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657.scope: Deactivated successfully.
Nov 28 09:08:05 np0005538515.localdomain podman[108545]: 2025-11-28 09:08:05.033205995 +0000 UTC m=+0.057024656 container died 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtproxyd, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 09:08:05 np0005538515.localdomain podman[108545]: 2025-11-28 09:08:05.069023848 +0000 UTC m=+0.092842439 container cleanup 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Nov 28 09:08:05 np0005538515.localdomain podman[108545]: nova_virtproxyd
Nov 28 09:08:05 np0005538515.localdomain podman[108560]: 2025-11-28 09:08:05.111774523 +0000 UTC m=+0.063487354 container cleanup 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1)
Nov 28 09:08:05 np0005538515.localdomain systemd[1]: libpod-conmon-7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657.scope: Deactivated successfully.
Nov 28 09:08:05 np0005538515.localdomain podman[108586]: error opening file `/run/crun/7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657/status`: No such file or directory
Nov 28 09:08:05 np0005538515.localdomain podman[108575]: 2025-11-28 09:08:05.20912505 +0000 UTC m=+0.069946254 container cleanup 7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtproxyd, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Nov 28 09:08:05 np0005538515.localdomain podman[108575]: nova_virtproxyd
Nov 28 09:08:05 np0005538515.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Nov 28 09:08:05 np0005538515.localdomain systemd[1]: Stopped nova_virtproxyd container.
Nov 28 09:08:05 np0005538515.localdomain sudo[108502]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:05 np0005538515.localdomain sudo[108677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omgsyyqzrmtudnsjjzwbnmtbqbvknpvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320885.3826668-116-99045846633913/AnsiballZ_systemd_service.py
Nov 28 09:08:05 np0005538515.localdomain sudo[108677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8-merged.mount: Deactivated successfully.
Nov 28 09:08:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f1efe5480b4850e72969a411413c723808a2e9f2a72da0ab9b5bc407d874657-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:05 np0005538515.localdomain python3.9[108679]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:06 np0005538515.localdomain systemd-rc-local-generator[108702]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:06 np0005538515.localdomain systemd-sysv-generator[108707]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: Stopping nova_virtqemud container...
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: libpod-929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432.scope: Deactivated successfully.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: libpod-929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432.scope: Consumed 1.944s CPU time.
Nov 28 09:08:06 np0005538515.localdomain podman[108720]: 2025-11-28 09:08:06.438231666 +0000 UTC m=+0.082306435 container died 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, release=1761123044, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4)
Nov 28 09:08:06 np0005538515.localdomain podman[108720]: 2025-11-28 09:08:06.472724317 +0000 UTC m=+0.116799086 container cleanup 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, container_name=nova_virtqemud, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:06 np0005538515.localdomain podman[108720]: nova_virtqemud
Nov 28 09:08:06 np0005538515.localdomain podman[108734]: 2025-11-28 09:08:06.524409358 +0000 UTC m=+0.070308945 container cleanup 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: libpod-conmon-929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432.scope: Deactivated successfully.
Nov 28 09:08:06 np0005538515.localdomain podman[108764]: error opening file `/run/crun/929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432/status`: No such file or directory
Nov 28 09:08:06 np0005538515.localdomain podman[108751]: 2025-11-28 09:08:06.635185958 +0000 UTC m=+0.069382997 container cleanup 929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtqemud, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:08:06 np0005538515.localdomain podman[108751]: nova_virtqemud
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: tripleo_nova_virtqemud.service: Deactivated successfully.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: Stopped nova_virtqemud container.
Nov 28 09:08:06 np0005538515.localdomain sudo[108677]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073-merged.mount: Deactivated successfully.
Nov 28 09:08:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:07 np0005538515.localdomain sudo[108855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msicroaiewpxunrrbhjefyccydsucknp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320886.8241746-116-118671138725828/AnsiballZ_systemd_service.py
Nov 28 09:08:07 np0005538515.localdomain sudo[108855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:07 np0005538515.localdomain python3.9[108857]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:07 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:07 np0005538515.localdomain systemd-rc-local-generator[108883]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:07 np0005538515.localdomain systemd-sysv-generator[108889]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:07 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:07 np0005538515.localdomain sudo[108855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:08 np0005538515.localdomain sudo[108984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkaujshxivanjeqjsmzvsxckaeiiouba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320887.983225-116-197179823847892/AnsiballZ_systemd_service.py
Nov 28 09:08:08 np0005538515.localdomain sudo[108984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:08 np0005538515.localdomain python3.9[108986]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20004 DF PROTO=TCP SPT=53812 DPT=9102 SEQ=1366834653 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABBBC7A0000000001030307) 
Nov 28 09:08:09 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:09 np0005538515.localdomain systemd-sysv-generator[109016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:09 np0005538515.localdomain systemd-rc-local-generator[109013]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:10 np0005538515.localdomain systemd[1]: Stopping nova_virtsecretd container...
Nov 28 09:08:10 np0005538515.localdomain systemd[1]: libpod-c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808.scope: Deactivated successfully.
Nov 28 09:08:10 np0005538515.localdomain podman[109027]: 2025-11-28 09:08:10.111133277 +0000 UTC m=+0.077145786 container died c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtsecretd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, batch=17.1_20251118.1)
Nov 28 09:08:10 np0005538515.localdomain podman[109027]: 2025-11-28 09:08:10.147055103 +0000 UTC m=+0.113067562 container cleanup c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtsecretd, release=1761123044, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3)
Nov 28 09:08:10 np0005538515.localdomain podman[109027]: nova_virtsecretd
Nov 28 09:08:10 np0005538515.localdomain podman[109041]: 2025-11-28 09:08:10.193929305 +0000 UTC m=+0.069683375 container cleanup c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, maintainer=OpenStack TripleO Team)
Nov 28 09:08:10 np0005538515.localdomain systemd[1]: libpod-conmon-c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808.scope: Deactivated successfully.
Nov 28 09:08:10 np0005538515.localdomain podman[109069]: error opening file `/run/crun/c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808/status`: No such file or directory
Nov 28 09:08:10 np0005538515.localdomain podman[109058]: 2025-11-28 09:08:10.297684499 +0000 UTC m=+0.073157343 container cleanup c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, batch=17.1_20251118.1, container_name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:10 np0005538515.localdomain podman[109058]: nova_virtsecretd
Nov 28 09:08:10 np0005538515.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Nov 28 09:08:10 np0005538515.localdomain systemd[1]: Stopped nova_virtsecretd container.
Nov 28 09:08:10 np0005538515.localdomain sudo[108984]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:10 np0005538515.localdomain sudo[109161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjzwrsluhhhtopxqvsqnzfgadycrhsig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320890.4732182-116-61542008363751/AnsiballZ_systemd_service.py
Nov 28 09:08:10 np0005538515.localdomain sudo[109161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:11 np0005538515.localdomain python3.9[109163]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21-merged.mount: Deactivated successfully.
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c819cf470c2869c75c471bbedd276e4a2f4c93050051a8f401cabeeedb4a8808-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:11 np0005538515.localdomain systemd-rc-local-generator[109187]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:11 np0005538515.localdomain systemd-sysv-generator[109193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: Stopping nova_virtstoraged container...
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: libpod-77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0.scope: Deactivated successfully.
Nov 28 09:08:11 np0005538515.localdomain podman[109203]: 2025-11-28 09:08:11.55392514 +0000 UTC m=+0.083742939 container died 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:08:11 np0005538515.localdomain podman[109203]: 2025-11-28 09:08:11.59424306 +0000 UTC m=+0.124060789 container cleanup 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, container_name=nova_virtstoraged, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 28 09:08:11 np0005538515.localdomain podman[109203]: nova_virtstoraged
Nov 28 09:08:11 np0005538515.localdomain podman[109218]: 2025-11-28 09:08:11.638698879 +0000 UTC m=+0.072571935 container cleanup 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtstoraged, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, release=1761123044, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: libpod-conmon-77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0.scope: Deactivated successfully.
Nov 28 09:08:11 np0005538515.localdomain podman[109245]: error opening file `/run/crun/77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0/status`: No such file or directory
Nov 28 09:08:11 np0005538515.localdomain podman[109234]: 2025-11-28 09:08:11.73811812 +0000 UTC m=+0.067214840 container cleanup 77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, container_name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'bbb5ea37891e3118676a78b59837de90'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, release=1761123044, vcs-type=git, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team)
Nov 28 09:08:11 np0005538515.localdomain podman[109234]: nova_virtstoraged
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Nov 28 09:08:11 np0005538515.localdomain systemd[1]: Stopped nova_virtstoraged container.
Nov 28 09:08:11 np0005538515.localdomain sudo[109161]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a-merged.mount: Deactivated successfully.
Nov 28 09:08:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77e9d72df8f79ee50de7116306a3a6d3da17ccdfda2a4c48233804c3562cc2f0-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:12 np0005538515.localdomain sudo[109338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tshdbeflthngswyabctdtdvmmgrtcqle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320891.919008-116-111319827789585/AnsiballZ_systemd_service.py
Nov 28 09:08:12 np0005538515.localdomain sudo[109338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:12 np0005538515.localdomain python3.9[109340]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:12 np0005538515.localdomain systemd-rc-local-generator[109367]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:12 np0005538515.localdomain systemd-sysv-generator[109370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35548 DF PROTO=TCP SPT=46700 DPT=9105 SEQ=4195846488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABBCAFA0000000001030307) 
Nov 28 09:08:12 np0005538515.localdomain systemd[1]: Stopping ovn_controller container...
Nov 28 09:08:12 np0005538515.localdomain systemd[1]: tmp-crun.teiCMy.mount: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: libpod-9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.scope: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: libpod-9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.scope: Consumed 2.490s CPU time.
Nov 28 09:08:13 np0005538515.localdomain podman[109381]: 2025-11-28 09:08:13.009590378 +0000 UTC m=+0.084236154 container died 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, 
build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com)
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.timer: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed to open /run/systemd/transient/9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: No such file or directory
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: tmp-crun.qR2apN.mount: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain podman[109381]: 2025-11-28 09:08:13.057277597 +0000 UTC m=+0.131923343 container cleanup 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Nov 28 09:08:13 np0005538515.localdomain podman[109381]: ovn_controller
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.timer: Failed to open /run/systemd/transient/9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.timer: No such file or directory
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed to open /run/systemd/transient/9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: No such file or directory
Nov 28 09:08:13 np0005538515.localdomain podman[109393]: 2025-11-28 09:08:13.095009598 +0000 UTC m=+0.073956967 container cleanup 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_controller, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c6bc8e2b5666799e64c84f093eb3569ddc3bccd8602a09788ea75d9b81e61916-merged.mount: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: libpod-conmon-9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.scope: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.timer: Failed to open /run/systemd/transient/9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.timer: No such file or directory
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: Failed to open /run/systemd/transient/9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164.service: No such file or directory
Nov 28 09:08:13 np0005538515.localdomain podman[109409]: 2025-11-28 09:08:13.207539032 +0000 UTC m=+0.077116694 container cleanup 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, release=1761123044, 
container_name=ovn_controller, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:08:13 np0005538515.localdomain podman[109409]: ovn_controller
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Nov 28 09:08:13 np0005538515.localdomain systemd[1]: Stopped ovn_controller container.
Nov 28 09:08:13 np0005538515.localdomain sudo[109338]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:13 np0005538515.localdomain sudo[109509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aexblcxobkjczkkjxhuynwxkzwlgdkvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320893.3948562-116-70829188724393/AnsiballZ_systemd_service.py
Nov 28 09:08:13 np0005538515.localdomain sudo[109509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:13 np0005538515.localdomain python3.9[109511]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:14 np0005538515.localdomain systemd-sysv-generator[109539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:14 np0005538515.localdomain systemd-rc-local-generator[109536]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6287 DF PROTO=TCP SPT=42114 DPT=9101 SEQ=1254892130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABBD0200000000001030307) 
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: tmp-crun.UZgQuR.mount: Deactivated successfully.
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: libpod-e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.scope: Deactivated successfully.
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: libpod-e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.scope: Consumed 9.033s CPU time.
Nov 28 09:08:14 np0005538515.localdomain podman[109552]: 2025-11-28 09:08:14.570497178 +0000 UTC m=+0.200048220 container died e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.timer: Deactivated successfully.
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed to open /run/systemd/transient/e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: No such file or directory
Nov 28 09:08:14 np0005538515.localdomain podman[109552]: 2025-11-28 09:08:14.634502838 +0000 UTC m=+0.264053860 container cleanup e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Nov 28 09:08:14 np0005538515.localdomain podman[109552]: ovn_metadata_agent
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.timer: Failed to open /run/systemd/transient/e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.timer: No such file or directory
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed to open /run/systemd/transient/e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: No such file or directory
Nov 28 09:08:14 np0005538515.localdomain podman[109565]: 2025-11-28 09:08:14.70702093 +0000 UTC m=+0.125769702 container cleanup e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: libpod-conmon-e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.scope: Deactivated successfully.
Nov 28 09:08:14 np0005538515.localdomain podman[109595]: error opening file `/run/crun/e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47/status`: No such file or directory
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.timer: Failed to open /run/systemd/transient/e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.timer: No such file or directory
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: Failed to open /run/systemd/transient/e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47.service: No such file or directory
Nov 28 09:08:14 np0005538515.localdomain podman[109581]: 2025-11-28 09:08:14.804620364 +0000 UTC m=+0.066228739 container cleanup e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Nov 28 09:08:14 np0005538515.localdomain podman[109581]: ovn_metadata_agent
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Deactivated successfully.
Nov 28 09:08:14 np0005538515.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Nov 28 09:08:14 np0005538515.localdomain sudo[109509]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:15 np0005538515.localdomain sudo[109686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvawzgomiolcwwjxxineaqhzshydgxir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764320894.9924295-116-47419593031348/AnsiballZ_systemd_service.py
Nov 28 09:08:15 np0005538515.localdomain sudo[109686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:08:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-22314ee7dcc5723035b6772f98d17adedfb1f7b03c71f0801082e550913dd450-merged.mount: Deactivated successfully.
Nov 28 09:08:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47-userdata-shm.mount: Deactivated successfully.
Nov 28 09:08:15 np0005538515.localdomain python3.9[109688]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:08:15 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:08:15 np0005538515.localdomain systemd-sysv-generator[109720]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:08:15 np0005538515.localdomain systemd-rc-local-generator[109717]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:08:15 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:08:16 np0005538515.localdomain sudo[109686]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44402 DF PROTO=TCP SPT=41132 DPT=9100 SEQ=3174964367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABBD97A0000000001030307) 
Nov 28 09:08:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44403 DF PROTO=TCP SPT=41132 DPT=9100 SEQ=3174964367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABBE93A0000000001030307) 
Nov 28 09:08:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13782 DF PROTO=TCP SPT=55652 DPT=9105 SEQ=883028529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC04530000000001030307) 
Nov 28 09:08:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13783 DF PROTO=TCP SPT=55652 DPT=9105 SEQ=883028529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC087A0000000001030307) 
Nov 28 09:08:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44404 DF PROTO=TCP SPT=41132 DPT=9100 SEQ=3174964367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC08FA0000000001030307) 
Nov 28 09:08:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6118 DF PROTO=TCP SPT=60306 DPT=9105 SEQ=3988060708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC14FA0000000001030307) 
Nov 28 09:08:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:08:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:08:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13785 DF PROTO=TCP SPT=55652 DPT=9105 SEQ=883028529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC203A0000000001030307) 
Nov 28 09:08:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:08:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:08:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57015 DF PROTO=TCP SPT=40258 DPT=9102 SEQ=523244208 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC317A0000000001030307) 
Nov 28 09:08:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13786 DF PROTO=TCP SPT=55652 DPT=9105 SEQ=883028529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC40FA0000000001030307) 
Nov 28 09:08:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50403 DF PROTO=TCP SPT=52180 DPT=9882 SEQ=1087689752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC44FA0000000001030307) 
Nov 28 09:08:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33203 DF PROTO=TCP SPT=46266 DPT=9100 SEQ=3793430538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC4E7A0000000001030307) 
Nov 28 09:08:50 np0005538515.localdomain sudo[109741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:08:50 np0005538515.localdomain sudo[109741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:08:50 np0005538515.localdomain sudo[109741]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:50 np0005538515.localdomain sudo[109756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:08:50 np0005538515.localdomain sudo[109756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:08:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33204 DF PROTO=TCP SPT=46266 DPT=9100 SEQ=3793430538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC5E3A0000000001030307) 
Nov 28 09:08:51 np0005538515.localdomain sudo[109756]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:51 np0005538515.localdomain sudo[109803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:08:51 np0005538515.localdomain sudo[109803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:08:51 np0005538515.localdomain sudo[109803]: pam_unix(sudo:session): session closed for user root
Nov 28 09:08:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42554 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2266543500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC79840000000001030307) 
Nov 28 09:08:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42555 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2266543500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC7D7A0000000001030307) 
Nov 28 09:08:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9945 DF PROTO=TCP SPT=55470 DPT=9882 SEQ=1427205240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC7EAC0000000001030307) 
Nov 28 09:09:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35550 DF PROTO=TCP SPT=46700 DPT=9105 SEQ=4195846488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC88FA0000000001030307) 
Nov 28 09:09:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42557 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2266543500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABC953A0000000001030307) 
Nov 28 09:09:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2686 DF PROTO=TCP SPT=45890 DPT=9102 SEQ=3024629957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCA6BA0000000001030307) 
Nov 28 09:09:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42558 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2266543500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCB4FA0000000001030307) 
Nov 28 09:09:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48602 DF PROTO=TCP SPT=43578 DPT=9101 SEQ=2593526270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCBA800000000001030307) 
Nov 28 09:09:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48963 DF PROTO=TCP SPT=52360 DPT=9100 SEQ=2930900476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCC3BA0000000001030307) 
Nov 28 09:09:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48964 DF PROTO=TCP SPT=52360 DPT=9100 SEQ=2930900476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCD37A0000000001030307) 
Nov 28 09:09:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38153 DF PROTO=TCP SPT=37750 DPT=9105 SEQ=465103086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCEEB30000000001030307) 
Nov 28 09:09:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38154 DF PROTO=TCP SPT=37750 DPT=9105 SEQ=465103086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCF2BA0000000001030307) 
Nov 28 09:09:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48965 DF PROTO=TCP SPT=52360 DPT=9100 SEQ=2930900476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCF2FA0000000001030307) 
Nov 28 09:09:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13788 DF PROTO=TCP SPT=55652 DPT=9105 SEQ=883028529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABCFEFA0000000001030307) 
Nov 28 09:09:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38156 DF PROTO=TCP SPT=37750 DPT=9105 SEQ=465103086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD0A7A0000000001030307) 
Nov 28 09:09:36 np0005538515.localdomain sshd[104512]: Received disconnect from 192.168.122.30 port 35138:11: disconnected by user
Nov 28 09:09:36 np0005538515.localdomain sshd[104512]: Disconnected from user zuul 192.168.122.30 port 35138
Nov 28 09:09:36 np0005538515.localdomain sshd[104509]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:09:36 np0005538515.localdomain systemd[1]: session-36.scope: Deactivated successfully.
Nov 28 09:09:36 np0005538515.localdomain systemd[1]: session-36.scope: Consumed 19.462s CPU time.
Nov 28 09:09:36 np0005538515.localdomain systemd-logind[763]: Session 36 logged out. Waiting for processes to exit.
Nov 28 09:09:36 np0005538515.localdomain systemd-logind[763]: Removed session 36.
Nov 28 09:09:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52547 DF PROTO=TCP SPT=55104 DPT=9102 SEQ=2348563081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD1BFA0000000001030307) 
Nov 28 09:09:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38157 DF PROTO=TCP SPT=37750 DPT=9105 SEQ=465103086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD2AFA0000000001030307) 
Nov 28 09:09:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61712 DF PROTO=TCP SPT=35592 DPT=9101 SEQ=999991009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD2FB00000000001030307) 
Nov 28 09:09:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27524 DF PROTO=TCP SPT=35812 DPT=9100 SEQ=629490546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD38FA0000000001030307) 
Nov 28 09:09:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27525 DF PROTO=TCP SPT=35812 DPT=9100 SEQ=629490546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD48BA0000000001030307) 
Nov 28 09:09:52 np0005538515.localdomain sudo[109819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:09:52 np0005538515.localdomain sudo[109819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:09:52 np0005538515.localdomain sudo[109819]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:52 np0005538515.localdomain sudo[109834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:09:52 np0005538515.localdomain sudo[109834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:09:52 np0005538515.localdomain sudo[109834]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:55 np0005538515.localdomain sudo[109880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:09:55 np0005538515.localdomain sudo[109880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:09:55 np0005538515.localdomain sudo[109880]: pam_unix(sudo:session): session closed for user root
Nov 28 09:09:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19934 DF PROTO=TCP SPT=53582 DPT=9105 SEQ=2412963223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD63E30000000001030307) 
Nov 28 09:09:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19935 DF PROTO=TCP SPT=53582 DPT=9105 SEQ=2412963223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD67FA0000000001030307) 
Nov 28 09:09:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27526 DF PROTO=TCP SPT=35812 DPT=9100 SEQ=629490546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD68FA0000000001030307) 
Nov 28 09:10:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13119 DF PROTO=TCP SPT=51294 DPT=9882 SEQ=686327318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD74FA0000000001030307) 
Nov 28 09:10:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19937 DF PROTO=TCP SPT=53582 DPT=9105 SEQ=2412963223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD7FBA0000000001030307) 
Nov 28 09:10:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45504 DF PROTO=TCP SPT=39436 DPT=9102 SEQ=1530707447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABD913B0000000001030307) 
Nov 28 09:10:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19938 DF PROTO=TCP SPT=53582 DPT=9105 SEQ=2412963223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDA0FB0000000001030307) 
Nov 28 09:10:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38641 DF PROTO=TCP SPT=33346 DPT=9101 SEQ=4112440622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDA4E10000000001030307) 
Nov 28 09:10:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4648 DF PROTO=TCP SPT=60218 DPT=9100 SEQ=1035888636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDAE3A0000000001030307) 
Nov 28 09:10:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4649 DF PROTO=TCP SPT=60218 DPT=9100 SEQ=1035888636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDBDFA0000000001030307) 
Nov 28 09:10:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46143 DF PROTO=TCP SPT=48230 DPT=9105 SEQ=1998963562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDD9130000000001030307) 
Nov 28 09:10:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46144 DF PROTO=TCP SPT=48230 DPT=9105 SEQ=1998963562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDDD3A0000000001030307) 
Nov 28 09:10:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64749 DF PROTO=TCP SPT=49020 DPT=9882 SEQ=1698743421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDDE3C0000000001030307) 
Nov 28 09:10:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38159 DF PROTO=TCP SPT=37750 DPT=9105 SEQ=465103086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDE8FA0000000001030307) 
Nov 28 09:10:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46146 DF PROTO=TCP SPT=48230 DPT=9105 SEQ=1998963562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABDF4FB0000000001030307) 
Nov 28 09:10:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61708 DF PROTO=TCP SPT=47744 DPT=9102 SEQ=4236463055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE063A0000000001030307) 
Nov 28 09:10:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46147 DF PROTO=TCP SPT=48230 DPT=9105 SEQ=1998963562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE14FA0000000001030307) 
Nov 28 09:10:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39815 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=3358262386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE1A100000000001030307) 
Nov 28 09:10:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2441 DF PROTO=TCP SPT=37152 DPT=9100 SEQ=1364534858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE233A0000000001030307) 
Nov 28 09:10:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2442 DF PROTO=TCP SPT=37152 DPT=9100 SEQ=1364534858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE32FA0000000001030307) 
Nov 28 09:10:55 np0005538515.localdomain sudo[109895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:10:55 np0005538515.localdomain sudo[109895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:55 np0005538515.localdomain sudo[109895]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:55 np0005538515.localdomain sudo[109910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:10:55 np0005538515.localdomain sudo[109910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:56 np0005538515.localdomain sudo[109910]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:56 np0005538515.localdomain sudo[109946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:10:56 np0005538515.localdomain sudo[109946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:56 np0005538515.localdomain sudo[109946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:56 np0005538515.localdomain sudo[109961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:10:56 np0005538515.localdomain sudo[109961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:57 np0005538515.localdomain sudo[109961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=617 DF PROTO=TCP SPT=45096 DPT=9105 SEQ=2771617679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE4E420000000001030307) 
Nov 28 09:10:57 np0005538515.localdomain sudo[110009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:10:57 np0005538515.localdomain sudo[110009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:10:57 np0005538515.localdomain sudo[110009]: pam_unix(sudo:session): session closed for user root
Nov 28 09:10:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=618 DF PROTO=TCP SPT=45096 DPT=9105 SEQ=2771617679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE523B0000000001030307) 
Nov 28 09:10:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2443 DF PROTO=TCP SPT=37152 DPT=9100 SEQ=1364534858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE52FA0000000001030307) 
Nov 28 09:11:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19940 DF PROTO=TCP SPT=53582 DPT=9105 SEQ=2412963223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE5EFA0000000001030307) 
Nov 28 09:11:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=620 DF PROTO=TCP SPT=45096 DPT=9105 SEQ=2771617679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE69FA0000000001030307) 
Nov 28 09:11:06 np0005538515.localdomain sshd[110024]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:11:06 np0005538515.localdomain sshd[110024]: Accepted publickey for zuul from 192.168.122.30 port 57724 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:11:06 np0005538515.localdomain systemd-logind[763]: New session 37 of user zuul.
Nov 28 09:11:06 np0005538515.localdomain systemd[1]: Started Session 37 of User zuul.
Nov 28 09:11:06 np0005538515.localdomain sshd[110024]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:11:07 np0005538515.localdomain sudo[110103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzqcoxxkeydoeacxisnyrqwntvpkvckq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321066.7483723-565-257556560209096/AnsiballZ_file.py
Nov 28 09:11:07 np0005538515.localdomain sudo[110103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:07 np0005538515.localdomain python3.9[110105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:07 np0005538515.localdomain sudo[110103]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:07 np0005538515.localdomain sudo[110195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzlaoglimaigycewruculxgnlyjyrakl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321067.4058626-565-273296303141735/AnsiballZ_file.py
Nov 28 09:11:07 np0005538515.localdomain sudo[110195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:07 np0005538515.localdomain python3.9[110197]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:07 np0005538515.localdomain sudo[110195]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:08 np0005538515.localdomain sudo[110287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jveuumriegmlatapzevemusgzzpldryd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321068.047251-565-201759020967476/AnsiballZ_file.py
Nov 28 09:11:08 np0005538515.localdomain sudo[110287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:08 np0005538515.localdomain python3.9[110289]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:08 np0005538515.localdomain sudo[110287]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:08 np0005538515.localdomain sudo[110379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfhhmozjzqvryxbkmtcrwxnfpcxkqyvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321068.7125013-565-226315425711373/AnsiballZ_file.py
Nov 28 09:11:08 np0005538515.localdomain sudo[110379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2944 DF PROTO=TCP SPT=46602 DPT=9102 SEQ=287814670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE7B7A0000000001030307) 
Nov 28 09:11:09 np0005538515.localdomain python3.9[110381]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:09 np0005538515.localdomain sudo[110379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:09 np0005538515.localdomain sudo[110471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jorzjgjirrxazirlcqoafpnwfkmvvcax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321069.3508546-565-215723951641807/AnsiballZ_file.py
Nov 28 09:11:09 np0005538515.localdomain sudo[110471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:09 np0005538515.localdomain python3.9[110473]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:09 np0005538515.localdomain sudo[110471]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:10 np0005538515.localdomain sudo[110563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsvsyvvtfcqsuozoxieiilfwfvxlzpzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321069.9890225-565-276440730503352/AnsiballZ_file.py
Nov 28 09:11:10 np0005538515.localdomain sudo[110563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:10 np0005538515.localdomain python3.9[110565]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:10 np0005538515.localdomain sudo[110563]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:10 np0005538515.localdomain sudo[110655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boxhkicfvdjupczlbmdizonmcifzjfcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321070.586552-565-47352634795647/AnsiballZ_file.py
Nov 28 09:11:10 np0005538515.localdomain sudo[110655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:11 np0005538515.localdomain python3.9[110657]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:11 np0005538515.localdomain sudo[110655]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:11 np0005538515.localdomain sudo[110747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apgcaknrtihhpfcgbbjachcowidxwobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321071.141992-565-196542415638270/AnsiballZ_file.py
Nov 28 09:11:11 np0005538515.localdomain sudo[110747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:11 np0005538515.localdomain python3.9[110749]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:11 np0005538515.localdomain sudo[110747]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:11 np0005538515.localdomain sudo[110839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bffosklpqihxoliyddctvagrjvwpnxaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321071.7279181-565-126750747050476/AnsiballZ_file.py
Nov 28 09:11:11 np0005538515.localdomain sudo[110839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:12 np0005538515.localdomain python3.9[110841]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:12 np0005538515.localdomain sudo[110839]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:12 np0005538515.localdomain sudo[110931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhahukolaekepisgpwglyxxsekwjzlza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321072.231161-565-152301554155482/AnsiballZ_file.py
Nov 28 09:11:12 np0005538515.localdomain sudo[110931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:12 np0005538515.localdomain python3.9[110933]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:12 np0005538515.localdomain sudo[110931]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=621 DF PROTO=TCP SPT=45096 DPT=9105 SEQ=2771617679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE8AFB0000000001030307) 
Nov 28 09:11:13 np0005538515.localdomain sudo[111023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnttatnqckuescbgmluuqkumrggbdcfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321072.8382902-565-165826894027889/AnsiballZ_file.py
Nov 28 09:11:13 np0005538515.localdomain sudo[111023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:13 np0005538515.localdomain python3.9[111025]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:13 np0005538515.localdomain sudo[111023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:13 np0005538515.localdomain sudo[111115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjgzwxbnronjrzwmbrzdueqnmsxepkkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321073.460167-565-126227120605152/AnsiballZ_file.py
Nov 28 09:11:13 np0005538515.localdomain sudo[111115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:13 np0005538515.localdomain python3.9[111117]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:13 np0005538515.localdomain sudo[111115]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55985 DF PROTO=TCP SPT=51270 DPT=9882 SEQ=2418090807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE8EFB0000000001030307) 
Nov 28 09:11:14 np0005538515.localdomain sudo[111207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwzpegkaktjmzvthouqgxihoyiwepxwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321074.0923045-565-123719869754956/AnsiballZ_file.py
Nov 28 09:11:14 np0005538515.localdomain sudo[111207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:14 np0005538515.localdomain python3.9[111209]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:14 np0005538515.localdomain sudo[111207]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:14 np0005538515.localdomain sudo[111299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftmrceywdhqbmegbbkfcnnswednbgwbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321074.665603-565-55151351657989/AnsiballZ_file.py
Nov 28 09:11:14 np0005538515.localdomain sudo[111299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:15 np0005538515.localdomain python3.9[111301]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:15 np0005538515.localdomain sudo[111299]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:15 np0005538515.localdomain sudo[111391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnxujqqyosytjjxhuokmspfzsabuupup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321075.2629561-565-229711488948991/AnsiballZ_file.py
Nov 28 09:11:15 np0005538515.localdomain sudo[111391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:15 np0005538515.localdomain python3.9[111393]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:15 np0005538515.localdomain sudo[111391]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:16 np0005538515.localdomain sudo[111483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oetqivruvfqaaofxovglvzrchllqyevp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321075.8546026-565-102841908166713/AnsiballZ_file.py
Nov 28 09:11:16 np0005538515.localdomain sudo[111483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:16 np0005538515.localdomain python3.9[111485]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:16 np0005538515.localdomain sudo[111483]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15889 DF PROTO=TCP SPT=56990 DPT=9100 SEQ=2037800138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABE987B0000000001030307) 
Nov 28 09:11:16 np0005538515.localdomain sudo[111575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvzkrggkrpleecuzmseqvfjhwgenwhyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321076.479674-565-4366402335854/AnsiballZ_file.py
Nov 28 09:11:16 np0005538515.localdomain sudo[111575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:16 np0005538515.localdomain python3.9[111577]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:16 np0005538515.localdomain sudo[111575]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:17 np0005538515.localdomain sudo[111667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imbhseshqvortshtxlbmkegremckmlhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321077.1111617-565-64793031276340/AnsiballZ_file.py
Nov 28 09:11:17 np0005538515.localdomain sudo[111667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:17 np0005538515.localdomain python3.9[111669]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:17 np0005538515.localdomain sudo[111667]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:18 np0005538515.localdomain sudo[111759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwwnrdwaexcgpveuqsjnmdlgwdcdldin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321077.7576938-565-230264639780402/AnsiballZ_file.py
Nov 28 09:11:18 np0005538515.localdomain sudo[111759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:18 np0005538515.localdomain python3.9[111761]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:18 np0005538515.localdomain sudo[111759]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:18 np0005538515.localdomain sudo[111851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qymxignucsdxjinlarhbmasesrztmqjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321078.3988135-565-30713445400083/AnsiballZ_file.py
Nov 28 09:11:18 np0005538515.localdomain sudo[111851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:18 np0005538515.localdomain python3.9[111853]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:18 np0005538515.localdomain sudo[111851]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:19 np0005538515.localdomain sudo[111943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aofhlzrgayzlqjattcxsxictckuafkfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321079.0213535-565-223556917262972/AnsiballZ_file.py
Nov 28 09:11:19 np0005538515.localdomain sudo[111943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:19 np0005538515.localdomain python3.9[111945]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:19 np0005538515.localdomain sudo[111943]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:20 np0005538515.localdomain sudo[112035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pidntsjnoiwsekwmhebnibxsxmxodnhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321080.1383822-1016-100053380747281/AnsiballZ_file.py
Nov 28 09:11:20 np0005538515.localdomain sudo[112035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15890 DF PROTO=TCP SPT=56990 DPT=9100 SEQ=2037800138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEA83A0000000001030307) 
Nov 28 09:11:20 np0005538515.localdomain python3.9[112037]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:20 np0005538515.localdomain sudo[112035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:21 np0005538515.localdomain sudo[112127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evqmhznsggnlqfmmelimzrcfqtkzchth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321080.777133-1016-131314603172746/AnsiballZ_file.py
Nov 28 09:11:21 np0005538515.localdomain sudo[112127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:21 np0005538515.localdomain python3.9[112129]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:21 np0005538515.localdomain sudo[112127]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:21 np0005538515.localdomain sudo[112219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgqgdltvwlacofkieipypnexfgwewfmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321081.3797207-1016-245589623966314/AnsiballZ_file.py
Nov 28 09:11:21 np0005538515.localdomain sudo[112219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:21 np0005538515.localdomain python3.9[112221]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:21 np0005538515.localdomain sudo[112219]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:22 np0005538515.localdomain sudo[112311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvllcpbskpvugwnfilxdxfhqnztxbdkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321081.983181-1016-218664471310144/AnsiballZ_file.py
Nov 28 09:11:22 np0005538515.localdomain sudo[112311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:22 np0005538515.localdomain python3.9[112313]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:22 np0005538515.localdomain sudo[112311]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:22 np0005538515.localdomain sudo[112403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvigvrtdixvxzctlfcecjioormtjunrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321082.5701866-1016-206638910964670/AnsiballZ_file.py
Nov 28 09:11:22 np0005538515.localdomain sudo[112403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:22 np0005538515.localdomain python3.9[112405]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:22 np0005538515.localdomain sudo[112403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:23 np0005538515.localdomain sudo[112495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihdnvdlratmphhqoaevlxleobbrfdegj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321083.1277232-1016-92973690004864/AnsiballZ_file.py
Nov 28 09:11:23 np0005538515.localdomain sudo[112495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:23 np0005538515.localdomain python3.9[112497]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:23 np0005538515.localdomain sudo[112495]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:24 np0005538515.localdomain sudo[112587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufajomvjyederpgqdodlbgocqvsjfjjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321083.767708-1016-157015289921443/AnsiballZ_file.py
Nov 28 09:11:24 np0005538515.localdomain sudo[112587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:24 np0005538515.localdomain python3.9[112589]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:24 np0005538515.localdomain sudo[112587]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:24 np0005538515.localdomain sudo[112679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzzqqjqldwofzzptvtmjknjrhitmakry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321084.591302-1016-179954690802947/AnsiballZ_file.py
Nov 28 09:11:24 np0005538515.localdomain sudo[112679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:25 np0005538515.localdomain python3.9[112681]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:25 np0005538515.localdomain sudo[112679]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:25 np0005538515.localdomain sudo[112771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmbjjyuvjmlsuqdzfrspqongdkoaajvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321085.1622062-1016-117139967753307/AnsiballZ_file.py
Nov 28 09:11:25 np0005538515.localdomain sudo[112771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:25 np0005538515.localdomain python3.9[112773]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:25 np0005538515.localdomain sudo[112771]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:26 np0005538515.localdomain sudo[112863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hffledkstrzodgatcusqfwusnoedngpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321085.761024-1016-53436093638627/AnsiballZ_file.py
Nov 28 09:11:26 np0005538515.localdomain sudo[112863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:26 np0005538515.localdomain python3.9[112865]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:26 np0005538515.localdomain sudo[112863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:26 np0005538515.localdomain sudo[112955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhlozimotwalowtwiecmufzhzvkygech ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321086.3450155-1016-245256203232658/AnsiballZ_file.py
Nov 28 09:11:26 np0005538515.localdomain sudo[112955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:26 np0005538515.localdomain python3.9[112957]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:26 np0005538515.localdomain sudo[112955]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:27 np0005538515.localdomain sudo[113047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmvrhabuieipzfznenysgmjvxxcsgxud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321086.9380217-1016-55850384093305/AnsiballZ_file.py
Nov 28 09:11:27 np0005538515.localdomain sudo[113047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:27 np0005538515.localdomain python3.9[113049]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:27 np0005538515.localdomain sudo[113047]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30540 DF PROTO=TCP SPT=40126 DPT=9105 SEQ=4094490606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEC3730000000001030307) 
Nov 28 09:11:27 np0005538515.localdomain sudo[113139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbnisettlaofanqrduhwlpvlvkdvgqtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321087.5730603-1016-31343858818060/AnsiballZ_file.py
Nov 28 09:11:27 np0005538515.localdomain sudo[113139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:28 np0005538515.localdomain python3.9[113141]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:28 np0005538515.localdomain sudo[113139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:28 np0005538515.localdomain sudo[113231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swlsissprymsvkievuyntcgirpthvyxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321088.1650426-1016-207002441363276/AnsiballZ_file.py
Nov 28 09:11:28 np0005538515.localdomain sudo[113231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30541 DF PROTO=TCP SPT=40126 DPT=9105 SEQ=4094490606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEC77B0000000001030307) 
Nov 28 09:11:28 np0005538515.localdomain python3.9[113233]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:28 np0005538515.localdomain sudo[113231]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13890 DF PROTO=TCP SPT=37348 DPT=9882 SEQ=2422586960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEC89E0000000001030307) 
Nov 28 09:11:29 np0005538515.localdomain sudo[113323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcecsicvivjzcdycltgmqktywgeihhhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321088.7694094-1016-75114758846587/AnsiballZ_file.py
Nov 28 09:11:29 np0005538515.localdomain sudo[113323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:29 np0005538515.localdomain python3.9[113325]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:29 np0005538515.localdomain sudo[113323]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:29 np0005538515.localdomain sudo[113415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upglyzcmgzzdpsctmiexuuqdmmqlzvjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321089.3847735-1016-29166569063234/AnsiballZ_file.py
Nov 28 09:11:29 np0005538515.localdomain sudo[113415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:29 np0005538515.localdomain python3.9[113417]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:29 np0005538515.localdomain sudo[113415]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:30 np0005538515.localdomain sudo[113507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfrypparadfdwhbyxhwmjpnoshaabpux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321089.9917183-1016-82489749570694/AnsiballZ_file.py
Nov 28 09:11:30 np0005538515.localdomain sudo[113507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:30 np0005538515.localdomain python3.9[113509]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:30 np0005538515.localdomain sudo[113507]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:30 np0005538515.localdomain sudo[113599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcafthoxlozccqayoqqgxehzhdbndrfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321090.592457-1016-46149295919577/AnsiballZ_file.py
Nov 28 09:11:30 np0005538515.localdomain sudo[113599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:31 np0005538515.localdomain python3.9[113601]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:31 np0005538515.localdomain sudo[113599]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:31 np0005538515.localdomain sudo[113691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpwkpoigxytxtzazhgfqxbzlkbeqfwen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321091.1970778-1016-166757916770079/AnsiballZ_file.py
Nov 28 09:11:31 np0005538515.localdomain sudo[113691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46149 DF PROTO=TCP SPT=48230 DPT=9105 SEQ=1998963562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABED2FA0000000001030307) 
Nov 28 09:11:31 np0005538515.localdomain python3.9[113693]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:31 np0005538515.localdomain sudo[113691]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:32 np0005538515.localdomain sudo[113783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luhoiwuavlxdjggjbpwwhmbhwvnmkhvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321091.7800937-1016-189127668790720/AnsiballZ_file.py
Nov 28 09:11:32 np0005538515.localdomain sudo[113783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:32 np0005538515.localdomain python3.9[113785]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:32 np0005538515.localdomain sudo[113783]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:32 np0005538515.localdomain sudo[113875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pssfrrogqsduaadevapfyoqnqpwiuiid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321092.3419924-1016-158461634714005/AnsiballZ_file.py
Nov 28 09:11:32 np0005538515.localdomain sudo[113875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:32 np0005538515.localdomain python3.9[113877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:11:32 np0005538515.localdomain sudo[113875]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:33 np0005538515.localdomain sudo[113967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uepmbgflvxlpeztvimxenoewijuuvjub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321093.6140459-1463-126077790235051/AnsiballZ_command.py
Nov 28 09:11:33 np0005538515.localdomain sudo[113967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:34 np0005538515.localdomain python3.9[113969]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:34 np0005538515.localdomain sudo[113967]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30543 DF PROTO=TCP SPT=40126 DPT=9105 SEQ=4094490606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEDF3A0000000001030307) 
Nov 28 09:11:35 np0005538515.localdomain python3.9[114061]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:11:35 np0005538515.localdomain sudo[114151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geesefhnthriwlturvrzlhlinmtusarp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321095.383175-1517-93569985273785/AnsiballZ_systemd_service.py
Nov 28 09:11:35 np0005538515.localdomain sudo[114151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:35 np0005538515.localdomain python3.9[114153]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:11:35 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:11:36 np0005538515.localdomain systemd-rc-local-generator[114180]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:11:36 np0005538515.localdomain systemd-sysv-generator[114184]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:11:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:11:36 np0005538515.localdomain sudo[114151]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:36 np0005538515.localdomain sudo[114279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqfcegaviccowezvwefjmplwbueascgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321096.4745245-1541-182279624316189/AnsiballZ_command.py
Nov 28 09:11:36 np0005538515.localdomain sudo[114279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:36 np0005538515.localdomain python3.9[114281]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:37 np0005538515.localdomain sudo[114279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:38 np0005538515.localdomain sudo[114372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugtperuvqjmzoavvlvvhdvxsurnokpoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321098.1231852-1541-62353764981883/AnsiballZ_command.py
Nov 28 09:11:38 np0005538515.localdomain sudo[114372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:38 np0005538515.localdomain python3.9[114374]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:38 np0005538515.localdomain sudo[114372]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:39 np0005538515.localdomain sudo[114465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aujggfxrqqosqbjcnorrxiokosfsowyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321098.7782507-1541-36292918975729/AnsiballZ_command.py
Nov 28 09:11:39 np0005538515.localdomain sudo[114465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42420 DF PROTO=TCP SPT=51322 DPT=9102 SEQ=1200137031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEF0BA0000000001030307) 
Nov 28 09:11:39 np0005538515.localdomain python3.9[114467]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:39 np0005538515.localdomain sudo[114465]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:39 np0005538515.localdomain sudo[114558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daupydipoplovbzputqictxflmuitjgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321099.3678188-1541-78730031117551/AnsiballZ_command.py
Nov 28 09:11:39 np0005538515.localdomain sudo[114558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:39 np0005538515.localdomain python3.9[114560]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:39 np0005538515.localdomain sudo[114558]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:40 np0005538515.localdomain sudo[114651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbbzywstqyfrlatzxugocnapkzaieijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321099.9703836-1541-89681745956297/AnsiballZ_command.py
Nov 28 09:11:40 np0005538515.localdomain sudo[114651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:40 np0005538515.localdomain python3.9[114653]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:40 np0005538515.localdomain sudo[114651]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:40 np0005538515.localdomain sudo[114744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqyeswufhegvzasvzqzqtnzvimgehenr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321100.540562-1541-167742881474061/AnsiballZ_command.py
Nov 28 09:11:40 np0005538515.localdomain sudo[114744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:41 np0005538515.localdomain python3.9[114746]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:41 np0005538515.localdomain sudo[114744]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:41 np0005538515.localdomain sudo[114837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytngclhwruiilpizhhtvnwrxbrkzdppu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321101.1579328-1541-104812178599543/AnsiballZ_command.py
Nov 28 09:11:41 np0005538515.localdomain sudo[114837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:41 np0005538515.localdomain python3.9[114839]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:41 np0005538515.localdomain sudo[114837]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:42 np0005538515.localdomain sudo[114930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofhlytyoentszossqexxahmkfqydkjvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321101.7752721-1541-178365682638966/AnsiballZ_command.py
Nov 28 09:11:42 np0005538515.localdomain sudo[114930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:42 np0005538515.localdomain python3.9[114932]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:42 np0005538515.localdomain sudo[114930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:42 np0005538515.localdomain sudo[115023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwiiwwdxbavmcjhjdwyeivgcfuozsjsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321102.3595269-1541-137254748255703/AnsiballZ_command.py
Nov 28 09:11:42 np0005538515.localdomain sudo[115023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:42 np0005538515.localdomain python3.9[115025]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30544 DF PROTO=TCP SPT=40126 DPT=9105 SEQ=4094490606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABEFEFA0000000001030307) 
Nov 28 09:11:42 np0005538515.localdomain sudo[115023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:43 np0005538515.localdomain sudo[115116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsvbpqpecnhrcemddxlyhwkvgspzvxmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321102.9669147-1541-106247286649335/AnsiballZ_command.py
Nov 28 09:11:43 np0005538515.localdomain sudo[115116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:43 np0005538515.localdomain python3.9[115118]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:43 np0005538515.localdomain sudo[115116]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:44 np0005538515.localdomain sudo[115209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuqimjlhrlyniqzebagnbwmkxldhnivo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321103.7495248-1541-52834539553741/AnsiballZ_command.py
Nov 28 09:11:44 np0005538515.localdomain sudo[115209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20902 DF PROTO=TCP SPT=43364 DPT=9101 SEQ=2399263955 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF04700000000001030307) 
Nov 28 09:11:44 np0005538515.localdomain python3.9[115211]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:45 np0005538515.localdomain sudo[115209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:45 np0005538515.localdomain sudo[115302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awkdwphojtiiaegkofglergrkcluyfau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321105.4525228-1541-223471530540806/AnsiballZ_command.py
Nov 28 09:11:45 np0005538515.localdomain sudo[115302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:45 np0005538515.localdomain python3.9[115304]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:45 np0005538515.localdomain sudo[115302]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:46 np0005538515.localdomain sudo[115395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-firwttwwesncubejdvrqnjxoajyxzxrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321106.0361042-1541-216583944246677/AnsiballZ_command.py
Nov 28 09:11:46 np0005538515.localdomain sudo[115395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:46 np0005538515.localdomain python3.9[115397]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:46 np0005538515.localdomain sudo[115395]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31179 DF PROTO=TCP SPT=47726 DPT=9100 SEQ=3059007427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF0DBA0000000001030307) 
Nov 28 09:11:46 np0005538515.localdomain sudo[115488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvnrlhbuayhbcrvllmbvbkmkzrzapqoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321106.699977-1541-118809790317072/AnsiballZ_command.py
Nov 28 09:11:46 np0005538515.localdomain sudo[115488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:47 np0005538515.localdomain python3.9[115490]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:47 np0005538515.localdomain sudo[115488]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:47 np0005538515.localdomain sudo[115581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhuojpajolcgycpjwgzgxghwqgtpsdel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321107.3855805-1541-55121067312176/AnsiballZ_command.py
Nov 28 09:11:47 np0005538515.localdomain sudo[115581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:47 np0005538515.localdomain python3.9[115583]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:47 np0005538515.localdomain sudo[115581]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:48 np0005538515.localdomain sudo[115674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbtuaigphngtgglopeiconrmknhlxwyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321108.0055172-1541-53568694953615/AnsiballZ_command.py
Nov 28 09:11:48 np0005538515.localdomain sudo[115674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:48 np0005538515.localdomain python3.9[115676]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:48 np0005538515.localdomain sudo[115674]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:48 np0005538515.localdomain sudo[115767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpgechjzihelfwcmzgkmehqngrfnhsku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321108.5863113-1541-77868674407808/AnsiballZ_command.py
Nov 28 09:11:48 np0005538515.localdomain sudo[115767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:49 np0005538515.localdomain python3.9[115769]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:49 np0005538515.localdomain sudo[115767]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:49 np0005538515.localdomain sudo[115860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isxrwdzmrfawbphjwgepfppdvqssfisy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321109.236906-1541-29400395471190/AnsiballZ_command.py
Nov 28 09:11:49 np0005538515.localdomain sudo[115860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:49 np0005538515.localdomain python3.9[115862]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:49 np0005538515.localdomain sudo[115860]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:50 np0005538515.localdomain sudo[115953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oieuaiwxklgclyrcokqqvmkuydfzdjmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321109.832345-1541-44317882450976/AnsiballZ_command.py
Nov 28 09:11:50 np0005538515.localdomain sudo[115953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:50 np0005538515.localdomain python3.9[115955]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:50 np0005538515.localdomain sudo[115953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31180 DF PROTO=TCP SPT=47726 DPT=9100 SEQ=3059007427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF1D7A0000000001030307) 
Nov 28 09:11:50 np0005538515.localdomain sudo[116046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyvmlhumxwfttjeclqcjwwogksdojsoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321110.4283884-1541-164399061742659/AnsiballZ_command.py
Nov 28 09:11:50 np0005538515.localdomain sudo[116046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:50 np0005538515.localdomain python3.9[116048]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:51 np0005538515.localdomain sudo[116046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:52 np0005538515.localdomain sudo[116139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfahozodowxfwyrlzrhquiofudtygjot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321112.0382612-1541-240264909459437/AnsiballZ_command.py
Nov 28 09:11:52 np0005538515.localdomain sudo[116139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:11:52 np0005538515.localdomain python3.9[116141]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:11:52 np0005538515.localdomain sudo[116139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:53 np0005538515.localdomain sshd[110024]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:11:53 np0005538515.localdomain systemd-logind[763]: Session 37 logged out. Waiting for processes to exit.
Nov 28 09:11:53 np0005538515.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Nov 28 09:11:53 np0005538515.localdomain systemd[1]: session-37.scope: Consumed 30.647s CPU time.
Nov 28 09:11:53 np0005538515.localdomain systemd-logind[763]: Removed session 37.
Nov 28 09:11:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37706 DF PROTO=TCP SPT=51020 DPT=9105 SEQ=839970434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF38A40000000001030307) 
Nov 28 09:11:57 np0005538515.localdomain sudo[116157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:11:57 np0005538515.localdomain sudo[116157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:11:57 np0005538515.localdomain sudo[116157]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:57 np0005538515.localdomain sudo[116172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:11:57 np0005538515.localdomain sudo[116172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:11:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37707 DF PROTO=TCP SPT=51020 DPT=9105 SEQ=839970434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF3CBA0000000001030307) 
Nov 28 09:11:58 np0005538515.localdomain sudo[116172]: pam_unix(sudo:session): session closed for user root
Nov 28 09:11:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31181 DF PROTO=TCP SPT=47726 DPT=9100 SEQ=3059007427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF3CFA0000000001030307) 
Nov 28 09:11:59 np0005538515.localdomain sudo[116220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:11:59 np0005538515.localdomain sudo[116220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:11:59 np0005538515.localdomain sudo[116220]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=623 DF PROTO=TCP SPT=45096 DPT=9105 SEQ=2771617679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF48FA0000000001030307) 
Nov 28 09:12:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37709 DF PROTO=TCP SPT=51020 DPT=9105 SEQ=839970434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF547A0000000001030307) 
Nov 28 09:12:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42414 DF PROTO=TCP SPT=37072 DPT=9102 SEQ=924624839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF65FA0000000001030307) 
Nov 28 09:12:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37710 DF PROTO=TCP SPT=51020 DPT=9105 SEQ=839970434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF74FA0000000001030307) 
Nov 28 09:12:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22580 DF PROTO=TCP SPT=39386 DPT=9882 SEQ=1173008762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF78FB0000000001030307) 
Nov 28 09:12:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27784 DF PROTO=TCP SPT=53182 DPT=9100 SEQ=3797941213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF82FB0000000001030307) 
Nov 28 09:12:20 np0005538515.localdomain sshd[116235]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:12:20 np0005538515.localdomain sshd[116235]: Accepted publickey for zuul from 192.168.122.31 port 43910 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:12:20 np0005538515.localdomain systemd-logind[763]: New session 38 of user zuul.
Nov 28 09:12:20 np0005538515.localdomain systemd[1]: Started Session 38 of User zuul.
Nov 28 09:12:20 np0005538515.localdomain sshd[116235]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:12:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27785 DF PROTO=TCP SPT=53182 DPT=9100 SEQ=3797941213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABF92BB0000000001030307) 
Nov 28 09:12:21 np0005538515.localdomain python3.9[116328]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 09:12:22 np0005538515.localdomain python3.9[116432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:12:23 np0005538515.localdomain sudo[116522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljbxcmmvlnfhrguboywacubmduykzinl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321142.6759539-96-175688907633926/AnsiballZ_command.py
Nov 28 09:12:23 np0005538515.localdomain sudo[116522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:23 np0005538515.localdomain python3.9[116524]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:12:23 np0005538515.localdomain sudo[116522]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:24 np0005538515.localdomain sudo[116615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwotnfrgiinozmrwxhjxjpjwcxkvpjas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321143.6736655-131-267029427065120/AnsiballZ_stat.py
Nov 28 09:12:24 np0005538515.localdomain sudo[116615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:24 np0005538515.localdomain python3.9[116617]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:12:24 np0005538515.localdomain sudo[116615]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:25 np0005538515.localdomain sudo[116707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmohzotntbowqotfplhbqaaevveljhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321144.5956626-155-48656205402485/AnsiballZ_file.py
Nov 28 09:12:25 np0005538515.localdomain sudo[116707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:25 np0005538515.localdomain python3.9[116709]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:12:25 np0005538515.localdomain sudo[116707]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:25 np0005538515.localdomain sudo[116799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srftxajxwxlfkyigrncmydlqegwyxgdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321145.4753542-179-4419938152560/AnsiballZ_stat.py
Nov 28 09:12:25 np0005538515.localdomain sudo[116799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:25 np0005538515.localdomain python3.9[116801]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:12:25 np0005538515.localdomain sudo[116799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:26 np0005538515.localdomain sudo[116872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gobnxzqmaxvnhelpwftfqegpdepsmqml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321145.4753542-179-4419938152560/AnsiballZ_copy.py
Nov 28 09:12:26 np0005538515.localdomain sudo[116872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:26 np0005538515.localdomain python3.9[116874]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321145.4753542-179-4419938152560/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:12:26 np0005538515.localdomain sudo[116872]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:27 np0005538515.localdomain sudo[116964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltrwcktnczpbohmndvvqkmazysnhofux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321146.9345925-225-198440393960201/AnsiballZ_setup.py
Nov 28 09:12:27 np0005538515.localdomain sudo[116964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:27 np0005538515.localdomain python3.9[116966]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:12:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64497 DF PROTO=TCP SPT=60080 DPT=9105 SEQ=380828836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFADD30000000001030307) 
Nov 28 09:12:27 np0005538515.localdomain sudo[116964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:28 np0005538515.localdomain sudo[117060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcfyzkdjbkhldvsobogpmfoigkytawdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321147.9998581-248-178291483014330/AnsiballZ_file.py
Nov 28 09:12:28 np0005538515.localdomain sudo[117060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:28 np0005538515.localdomain python3.9[117062]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:12:28 np0005538515.localdomain sudo[117060]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64498 DF PROTO=TCP SPT=60080 DPT=9105 SEQ=380828836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFB1FB0000000001030307) 
Nov 28 09:12:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27786 DF PROTO=TCP SPT=53182 DPT=9100 SEQ=3797941213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFB2FA0000000001030307) 
Nov 28 09:12:29 np0005538515.localdomain sudo[117152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtxizkqdgzbavlnyiberruxqfqexqojn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321148.7631657-276-191814095604923/AnsiballZ_file.py
Nov 28 09:12:29 np0005538515.localdomain sudo[117152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:29 np0005538515.localdomain python3.9[117154]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:12:29 np0005538515.localdomain sudo[117152]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:29 np0005538515.localdomain python3.9[117244]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:12:30 np0005538515.localdomain network[117261]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:12:30 np0005538515.localdomain network[117262]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:12:30 np0005538515.localdomain network[117263]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:12:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:12:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52068 DF PROTO=TCP SPT=55116 DPT=9882 SEQ=3541698670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFBEFA0000000001030307) 
Nov 28 09:12:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64500 DF PROTO=TCP SPT=60080 DPT=9105 SEQ=380828836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFC9BA0000000001030307) 
Nov 28 09:12:35 np0005538515.localdomain python3.9[117461]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:12:36 np0005538515.localdomain python3.9[117551]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:12:36 np0005538515.localdomain sudo[117645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hademnxzwpmlpobfblczjbblujtslauv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321156.6016252-377-227792704577619/AnsiballZ_command.py
Nov 28 09:12:36 np0005538515.localdomain sudo[117645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:12:37 np0005538515.localdomain python3.9[117647]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                            # here we only ensuring that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:12:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8488 DF PROTO=TCP SPT=33026 DPT=9102 SEQ=1824906303 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFDAFB0000000001030307) 
Nov 28 09:12:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64501 DF PROTO=TCP SPT=60080 DPT=9105 SEQ=380828836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFEAFA0000000001030307) 
Nov 28 09:12:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19794 DF PROTO=TCP SPT=54068 DPT=9101 SEQ=250308116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFEED00000000001030307) 
Nov 28 09:12:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17951 DF PROTO=TCP SPT=37498 DPT=9100 SEQ=4152429093 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ABFF7FB0000000001030307) 
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 09:12:46 np0005538515.localdomain sshd[46214]: Received signal 15; terminating.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: sshd.service: Consumed 1.009s CPU time.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 09:12:46 np0005538515.localdomain sshd[117690]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:12:46 np0005538515.localdomain sshd[117690]: Server listening on 0.0.0.0 port 22.
Nov 28 09:12:46 np0005538515.localdomain sshd[117690]: Server listening on :: port 22.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 09:12:46 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: run-r371e36430ddb4353bd579e3a3681593a.service: Deactivated successfully.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: run-rbbae77d1706e4800851f165e959e8902.service: Deactivated successfully.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 09:12:47 np0005538515.localdomain sshd[117690]: Received signal 15; terminating.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 09:12:47 np0005538515.localdomain sshd[117864]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:12:47 np0005538515.localdomain sshd[117864]: Server listening on 0.0.0.0 port 22.
Nov 28 09:12:47 np0005538515.localdomain sshd[117864]: Server listening on :: port 22.
Nov 28 09:12:47 np0005538515.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 09:12:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17952 DF PROTO=TCP SPT=37498 DPT=9100 SEQ=4152429093 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC007BA0000000001030307) 
Nov 28 09:12:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27943 DF PROTO=TCP SPT=60562 DPT=9105 SEQ=1404972871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC023030000000001030307) 
Nov 28 09:12:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27944 DF PROTO=TCP SPT=60562 DPT=9105 SEQ=1404972871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC026FB0000000001030307) 
Nov 28 09:12:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33323 DF PROTO=TCP SPT=48168 DPT=9882 SEQ=1107686790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0282C0000000001030307) 
Nov 28 09:12:59 np0005538515.localdomain sudo[117961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:12:59 np0005538515.localdomain sudo[117961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:12:59 np0005538515.localdomain sudo[117961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:12:59 np0005538515.localdomain sudo[117976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:12:59 np0005538515.localdomain sudo[117976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:00 np0005538515.localdomain systemd[1]: tmp-crun.y4oLqG.mount: Deactivated successfully.
Nov 28 09:13:00 np0005538515.localdomain podman[118062]: 2025-11-28 09:13:00.4118648 +0000 UTC m=+0.102088303 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55)
Nov 28 09:13:00 np0005538515.localdomain podman[118062]: 2025-11-28 09:13:00.511710344 +0000 UTC m=+0.201933797 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55)
Nov 28 09:13:00 np0005538515.localdomain sudo[117976]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:00 np0005538515.localdomain sudo[118131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:13:00 np0005538515.localdomain sudo[118131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:00 np0005538515.localdomain sudo[118131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:00 np0005538515.localdomain sudo[118146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:13:00 np0005538515.localdomain sudo[118146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:01 np0005538515.localdomain sudo[118146]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37712 DF PROTO=TCP SPT=51020 DPT=9105 SEQ=839970434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC032FA0000000001030307) 
Nov 28 09:13:02 np0005538515.localdomain sudo[118193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:13:02 np0005538515.localdomain sudo[118193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:13:02 np0005538515.localdomain sudo[118193]: pam_unix(sudo:session): session closed for user root
Nov 28 09:13:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27946 DF PROTO=TCP SPT=60562 DPT=9105 SEQ=1404972871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC03EBA0000000001030307) 
Nov 28 09:13:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22964 DF PROTO=TCP SPT=53526 DPT=9102 SEQ=1115222816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0503A0000000001030307) 
Nov 28 09:13:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27947 DF PROTO=TCP SPT=60562 DPT=9105 SEQ=1404972871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC05EFA0000000001030307) 
Nov 28 09:13:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30793 DF PROTO=TCP SPT=37358 DPT=9101 SEQ=481832130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC064030000000001030307) 
Nov 28 09:13:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20348 DF PROTO=TCP SPT=57078 DPT=9100 SEQ=3399415307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC06D3A0000000001030307) 
Nov 28 09:13:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20349 DF PROTO=TCP SPT=57078 DPT=9100 SEQ=3399415307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC07CFA0000000001030307) 
Nov 28 09:13:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19517 DF PROTO=TCP SPT=52600 DPT=9105 SEQ=3779108153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC098330000000001030307) 
Nov 28 09:13:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19518 DF PROTO=TCP SPT=52600 DPT=9105 SEQ=3779108153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC09C3A0000000001030307) 
Nov 28 09:13:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20350 DF PROTO=TCP SPT=57078 DPT=9100 SEQ=3399415307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC09CFA0000000001030307) 
Nov 28 09:13:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64503 DF PROTO=TCP SPT=60080 DPT=9105 SEQ=380828836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0A8FA0000000001030307) 
Nov 28 09:13:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19520 DF PROTO=TCP SPT=52600 DPT=9105 SEQ=3779108153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0B3FA0000000001030307) 
Nov 28 09:13:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9449 DF PROTO=TCP SPT=43326 DPT=9102 SEQ=149057482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0C57A0000000001030307) 
Nov 28 09:13:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19521 DF PROTO=TCP SPT=52600 DPT=9105 SEQ=3779108153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0D4FA0000000001030307) 
Nov 28 09:13:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36162 DF PROTO=TCP SPT=37542 DPT=9882 SEQ=1271581261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0D8FA0000000001030307) 
Nov 28 09:13:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37447 DF PROTO=TCP SPT=46520 DPT=9100 SEQ=498432609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0E27A0000000001030307) 
Nov 28 09:13:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37448 DF PROTO=TCP SPT=46520 DPT=9100 SEQ=498432609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC0F23A0000000001030307) 
Nov 28 09:13:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60617 DF PROTO=TCP SPT=39838 DPT=9105 SEQ=3790350889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC10D660000000001030307) 
Nov 28 09:13:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60618 DF PROTO=TCP SPT=39838 DPT=9105 SEQ=3790350889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1117A0000000001030307) 
Nov 28 09:13:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22409 DF PROTO=TCP SPT=38130 DPT=9882 SEQ=1707402887 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1128C0000000001030307) 
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  Converting 2741 SID table entries...
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:13:59 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:14:00 np0005538515.localdomain sudo[117645]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27949 DF PROTO=TCP SPT=60562 DPT=9105 SEQ=1404972871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC11CFA0000000001030307) 
Nov 28 09:14:02 np0005538515.localdomain sudo[118606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:14:02 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Nov 28 09:14:02 np0005538515.localdomain sudo[118606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:14:03 np0005538515.localdomain sudo[118606]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:03 np0005538515.localdomain sudo[118621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:14:03 np0005538515.localdomain sudo[118621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:14:03 np0005538515.localdomain sudo[118621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:04 np0005538515.localdomain sudo[118668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:14:04 np0005538515.localdomain sudo[118668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:14:04 np0005538515.localdomain sudo[118668]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60620 DF PROTO=TCP SPT=39838 DPT=9105 SEQ=3790350889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1293A0000000001030307) 
Nov 28 09:14:06 np0005538515.localdomain sudo[118758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keoeyjtdtgkhaspmpwbfmatlaidedfbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321246.0041327-404-142875354988053/AnsiballZ_file.py
Nov 28 09:14:06 np0005538515.localdomain sudo[118758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:06 np0005538515.localdomain python3.9[118760]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:06 np0005538515.localdomain sudo[118758]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:06 np0005538515.localdomain sudo[118850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gugpogubqfxheptqlyunyzjeqghdrykp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321246.6766562-428-99739598710650/AnsiballZ_stat.py
Nov 28 09:14:06 np0005538515.localdomain sudo[118850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:07 np0005538515.localdomain python3.9[118852]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:07 np0005538515.localdomain sudo[118850]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:07 np0005538515.localdomain sudo[118923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwlpmfikzheasnuymiuhdpxkfmzfjglr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321246.6766562-428-99739598710650/AnsiballZ_copy.py
Nov 28 09:14:07 np0005538515.localdomain sudo[118923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:07 np0005538515.localdomain python3.9[118925]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321246.6766562-428-99739598710650/.source.fact _original_basename=.wzwoes5n follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:07 np0005538515.localdomain sudo[118923]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:08 np0005538515.localdomain python3.9[119015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:14:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7621 DF PROTO=TCP SPT=52616 DPT=9102 SEQ=131165713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC13ABA0000000001030307) 
Nov 28 09:14:09 np0005538515.localdomain sudo[119111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snbshhttejsvioxvmjijrsucbuujtqjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321249.445579-503-204691335775359/AnsiballZ_setup.py
Nov 28 09:14:09 np0005538515.localdomain sudo[119111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:10 np0005538515.localdomain python3.9[119113]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:14:10 np0005538515.localdomain sudo[119111]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:10 np0005538515.localdomain sudo[119165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqdjrgxqolsapedxeavxvddyhrrunvxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321249.445579-503-204691335775359/AnsiballZ_dnf.py
Nov 28 09:14:10 np0005538515.localdomain sudo[119165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:10 np0005538515.localdomain python3.9[119167]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:14:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60621 DF PROTO=TCP SPT=39838 DPT=9105 SEQ=3790350889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC148FA0000000001030307) 
Nov 28 09:14:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39940 DF PROTO=TCP SPT=36954 DPT=9101 SEQ=54121519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC14E620000000001030307) 
Nov 28 09:14:14 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:14:14 np0005538515.localdomain systemd-rc-local-generator[119201]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:14:14 np0005538515.localdomain systemd-sysv-generator[119206]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:14:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:14:14 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 09:14:15 np0005538515.localdomain sudo[119165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:16 np0005538515.localdomain sudo[119305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abgultuitjncdhkudxhkaoypkvfaopuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321256.0551665-539-80309881535234/AnsiballZ_command.py
Nov 28 09:14:16 np0005538515.localdomain sudo[119305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:16 np0005538515.localdomain python3.9[119307]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:14:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22687 DF PROTO=TCP SPT=34372 DPT=9100 SEQ=311103029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC157BB0000000001030307) 
Nov 28 09:14:17 np0005538515.localdomain sudo[119305]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:18 np0005538515.localdomain sudo[119544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swvqryokxbjpodvuwrsfgkskpixeulcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321257.6618578-564-125017302656107/AnsiballZ_selinux.py
Nov 28 09:14:18 np0005538515.localdomain sudo[119544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:18 np0005538515.localdomain python3.9[119546]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 28 09:14:18 np0005538515.localdomain sudo[119544]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:19 np0005538515.localdomain sudo[119636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eksgeiceskvchuajkbusjdnviejtroys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321259.0036285-598-211260462466156/AnsiballZ_command.py
Nov 28 09:14:19 np0005538515.localdomain sudo[119636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:19 np0005538515.localdomain python3.9[119638]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 28 09:14:19 np0005538515.localdomain sudo[119636]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:20 np0005538515.localdomain sudo[119729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tudmizpklfvojeodndpfcbmboizdrvzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321260.289161-620-22930458876383/AnsiballZ_file.py
Nov 28 09:14:20 np0005538515.localdomain sudo[119729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22688 DF PROTO=TCP SPT=34372 DPT=9100 SEQ=311103029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1677A0000000001030307) 
Nov 28 09:14:20 np0005538515.localdomain python3.9[119731]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:20 np0005538515.localdomain sudo[119729]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:21 np0005538515.localdomain sudo[119821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoudpvtdjdmicnjrhlwxxwismfqsxtdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321260.9618974-644-206752717220602/AnsiballZ_mount.py
Nov 28 09:14:21 np0005538515.localdomain sudo[119821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:21 np0005538515.localdomain python3.9[119823]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 28 09:14:21 np0005538515.localdomain sudo[119821]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:22 np0005538515.localdomain sudo[119913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccsnjumxiylywbdkfjurpqmtsfmqussm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321262.6794748-729-209920179548803/AnsiballZ_file.py
Nov 28 09:14:22 np0005538515.localdomain sudo[119913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:23 np0005538515.localdomain python3.9[119915]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:23 np0005538515.localdomain sudo[119913]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:23 np0005538515.localdomain sudo[120005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsjmsrmmsdpowuucnisqczktbvenjkwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321263.4484231-753-121942666321280/AnsiballZ_stat.py
Nov 28 09:14:23 np0005538515.localdomain sudo[120005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:23 np0005538515.localdomain python3.9[120007]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:23 np0005538515.localdomain sudo[120005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:24 np0005538515.localdomain sudo[120078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efzdtsisbyyulfrymsyhhrgeujmzqhvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321263.4484231-753-121942666321280/AnsiballZ_copy.py
Nov 28 09:14:24 np0005538515.localdomain sudo[120078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:24 np0005538515.localdomain python3.9[120080]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321263.4484231-753-121942666321280/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:14:24 np0005538515.localdomain sudo[120078]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:25 np0005538515.localdomain sudo[120170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdyruukysxmlygalvhsyldmtndpdopks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321265.5389113-825-218239550639355/AnsiballZ_stat.py
Nov 28 09:14:25 np0005538515.localdomain sudo[120170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:25 np0005538515.localdomain python3.9[120172]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:14:26 np0005538515.localdomain sudo[120170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:27 np0005538515.localdomain sudo[120264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwmcimhkbeozfyvtyniisyfxhnlklhuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321266.656181-864-193796665125421/AnsiballZ_getent.py
Nov 28 09:14:27 np0005538515.localdomain sudo[120264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:27 np0005538515.localdomain python3.9[120266]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 28 09:14:27 np0005538515.localdomain sudo[120264]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60391 DF PROTO=TCP SPT=35486 DPT=9105 SEQ=3510398960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC182940000000001030307) 
Nov 28 09:14:27 np0005538515.localdomain sudo[120357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exnescdgwxmyorcatdwlhzzahvqmygzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321267.706143-894-166161400790460/AnsiballZ_getent.py
Nov 28 09:14:27 np0005538515.localdomain sudo[120357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:28 np0005538515.localdomain python3.9[120359]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 28 09:14:28 np0005538515.localdomain sudo[120357]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60392 DF PROTO=TCP SPT=35486 DPT=9105 SEQ=3510398960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC186BA0000000001030307) 
Nov 28 09:14:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22689 DF PROTO=TCP SPT=34372 DPT=9100 SEQ=311103029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC186FA0000000001030307) 
Nov 28 09:14:28 np0005538515.localdomain sudo[120450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkkrxfrgegswanoqjopeicgfxlrklxdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321268.4512446-918-109921113094551/AnsiballZ_group.py
Nov 28 09:14:28 np0005538515.localdomain sudo[120450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:29 np0005538515.localdomain python3.9[120452]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:14:29 np0005538515.localdomain groupmod[120453]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Nov 28 09:14:29 np0005538515.localdomain groupmod[120453]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Nov 28 09:14:29 np0005538515.localdomain sudo[120450]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:29 np0005538515.localdomain sudo[120548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kviajiqpklylzefeagvokdixdmdzkqhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321269.4785182-945-151276649816622/AnsiballZ_file.py
Nov 28 09:14:29 np0005538515.localdomain sudo[120548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:29 np0005538515.localdomain python3.9[120550]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 28 09:14:29 np0005538515.localdomain sudo[120548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:30 np0005538515.localdomain sudo[120640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbcicxexgestceuuyjwburgpscugutnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321270.381807-977-36722303193370/AnsiballZ_dnf.py
Nov 28 09:14:30 np0005538515.localdomain sudo[120640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:30 np0005538515.localdomain python3.9[120642]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:14:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19523 DF PROTO=TCP SPT=52600 DPT=9105 SEQ=3779108153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC192FA0000000001030307) 
Nov 28 09:14:33 np0005538515.localdomain sudo[120640]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60394 DF PROTO=TCP SPT=35486 DPT=9105 SEQ=3510398960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC19E7B0000000001030307) 
Nov 28 09:14:34 np0005538515.localdomain sudo[120734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omuiwjjnabactwyaucdeyflflhdmzofm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321274.5999897-1002-199196761581926/AnsiballZ_file.py
Nov 28 09:14:34 np0005538515.localdomain sudo[120734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:38 np0005538515.localdomain python3.9[120736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:38 np0005538515.localdomain sudo[120734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11677 DF PROTO=TCP SPT=54912 DPT=9102 SEQ=1513804586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1AFBA0000000001030307) 
Nov 28 09:14:39 np0005538515.localdomain sudo[120826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsyakgbbvwurjcyjoosbhksnastyvjbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321278.9719198-1025-179235887840623/AnsiballZ_stat.py
Nov 28 09:14:39 np0005538515.localdomain sudo[120826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:40 np0005538515.localdomain python3.9[120828]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:40 np0005538515.localdomain sudo[120826]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:40 np0005538515.localdomain sudo[120899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bndnsxlorxedzdmhjxkrvphnjjhpwxjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321278.9719198-1025-179235887840623/AnsiballZ_copy.py
Nov 28 09:14:40 np0005538515.localdomain sudo[120899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:41 np0005538515.localdomain python3.9[120901]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321278.9719198-1025-179235887840623/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:41 np0005538515.localdomain sudo[120899]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60395 DF PROTO=TCP SPT=35486 DPT=9105 SEQ=3510398960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1BEFA0000000001030307) 
Nov 28 09:14:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8681 DF PROTO=TCP SPT=58080 DPT=9882 SEQ=3609277679 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1C2FA0000000001030307) 
Nov 28 09:14:44 np0005538515.localdomain sudo[120991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omozdbggpnzjwanaxisnhwzzycfaditz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321283.871838-1070-56677224949643/AnsiballZ_systemd.py
Nov 28 09:14:44 np0005538515.localdomain sudo[120991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:44 np0005538515.localdomain python3.9[120993]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:14:44 np0005538515.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 09:14:44 np0005538515.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 09:14:44 np0005538515.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 09:14:44 np0005538515.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 09:14:44 np0005538515.localdomain systemd-modules-load[120997]: Module 'msr' is built in
Nov 28 09:14:44 np0005538515.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 09:14:44 np0005538515.localdomain sudo[120991]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:46 np0005538515.localdomain sudo[121087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yupasnpobhhwnvpauokuukullnatjfxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321285.9744039-1094-181237401562284/AnsiballZ_stat.py
Nov 28 09:14:46 np0005538515.localdomain sudo[121087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:46 np0005538515.localdomain python3.9[121089]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:14:46 np0005538515.localdomain sudo[121087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58032 DF PROTO=TCP SPT=35450 DPT=9100 SEQ=3749418050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1CCBA0000000001030307) 
Nov 28 09:14:46 np0005538515.localdomain sudo[121160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yokkptikvcmhavegtqjbxrlabdlertns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321285.9744039-1094-181237401562284/AnsiballZ_copy.py
Nov 28 09:14:46 np0005538515.localdomain sudo[121160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:47 np0005538515.localdomain python3.9[121162]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321285.9744039-1094-181237401562284/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:14:47 np0005538515.localdomain sudo[121160]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:48 np0005538515.localdomain sudo[121252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hemlqbkfnhjlpqhqaaprtuiyeblwqmvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321287.76628-1149-161590535431847/AnsiballZ_dnf.py
Nov 28 09:14:48 np0005538515.localdomain sudo[121252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:48 np0005538515.localdomain python3.9[121254]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:14:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58033 DF PROTO=TCP SPT=35450 DPT=9100 SEQ=3749418050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1DC7A0000000001030307) 
Nov 28 09:14:51 np0005538515.localdomain sudo[121252]: pam_unix(sudo:session): session closed for user root
Nov 28 09:14:56 np0005538515.localdomain python3.9[121346]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:14:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3262 DF PROTO=TCP SPT=43256 DPT=9105 SEQ=2018064289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1F7C30000000001030307) 
Nov 28 09:14:57 np0005538515.localdomain python3.9[121438]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 09:14:58 np0005538515.localdomain python3.9[121528]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:14:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3263 DF PROTO=TCP SPT=43256 DPT=9105 SEQ=2018064289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1FBBB0000000001030307) 
Nov 28 09:14:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4138 DF PROTO=TCP SPT=33730 DPT=9882 SEQ=2757824583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC1FCEC0000000001030307) 
Nov 28 09:14:59 np0005538515.localdomain sudo[121618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgurhanevlyanwmpqmpjacdsambbntem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321298.990889-1272-251173629183968/AnsiballZ_systemd.py
Nov 28 09:14:59 np0005538515.localdomain sudo[121618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:14:59 np0005538515.localdomain python3.9[121620]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:14:59 np0005538515.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 09:14:59 np0005538515.localdomain systemd[1]: tuned.service: Deactivated successfully.
Nov 28 09:14:59 np0005538515.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 09:14:59 np0005538515.localdomain systemd[1]: tuned.service: Consumed 1.828s CPU time, no IO.
Nov 28 09:14:59 np0005538515.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 09:15:00 np0005538515.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 09:15:00 np0005538515.localdomain sudo[121618]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4140 DF PROTO=TCP SPT=33730 DPT=9882 SEQ=2757824583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC208FA0000000001030307) 
Nov 28 09:15:02 np0005538515.localdomain python3.9[121723]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 09:15:04 np0005538515.localdomain sudo[121738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:15:04 np0005538515.localdomain sudo[121738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:15:04 np0005538515.localdomain sudo[121738]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:04 np0005538515.localdomain sudo[121753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:15:04 np0005538515.localdomain sudo[121753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:15:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3265 DF PROTO=TCP SPT=43256 DPT=9105 SEQ=2018064289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2137A0000000001030307) 
Nov 28 09:15:05 np0005538515.localdomain sudo[121753]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:06 np0005538515.localdomain sudo[121858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:15:06 np0005538515.localdomain sudo[121858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:15:06 np0005538515.localdomain sudo[121858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:06 np0005538515.localdomain sudo[121888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkrdjblboytvnreczxinfbnhsklkpant ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321305.8205936-1443-154948216866241/AnsiballZ_systemd.py
Nov 28 09:15:06 np0005538515.localdomain sudo[121888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:06 np0005538515.localdomain python3.9[121891]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:15:06 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:15:06 np0005538515.localdomain systemd-sysv-generator[121922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:15:06 np0005538515.localdomain systemd-rc-local-generator[121917]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:15:06 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:15:06 np0005538515.localdomain sudo[121888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:07 np0005538515.localdomain sudo[122018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sccstlcrcopqairjzhzngufqroiqnqmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321306.8955266-1443-205819549592188/AnsiballZ_systemd.py
Nov 28 09:15:07 np0005538515.localdomain sudo[122018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:07 np0005538515.localdomain python3.9[122020]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:15:08 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:15:08 np0005538515.localdomain systemd-rc-local-generator[122046]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:15:08 np0005538515.localdomain systemd-sysv-generator[122049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:15:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:15:08 np0005538515.localdomain sudo[122018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57001 DF PROTO=TCP SPT=39756 DPT=9102 SEQ=531856499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC224FA0000000001030307) 
Nov 28 09:15:09 np0005538515.localdomain sudo[122148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shotfnenxxexgooinerpuntoyzmqrorx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321309.2312086-1491-277202825401935/AnsiballZ_command.py
Nov 28 09:15:09 np0005538515.localdomain sudo[122148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:09 np0005538515.localdomain python3.9[122150]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:09 np0005538515.localdomain sudo[122148]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:10 np0005538515.localdomain sudo[122241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kziumvftidcgngrkrpwevonuiajxayjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321309.9515202-1515-265808961178637/AnsiballZ_command.py
Nov 28 09:15:10 np0005538515.localdomain sudo[122241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:10 np0005538515.localdomain python3.9[122243]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:10 np0005538515.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Nov 28 09:15:10 np0005538515.localdomain sudo[122241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:10 np0005538515.localdomain sudo[122334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhfjorcnlrtrsbjuzxragaagcsvibhyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321310.6878324-1538-143689862541909/AnsiballZ_command.py
Nov 28 09:15:10 np0005538515.localdomain sudo[122334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:11 np0005538515.localdomain python3.9[122336]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:12 np0005538515.localdomain sudo[122334]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3266 DF PROTO=TCP SPT=43256 DPT=9105 SEQ=2018064289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC232FB0000000001030307) 
Nov 28 09:15:12 np0005538515.localdomain sudo[122433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhrsqcexhdugixyzrtvvvevwxsdcykax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321312.454895-1562-199813677272429/AnsiballZ_command.py
Nov 28 09:15:12 np0005538515.localdomain sudo[122433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:12 np0005538515.localdomain python3.9[122435]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:12 np0005538515.localdomain sudo[122433]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:13 np0005538515.localdomain sudo[122526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtpretiwyhyflkkgqjkutmmskqayfphz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321313.1812015-1587-223961557458538/AnsiballZ_systemd.py
Nov 28 09:15:13 np0005538515.localdomain sudo[122526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:13 np0005538515.localdomain python3.9[122528]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:15:13 np0005538515.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 09:15:13 np0005538515.localdomain systemd[1]: Stopped Apply Kernel Variables.
Nov 28 09:15:13 np0005538515.localdomain systemd[1]: Stopping Apply Kernel Variables...
Nov 28 09:15:13 np0005538515.localdomain systemd[1]: Starting Apply Kernel Variables...
Nov 28 09:15:13 np0005538515.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 09:15:13 np0005538515.localdomain systemd[1]: Finished Apply Kernel Variables.
Nov 28 09:15:13 np0005538515.localdomain sudo[122526]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53550 DF PROTO=TCP SPT=37978 DPT=9101 SEQ=3619797123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC238C10000000001030307) 
Nov 28 09:15:14 np0005538515.localdomain sshd[116235]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:15:14 np0005538515.localdomain systemd-logind[763]: Session 38 logged out. Waiting for processes to exit.
Nov 28 09:15:14 np0005538515.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Nov 28 09:15:14 np0005538515.localdomain systemd[1]: session-38.scope: Consumed 1min 55.779s CPU time.
Nov 28 09:15:14 np0005538515.localdomain systemd-logind[763]: Removed session 38.
Nov 28 09:15:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21611 DF PROTO=TCP SPT=49132 DPT=9100 SEQ=1534770021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC241FA0000000001030307) 
Nov 28 09:15:19 np0005538515.localdomain sshd[122548]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:15:19 np0005538515.localdomain sshd[122548]: Accepted publickey for zuul from 192.168.122.31 port 54530 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:15:19 np0005538515.localdomain systemd-logind[763]: New session 39 of user zuul.
Nov 28 09:15:19 np0005538515.localdomain systemd[1]: Started Session 39 of User zuul.
Nov 28 09:15:19 np0005538515.localdomain sshd[122548]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:15:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21612 DF PROTO=TCP SPT=49132 DPT=9100 SEQ=1534770021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC251BB0000000001030307) 
Nov 28 09:15:20 np0005538515.localdomain python3.9[122641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:22 np0005538515.localdomain python3.9[122735]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:23 np0005538515.localdomain sudo[122829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geknzkenyslvozisnyqsrgwsubvvdsyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321323.0112891-112-167042079744080/AnsiballZ_command.py
Nov 28 09:15:23 np0005538515.localdomain sudo[122829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:23 np0005538515.localdomain python3.9[122831]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:23 np0005538515.localdomain sudo[122829]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:24 np0005538515.localdomain python3.9[122922]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:25 np0005538515.localdomain sudo[123016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vswfdhshfpzfnsmcdtgjrgzghnnmkhhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321325.2112167-172-104548689755148/AnsiballZ_setup.py
Nov 28 09:15:25 np0005538515.localdomain sudo[123016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:25 np0005538515.localdomain python3.9[123018]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:15:26 np0005538515.localdomain sudo[123016]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:26 np0005538515.localdomain sudo[123070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diiyhmcoiuvqfypokhhnpxgqfyxaqvvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321325.2112167-172-104548689755148/AnsiballZ_dnf.py
Nov 28 09:15:26 np0005538515.localdomain sudo[123070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:26 np0005538515.localdomain python3.9[123072]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:15:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14641 DF PROTO=TCP SPT=57744 DPT=9105 SEQ=919896499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC26CF30000000001030307) 
Nov 28 09:15:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14642 DF PROTO=TCP SPT=57744 DPT=9105 SEQ=919896499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC270FA0000000001030307) 
Nov 28 09:15:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53300 DF PROTO=TCP SPT=44682 DPT=9882 SEQ=1226433098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2721C0000000001030307) 
Nov 28 09:15:29 np0005538515.localdomain sudo[123070]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:30 np0005538515.localdomain sudo[123164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpshtikspaadioqaaoandezoewxyoeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321330.1282668-209-116956243818438/AnsiballZ_setup.py
Nov 28 09:15:30 np0005538515.localdomain sudo[123164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:30 np0005538515.localdomain python3.9[123166]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:15:31 np0005538515.localdomain sudo[123164]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60397 DF PROTO=TCP SPT=35486 DPT=9105 SEQ=3510398960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC27CFA0000000001030307) 
Nov 28 09:15:31 np0005538515.localdomain sudo[123311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvfzxinqukuvwjalulltvykgqoonmbgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321331.4391775-241-275059855371164/AnsiballZ_file.py
Nov 28 09:15:31 np0005538515.localdomain sudo[123311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:32 np0005538515.localdomain python3.9[123313]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:15:32 np0005538515.localdomain sudo[123311]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:32 np0005538515.localdomain sudo[123403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkpojwtbgoqbvfkwksksuqgmvnvdwdrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321332.2691355-266-129192586687972/AnsiballZ_command.py
Nov 28 09:15:32 np0005538515.localdomain sudo[123403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:32 np0005538515.localdomain python3.9[123405]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:15:32 np0005538515.localdomain sudo[123403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:33 np0005538515.localdomain sudo[123506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orlzmslgkaesbnqtzktgkxfzccalcmru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321333.0048602-290-158902225503299/AnsiballZ_stat.py
Nov 28 09:15:33 np0005538515.localdomain sudo[123506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:33 np0005538515.localdomain python3.9[123508]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:15:33 np0005538515.localdomain sudo[123506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:33 np0005538515.localdomain sudo[123554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqbkfzgqjquzauzrprftwrjkplcbkbnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321333.0048602-290-158902225503299/AnsiballZ_file.py
Nov 28 09:15:33 np0005538515.localdomain sudo[123554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:34 np0005538515.localdomain python3.9[123556]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:15:34 np0005538515.localdomain sudo[123554]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:34 np0005538515.localdomain sudo[123646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvwhxbtgivwcyjfhxdqjtcrivziqstqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321334.265909-325-172165940699182/AnsiballZ_stat.py
Nov 28 09:15:34 np0005538515.localdomain sudo[123646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14644 DF PROTO=TCP SPT=57744 DPT=9105 SEQ=919896499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC288BA0000000001030307) 
Nov 28 09:15:34 np0005538515.localdomain python3.9[123648]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:15:34 np0005538515.localdomain sudo[123646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:35 np0005538515.localdomain sudo[123719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khgzrdbewzvspszhvcydieniyhlisagr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321334.265909-325-172165940699182/AnsiballZ_copy.py
Nov 28 09:15:35 np0005538515.localdomain sudo[123719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:35 np0005538515.localdomain python3.9[123721]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321334.265909-325-172165940699182/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:35 np0005538515.localdomain sudo[123719]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:36 np0005538515.localdomain sudo[123811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snfpgzqgwxjjfcylpcxxkxhjonkzgoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321335.706038-373-136661604215818/AnsiballZ_ini_file.py
Nov 28 09:15:36 np0005538515.localdomain sudo[123811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:36 np0005538515.localdomain python3.9[123813]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:36 np0005538515.localdomain sudo[123811]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:36 np0005538515.localdomain sudo[123903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwkxmsjfxvloswzfbujahtkwkvzcugvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321336.4535615-373-40387197616998/AnsiballZ_ini_file.py
Nov 28 09:15:36 np0005538515.localdomain sudo[123903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:36 np0005538515.localdomain python3.9[123905]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:36 np0005538515.localdomain sudo[123903]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:37 np0005538515.localdomain sudo[123995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akkpnksictmzyfutnoonzsoalxeecnis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321337.0374393-373-177115734757271/AnsiballZ_ini_file.py
Nov 28 09:15:37 np0005538515.localdomain sudo[123995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:37 np0005538515.localdomain python3.9[123997]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:37 np0005538515.localdomain sudo[123995]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:37 np0005538515.localdomain sudo[124087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxodejiuiytedwgjmfdfyohyppyiynlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321337.6190712-373-124644809316941/AnsiballZ_ini_file.py
Nov 28 09:15:37 np0005538515.localdomain sudo[124087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:38 np0005538515.localdomain python3.9[124089]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:15:38 np0005538515.localdomain sudo[124087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59198 DF PROTO=TCP SPT=48334 DPT=9102 SEQ=865906324 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC29A3A0000000001030307) 
Nov 28 09:15:39 np0005538515.localdomain python3.9[124179]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:15:39 np0005538515.localdomain sudo[124271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqusifqtdfagbgzyjzekpzvrohuocdzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321339.415762-494-33594210117962/AnsiballZ_dnf.py
Nov 28 09:15:39 np0005538515.localdomain sudo[124271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:39 np0005538515.localdomain python3.9[124273]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14645 DF PROTO=TCP SPT=57744 DPT=9105 SEQ=919896499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2A8FA0000000001030307) 
Nov 28 09:15:43 np0005538515.localdomain sudo[124271]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:43 np0005538515.localdomain sudo[124365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsqckcnaapvgjxdyxtyctoyaxsqiblcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321343.4408977-517-144729547736704/AnsiballZ_dnf.py
Nov 28 09:15:43 np0005538515.localdomain sudo[124365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:43 np0005538515.localdomain python3.9[124367]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61599 DF PROTO=TCP SPT=40624 DPT=9101 SEQ=247155788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2ADF00000000001030307) 
Nov 28 09:15:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34072 DF PROTO=TCP SPT=53536 DPT=9100 SEQ=3645617858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2B73A0000000001030307) 
Nov 28 09:15:47 np0005538515.localdomain sudo[124365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:47 np0005538515.localdomain sudo[124459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owlxzddyxdnotycldzaozwrwddjcitbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321347.6153302-547-53863926516842/AnsiballZ_dnf.py
Nov 28 09:15:47 np0005538515.localdomain sudo[124459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:48 np0005538515.localdomain python3.9[124461]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34073 DF PROTO=TCP SPT=53536 DPT=9100 SEQ=3645617858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2C6FB0000000001030307) 
Nov 28 09:15:51 np0005538515.localdomain sudo[124459]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:52 np0005538515.localdomain sudo[124559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-katwgsiasallkuzfjlzzlmyozexlnalo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321351.8233693-574-125698121116222/AnsiballZ_dnf.py
Nov 28 09:15:52 np0005538515.localdomain sudo[124559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:52 np0005538515.localdomain python3.9[124561]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:55 np0005538515.localdomain sudo[124559]: pam_unix(sudo:session): session closed for user root
Nov 28 09:15:56 np0005538515.localdomain sudo[124653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwtioejkxdeevlvjlebxranstdatrwrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321356.1762624-610-63111871481053/AnsiballZ_dnf.py
Nov 28 09:15:56 np0005538515.localdomain sudo[124653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:15:56 np0005538515.localdomain python3.9[124655]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:15:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22516 DF PROTO=TCP SPT=56974 DPT=9105 SEQ=3597303019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2E2230000000001030307) 
Nov 28 09:15:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22517 DF PROTO=TCP SPT=56974 DPT=9105 SEQ=3597303019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2E63A0000000001030307) 
Nov 28 09:15:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34074 DF PROTO=TCP SPT=53536 DPT=9100 SEQ=3645617858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2E6FA0000000001030307) 
Nov 28 09:15:59 np0005538515.localdomain sudo[124653]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:00 np0005538515.localdomain sudo[124747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhhiayqaamxdwvkhvhpochnkuxebfjmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321360.3053179-637-84279415045607/AnsiballZ_dnf.py
Nov 28 09:16:00 np0005538515.localdomain sudo[124747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:00 np0005538515.localdomain python3.9[124749]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:16:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46985 DF PROTO=TCP SPT=36826 DPT=9882 SEQ=3041031134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2F33A0000000001030307) 
Nov 28 09:16:04 np0005538515.localdomain sudo[124747]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22519 DF PROTO=TCP SPT=56974 DPT=9105 SEQ=3597303019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC2FDFA0000000001030307) 
Nov 28 09:16:05 np0005538515.localdomain sudo[124841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmyicxrfaxndbikyqazqefxjksszltql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321364.786914-664-43103547237906/AnsiballZ_dnf.py
Nov 28 09:16:05 np0005538515.localdomain sudo[124841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:05 np0005538515.localdomain python3.9[124843]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:16:06 np0005538515.localdomain sudo[124846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:16:06 np0005538515.localdomain sudo[124846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:16:06 np0005538515.localdomain sudo[124846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:06 np0005538515.localdomain sudo[124861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:16:06 np0005538515.localdomain sudo[124861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:16:06 np0005538515.localdomain sudo[124861]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:07 np0005538515.localdomain sudo[124907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:16:07 np0005538515.localdomain sudo[124907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:16:07 np0005538515.localdomain sudo[124907]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43010 DF PROTO=TCP SPT=44408 DPT=9102 SEQ=2239395787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC30F7A0000000001030307) 
Nov 28 09:16:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22520 DF PROTO=TCP SPT=56974 DPT=9105 SEQ=3597303019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC31EFA0000000001030307) 
Nov 28 09:16:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46987 DF PROTO=TCP SPT=36826 DPT=9882 SEQ=3041031134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC322FA0000000001030307) 
Nov 28 09:16:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52592 DF PROTO=TCP SPT=40540 DPT=9100 SEQ=1621524664 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC32C7A0000000001030307) 
Nov 28 09:16:17 np0005538515.localdomain sudo[124841]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:17 np0005538515.localdomain sudo[125088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsiamxnexqailsuvnmeokssiynzzcfgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321377.6759558-700-106317918615102/AnsiballZ_file.py
Nov 28 09:16:17 np0005538515.localdomain sudo[125088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:18 np0005538515.localdomain python3.9[125090]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:16:18 np0005538515.localdomain sudo[125088]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:18 np0005538515.localdomain sudo[125193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkpnvsknfhukuvrmkcrqtviesdjkppqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321378.4169908-724-107076641266109/AnsiballZ_stat.py
Nov 28 09:16:18 np0005538515.localdomain sudo[125193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:19 np0005538515.localdomain python3.9[125195]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:16:19 np0005538515.localdomain sudo[125193]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:19 np0005538515.localdomain sudo[125266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stuftttgrmmvkurnmicntuyidzhvzlxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321378.4169908-724-107076641266109/AnsiballZ_copy.py
Nov 28 09:16:19 np0005538515.localdomain sudo[125266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:19 np0005538515.localdomain python3.9[125268]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764321378.4169908-724-107076641266109/.source.json _original_basename=.q3h4m5gm follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:16:19 np0005538515.localdomain sudo[125266]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:20 np0005538515.localdomain sudo[125358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgspkghzmjbijqjzlritijqsxqnwlwli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321379.9999037-778-185815164099640/AnsiballZ_podman_image.py
Nov 28 09:16:20 np0005538515.localdomain sudo[125358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52593 DF PROTO=TCP SPT=40540 DPT=9100 SEQ=1621524664 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC33C3A0000000001030307) 
Nov 28 09:16:20 np0005538515.localdomain python3.9[125360]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:27 np0005538515.localdomain podman[125373]: 2025-11-28 09:16:20.831944511 +0000 UTC m=+0.032047797 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:16:27 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 77.5 (258 of 333 items), suggesting rotation.
Nov 28 09:16:27 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:16:27 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:16:27 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:16:27 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:16:27 np0005538515.localdomain sudo[125358]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48260 DF PROTO=TCP SPT=46364 DPT=9105 SEQ=3844461558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC357550000000001030307) 
Nov 28 09:16:28 np0005538515.localdomain sudo[125569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnrcqcvqwoewwsyxoxpacplmtqyrsala ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321387.9632215-811-259013795670082/AnsiballZ_podman_image.py
Nov 28 09:16:28 np0005538515.localdomain sudo[125569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48261 DF PROTO=TCP SPT=46364 DPT=9105 SEQ=3844461558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC35B7B0000000001030307) 
Nov 28 09:16:28 np0005538515.localdomain python3.9[125571]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30662 DF PROTO=TCP SPT=48928 DPT=9882 SEQ=267520267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC35C7C0000000001030307) 
Nov 28 09:16:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14647 DF PROTO=TCP SPT=57744 DPT=9105 SEQ=919896499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC366FA0000000001030307) 
Nov 28 09:16:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48263 DF PROTO=TCP SPT=46364 DPT=9105 SEQ=3844461558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3733B0000000001030307) 
Nov 28 09:16:36 np0005538515.localdomain podman[125585]: 2025-11-28 09:16:28.766771503 +0000 UTC m=+0.043712025 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:16:37 np0005538515.localdomain sudo[125569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:37 np0005538515.localdomain sudo[125782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oojuzaycwjbkeukffmdzxrbjcvrvaxto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321397.6432772-847-148263261049745/AnsiballZ_podman_image.py
Nov 28 09:16:37 np0005538515.localdomain sudo[125782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:38 np0005538515.localdomain python3.9[125784]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29188 DF PROTO=TCP SPT=53792 DPT=9102 SEQ=512309727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3847A0000000001030307) 
Nov 28 09:16:39 np0005538515.localdomain podman[125797]: 2025-11-28 09:16:38.238471942 +0000 UTC m=+0.045082148 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 09:16:40 np0005538515.localdomain sudo[125782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:40 np0005538515.localdomain sudo[125961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgbekacvfqexeclmwqgrawavhjvxirdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321400.6807823-874-139586154292606/AnsiballZ_podman_image.py
Nov 28 09:16:40 np0005538515.localdomain sudo[125961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:41 np0005538515.localdomain python3.9[125963]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:42 np0005538515.localdomain podman[125975]: 2025-11-28 09:16:41.273941039 +0000 UTC m=+0.044520821 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:16:42 np0005538515.localdomain sudo[125961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48264 DF PROTO=TCP SPT=46364 DPT=9105 SEQ=3844461558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC392FA0000000001030307) 
Nov 28 09:16:43 np0005538515.localdomain sudo[126138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pawolyuphijveifsjzkznaitxtgzugcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321402.962655-901-158253256153814/AnsiballZ_podman_image.py
Nov 28 09:16:43 np0005538515.localdomain sudo[126138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:43 np0005538515.localdomain python3.9[126140]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23727 DF PROTO=TCP SPT=36496 DPT=9101 SEQ=3608160821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC398510000000001030307) 
Nov 28 09:16:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11782 DF PROTO=TCP SPT=34516 DPT=9100 SEQ=3977907378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3A17A0000000001030307) 
Nov 28 09:16:46 np0005538515.localdomain podman[126153]: 2025-11-28 09:16:43.588973079 +0000 UTC m=+0.044376006 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 09:16:47 np0005538515.localdomain sudo[126138]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:47 np0005538515.localdomain sudo[126327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jygmsdygrenutnwspuherplcemodimeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321407.3296568-901-102109272393310/AnsiballZ_podman_image.py
Nov 28 09:16:47 np0005538515.localdomain sudo[126327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:16:47 np0005538515.localdomain python3.9[126329]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 09:16:49 np0005538515.localdomain podman[126343]: 2025-11-28 09:16:47.954886251 +0000 UTC m=+0.031371296 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 28 09:16:50 np0005538515.localdomain sudo[126327]: pam_unix(sudo:session): session closed for user root
Nov 28 09:16:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11783 DF PROTO=TCP SPT=34516 DPT=9100 SEQ=3977907378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3B13A0000000001030307) 
Nov 28 09:16:51 np0005538515.localdomain sshd[122548]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:16:51 np0005538515.localdomain systemd-logind[763]: Session 39 logged out. Waiting for processes to exit.
Nov 28 09:16:51 np0005538515.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Nov 28 09:16:51 np0005538515.localdomain systemd[1]: session-39.scope: Consumed 1min 31.619s CPU time.
Nov 28 09:16:51 np0005538515.localdomain systemd-logind[763]: Removed session 39.
Nov 28 09:16:57 np0005538515.localdomain sshd[126448]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:16:57 np0005538515.localdomain sshd[126448]: Accepted publickey for zuul from 192.168.122.31 port 59342 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:16:57 np0005538515.localdomain systemd-logind[763]: New session 40 of user zuul.
Nov 28 09:16:57 np0005538515.localdomain systemd[1]: Started Session 40 of User zuul.
Nov 28 09:16:57 np0005538515.localdomain sshd[126448]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:16:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44103 DF PROTO=TCP SPT=46612 DPT=9105 SEQ=2192743834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3CC830000000001030307) 
Nov 28 09:16:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44104 DF PROTO=TCP SPT=46612 DPT=9105 SEQ=2192743834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3D07A0000000001030307) 
Nov 28 09:16:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11784 DF PROTO=TCP SPT=34516 DPT=9100 SEQ=3977907378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3D0FA0000000001030307) 
Nov 28 09:16:58 np0005538515.localdomain python3.9[126797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:16:59 np0005538515.localdomain sudo[126891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbpmliifbuocqngjeqmslwkdrinyewhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321419.489213-70-189319534121301/AnsiballZ_getent.py
Nov 28 09:16:59 np0005538515.localdomain sudo[126891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:00 np0005538515.localdomain python3.9[126893]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 28 09:17:00 np0005538515.localdomain sudo[126891]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:01 np0005538515.localdomain sudo[126984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpdzoktnlobpcfsfeflgfdoekegdtcph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321420.820639-106-145132529783880/AnsiballZ_setup.py
Nov 28 09:17:01 np0005538515.localdomain sudo[126984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:01 np0005538515.localdomain python3.9[126986]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:17:01 np0005538515.localdomain sudo[126984]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22522 DF PROTO=TCP SPT=56974 DPT=9105 SEQ=3597303019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3DCFB0000000001030307) 
Nov 28 09:17:02 np0005538515.localdomain sudo[127038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbugpogkytmcpeerqznzagfklfwkxtfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321420.820639-106-145132529783880/AnsiballZ_dnf.py
Nov 28 09:17:02 np0005538515.localdomain sudo[127038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:02 np0005538515.localdomain python3.9[127040]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:17:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44106 DF PROTO=TCP SPT=46612 DPT=9105 SEQ=2192743834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3E83A0000000001030307) 
Nov 28 09:17:05 np0005538515.localdomain sudo[127038]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:07 np0005538515.localdomain sudo[127241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqqfxwgotcgwwavhjvnsjxisueckmdyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321427.0995262-148-88268789437715/AnsiballZ_dnf.py
Nov 28 09:17:07 np0005538515.localdomain sudo[127241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:07 np0005538515.localdomain sudo[127244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:17:07 np0005538515.localdomain sudo[127244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:17:07 np0005538515.localdomain sudo[127244]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:07 np0005538515.localdomain python3.9[127243]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:07 np0005538515.localdomain sudo[127260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:17:07 np0005538515.localdomain sudo[127260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:17:08 np0005538515.localdomain sudo[127260]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21191 DF PROTO=TCP SPT=35300 DPT=9102 SEQ=3701130136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC3F9BA0000000001030307) 
Nov 28 09:17:12 np0005538515.localdomain sudo[127241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:12 np0005538515.localdomain sudo[127560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:17:12 np0005538515.localdomain sudo[127560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:17:12 np0005538515.localdomain sudo[127560]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44107 DF PROTO=TCP SPT=46612 DPT=9105 SEQ=2192743834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC408FA0000000001030307) 
Nov 28 09:17:13 np0005538515.localdomain sudo[127664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrxgsijyprhdjiojjjahmkijegbbsnqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321432.691429-172-93207224415289/AnsiballZ_systemd.py
Nov 28 09:17:13 np0005538515.localdomain sudo[127664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:13 np0005538515.localdomain python3.9[127666]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:17:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52818 DF PROTO=TCP SPT=54796 DPT=9882 SEQ=3235994454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC40CFA0000000001030307) 
Nov 28 09:17:14 np0005538515.localdomain sudo[127664]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:15 np0005538515.localdomain python3.9[127759]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:17:16 np0005538515.localdomain sudo[127849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezxwzcmmfvmzqdtagihbtmncaucvgbia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321435.8998175-227-4020766780925/AnsiballZ_sefcontext.py
Nov 28 09:17:16 np0005538515.localdomain sudo[127849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39728 DF PROTO=TCP SPT=41438 DPT=9100 SEQ=1105269456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC416BA0000000001030307) 
Nov 28 09:17:16 np0005538515.localdomain python3.9[127851]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  Converting 2743 SID table entries...
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:17:18 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:17:18 np0005538515.localdomain sudo[127849]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:19 np0005538515.localdomain python3.9[128026]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:17:20 np0005538515.localdomain sudo[128122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhyrgoyogljfqqdlehxqumkykyhermex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321440.1603231-280-130323995494642/AnsiballZ_dnf.py
Nov 28 09:17:20 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Nov 28 09:17:20 np0005538515.localdomain sudo[128122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39729 DF PROTO=TCP SPT=41438 DPT=9100 SEQ=1105269456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4267A0000000001030307) 
Nov 28 09:17:20 np0005538515.localdomain python3.9[128124]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:23 np0005538515.localdomain sudo[128122]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:24 np0005538515.localdomain sudo[128216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgvajdcdqnirmqwodmwfwmqcyduoxzgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321444.0279133-304-82525318178393/AnsiballZ_command.py
Nov 28 09:17:24 np0005538515.localdomain sudo[128216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:24 np0005538515.localdomain python3.9[128218]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:17:25 np0005538515.localdomain sudo[128216]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:26 np0005538515.localdomain sudo[128461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifjxrimqlvlyjeitduunbmfbuwuaakcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321445.6418931-328-161311241536815/AnsiballZ_file.py
Nov 28 09:17:26 np0005538515.localdomain sudo[128461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:26 np0005538515.localdomain python3.9[128463]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:17:26 np0005538515.localdomain sudo[128461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:26 np0005538515.localdomain python3.9[128553]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:17:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46985 DF PROTO=TCP SPT=35882 DPT=9105 SEQ=2967770997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC441B30000000001030307) 
Nov 28 09:17:27 np0005538515.localdomain sudo[128645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waedesfpiajuglrvilarsqnvoionszpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321447.4302945-382-107732374175283/AnsiballZ_dnf.py
Nov 28 09:17:27 np0005538515.localdomain sudo[128645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:27 np0005538515.localdomain python3.9[128647]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46986 DF PROTO=TCP SPT=35882 DPT=9105 SEQ=2967770997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC445BB0000000001030307) 
Nov 28 09:17:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22094 DF PROTO=TCP SPT=32908 DPT=9882 SEQ=1920140342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC446DB0000000001030307) 
Nov 28 09:17:30 np0005538515.localdomain sudo[128645]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:31 np0005538515.localdomain sudo[128739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygtkxmmtxghyfnnvbqcrlwsfvplzzknb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321451.320201-406-184932994287566/AnsiballZ_dnf.py
Nov 28 09:17:31 np0005538515.localdomain sudo[128739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:31 np0005538515.localdomain python3.9[128741]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:17:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22096 DF PROTO=TCP SPT=32908 DPT=9882 SEQ=1920140342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC452FA0000000001030307) 
Nov 28 09:17:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46988 DF PROTO=TCP SPT=35882 DPT=9105 SEQ=2967770997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC45D7A0000000001030307) 
Nov 28 09:17:34 np0005538515.localdomain sudo[128739]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:35 np0005538515.localdomain sudo[128833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsdsagvciyefeaquvhmxgmkzovuggtbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321455.1975482-430-94760477123184/AnsiballZ_systemd.py
Nov 28 09:17:35 np0005538515.localdomain sudo[128833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:35 np0005538515.localdomain python3.9[128835]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 09:17:36 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:17:36 np0005538515.localdomain systemd-rc-local-generator[128866]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:17:36 np0005538515.localdomain systemd-sysv-generator[128870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:17:37 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:17:37 np0005538515.localdomain sudo[128833]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:37 np0005538515.localdomain sudo[128965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxffrprrnlfnxmsjwpkjjeynfjuyuhlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321457.5478568-460-193694161103406/AnsiballZ_stat.py
Nov 28 09:17:37 np0005538515.localdomain sudo[128965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:38 np0005538515.localdomain python3.9[128967]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:17:38 np0005538515.localdomain sudo[128965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:38 np0005538515.localdomain sudo[129057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxcptqkyksrjdjzdkbiwzksdgtugicuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321458.2685544-487-80081820159432/AnsiballZ_ini_file.py
Nov 28 09:17:38 np0005538515.localdomain sudo[129057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:38 np0005538515.localdomain python3.9[129059]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:39 np0005538515.localdomain sudo[129057]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20253 DF PROTO=TCP SPT=57926 DPT=9102 SEQ=1309689648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC46EFA0000000001030307) 
Nov 28 09:17:39 np0005538515.localdomain sudo[129151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veezofkunjiboaopqtvptwymokkrgmbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321459.2308733-511-5743137217724/AnsiballZ_ini_file.py
Nov 28 09:17:39 np0005538515.localdomain sudo[129151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:39 np0005538515.localdomain python3.9[129153]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:39 np0005538515.localdomain sudo[129151]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:40 np0005538515.localdomain sudo[129243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhkcfnpdxgqqdltnrbnlvfzvbupqxurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321459.9544222-536-239273584254784/AnsiballZ_ini_file.py
Nov 28 09:17:40 np0005538515.localdomain sudo[129243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:40 np0005538515.localdomain python3.9[129245]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:40 np0005538515.localdomain sudo[129243]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:40 np0005538515.localdomain auditd[719]: Audit daemon rotating log files
Nov 28 09:17:41 np0005538515.localdomain sudo[129335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iazoyyrhgmswmjbapgysgphpeibkvlzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321460.8703544-566-112929059883920/AnsiballZ_stat.py
Nov 28 09:17:41 np0005538515.localdomain sudo[129335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:41 np0005538515.localdomain python3.9[129337]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:41 np0005538515.localdomain sudo[129335]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:41 np0005538515.localdomain sudo[129408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdaybpcymudubmdezoyjtgiwzasyvacf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321460.8703544-566-112929059883920/AnsiballZ_copy.py
Nov 28 09:17:41 np0005538515.localdomain sudo[129408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:41 np0005538515.localdomain python3.9[129410]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321460.8703544-566-112929059883920/.source _original_basename=.ylwp0eto follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:41 np0005538515.localdomain sudo[129408]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:42 np0005538515.localdomain sudo[129500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksjhgznwhrqkntfojexjozpepixtwhxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321462.184951-610-24030066623734/AnsiballZ_file.py
Nov 28 09:17:42 np0005538515.localdomain sudo[129500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:42 np0005538515.localdomain python3.9[129502]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:42 np0005538515.localdomain sudo[129500]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46989 DF PROTO=TCP SPT=35882 DPT=9105 SEQ=2967770997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC47CFA0000000001030307) 
Nov 28 09:17:43 np0005538515.localdomain sudo[129592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqnpzybxrbdlkennbyiterfzfevuxlni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321462.8908675-634-47136548046636/AnsiballZ_edpm_os_net_config_mappings.py
Nov 28 09:17:43 np0005538515.localdomain sudo[129592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:43 np0005538515.localdomain python3.9[129594]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 28 09:17:43 np0005538515.localdomain sudo[129592]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:44 np0005538515.localdomain sudo[129684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzbpfxbwmayaqlcintnodlcqqsgfsgso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321463.9100113-661-259445896914211/AnsiballZ_file.py
Nov 28 09:17:44 np0005538515.localdomain sudo[129684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5870 DF PROTO=TCP SPT=37890 DPT=9101 SEQ=1158618655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC482B10000000001030307) 
Nov 28 09:17:44 np0005538515.localdomain python3.9[129686]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:44 np0005538515.localdomain sudo[129684]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:45 np0005538515.localdomain sudo[129776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggzgftwdyqwjpokibzwzksuzegqniadh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321464.757868-691-220683502873993/AnsiballZ_stat.py
Nov 28 09:17:45 np0005538515.localdomain sudo[129776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:45 np0005538515.localdomain python3.9[129778]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:45 np0005538515.localdomain sudo[129776]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:45 np0005538515.localdomain sudo[129849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guekwwykimeklqtpcdeclabqyyicncdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321464.757868-691-220683502873993/AnsiballZ_copy.py
Nov 28 09:17:45 np0005538515.localdomain sudo[129849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:45 np0005538515.localdomain python3.9[129851]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321464.757868-691-220683502873993/.source.yaml _original_basename=.c9ttw3fb follow=False checksum=4c28d1662755c608a6ffaa942e27a2488c0a78a3 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:45 np0005538515.localdomain sudo[129849]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:46 np0005538515.localdomain sudo[129941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqxqwxsddnhfwadpdjfywddxsfmrnjnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321466.04793-736-139367734922417/AnsiballZ_slurp.py
Nov 28 09:17:46 np0005538515.localdomain sudo[129941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43548 DF PROTO=TCP SPT=36578 DPT=9100 SEQ=2429665641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC48BFB0000000001030307) 
Nov 28 09:17:46 np0005538515.localdomain python3.9[129943]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 28 09:17:46 np0005538515.localdomain sudo[129941]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:47 np0005538515.localdomain sudo[130046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fljfmritlmicbxcworyskwxenbzxqzkq ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.097456-763-247709448489443/async_wrapper.py j995671152243 300 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.097456-763-247709448489443/AnsiballZ_edpm_os_net_config.py _
Nov 28 09:17:47 np0005538515.localdomain sudo[130046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:47 np0005538515.localdomain ansible-async_wrapper.py[130048]: Invoked with j995671152243 300 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.097456-763-247709448489443/AnsiballZ_edpm_os_net_config.py _
Nov 28 09:17:47 np0005538515.localdomain ansible-async_wrapper.py[130051]: Starting module and watcher
Nov 28 09:17:47 np0005538515.localdomain ansible-async_wrapper.py[130051]: Start watching 130052 (300)
Nov 28 09:17:47 np0005538515.localdomain ansible-async_wrapper.py[130052]: Start module (130052)
Nov 28 09:17:47 np0005538515.localdomain ansible-async_wrapper.py[130048]: Return async_wrapper task started.
Nov 28 09:17:47 np0005538515.localdomain sudo[130046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:48 np0005538515.localdomain python3.9[130053]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Nov 28 09:17:48 np0005538515.localdomain ansible-async_wrapper.py[130052]: Module complete (130052)
Nov 28 09:17:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43549 DF PROTO=TCP SPT=36578 DPT=9100 SEQ=2429665641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC49BBA0000000001030307) 
Nov 28 09:17:51 np0005538515.localdomain sudo[130143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfzncvoqlhxcdjyuyhjwpizewzdsfnqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321471.0241933-763-70749618710438/AnsiballZ_async_status.py
Nov 28 09:17:51 np0005538515.localdomain sudo[130143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:51 np0005538515.localdomain python3.9[130145]: ansible-ansible.legacy.async_status Invoked with jid=j995671152243.130048 mode=status _async_dir=/root/.ansible_async
Nov 28 09:17:51 np0005538515.localdomain sudo[130143]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:51 np0005538515.localdomain sudo[130202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejrcniehlcatsqugquljarckzgqzsxkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321471.0241933-763-70749618710438/AnsiballZ_async_status.py
Nov 28 09:17:51 np0005538515.localdomain sudo[130202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:52 np0005538515.localdomain python3.9[130204]: ansible-ansible.legacy.async_status Invoked with jid=j995671152243.130048 mode=cleanup _async_dir=/root/.ansible_async
Nov 28 09:17:52 np0005538515.localdomain sudo[130202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:52 np0005538515.localdomain sudo[130294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roytxmiubmwnsgogeysqdwjdcadbuydv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321472.3957-829-179115735629208/AnsiballZ_stat.py
Nov 28 09:17:52 np0005538515.localdomain sudo[130294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:52 np0005538515.localdomain python3.9[130296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:52 np0005538515.localdomain sudo[130294]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:52 np0005538515.localdomain ansible-async_wrapper.py[130051]: Done in kid B.
Nov 28 09:17:53 np0005538515.localdomain sudo[130367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yipsmlzujonqylvbauxrsekgfbzhrkig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321472.3957-829-179115735629208/AnsiballZ_copy.py
Nov 28 09:17:53 np0005538515.localdomain sudo[130367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:53 np0005538515.localdomain python3.9[130369]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321472.3957-829-179115735629208/.source.returncode _original_basename=.qtpdiqhn follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:53 np0005538515.localdomain sudo[130367]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:53 np0005538515.localdomain sudo[130459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsukayuuuyhfgicfnlhclgtfabvokjeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321473.598801-877-179571465957105/AnsiballZ_stat.py
Nov 28 09:17:53 np0005538515.localdomain sudo[130459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:54 np0005538515.localdomain python3.9[130461]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:17:54 np0005538515.localdomain sudo[130459]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:54 np0005538515.localdomain sudo[130532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmcdsgtmgzbrductqiabvplxddmaamuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321473.598801-877-179571465957105/AnsiballZ_copy.py
Nov 28 09:17:54 np0005538515.localdomain sudo[130532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:54 np0005538515.localdomain python3.9[130534]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321473.598801-877-179571465957105/.source.cfg _original_basename=.a89rgtaa follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:17:54 np0005538515.localdomain sudo[130532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:55 np0005538515.localdomain sudo[130624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okvvojinhtbncrozljovislufzkqkzcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321474.8007758-922-167349471764679/AnsiballZ_systemd.py
Nov 28 09:17:55 np0005538515.localdomain sudo[130624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:17:55 np0005538515.localdomain python3.9[130626]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:17:55 np0005538515.localdomain systemd[1]: Reloading Network Manager...
Nov 28 09:17:55 np0005538515.localdomain NetworkManager[5965]: <info>  [1764321475.4543] audit: op="reload" arg="0" pid=130630 uid=0 result="success"
Nov 28 09:17:55 np0005538515.localdomain NetworkManager[5965]: <info>  [1764321475.4554] config: signal: SIGHUP (no changes from disk)
Nov 28 09:17:55 np0005538515.localdomain systemd[1]: Reloaded Network Manager.
Nov 28 09:17:55 np0005538515.localdomain sudo[130624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:17:56 np0005538515.localdomain sshd[126448]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:17:56 np0005538515.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Nov 28 09:17:56 np0005538515.localdomain systemd[1]: session-40.scope: Consumed 35.992s CPU time.
Nov 28 09:17:56 np0005538515.localdomain systemd-logind[763]: Session 40 logged out. Waiting for processes to exit.
Nov 28 09:17:56 np0005538515.localdomain systemd-logind[763]: Removed session 40.
Nov 28 09:17:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2392 DF PROTO=TCP SPT=50554 DPT=9105 SEQ=3377560418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4B6E30000000001030307) 
Nov 28 09:17:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2393 DF PROTO=TCP SPT=50554 DPT=9105 SEQ=3377560418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4BAFB0000000001030307) 
Nov 28 09:17:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54886 DF PROTO=TCP SPT=53636 DPT=9882 SEQ=3097252902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4BC0D0000000001030307) 
Nov 28 09:18:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44109 DF PROTO=TCP SPT=46612 DPT=9105 SEQ=2192743834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4C6FB0000000001030307) 
Nov 28 09:18:02 np0005538515.localdomain sshd[130645]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:03 np0005538515.localdomain sshd[130645]: Accepted publickey for zuul from 192.168.122.31 port 55916 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:18:03 np0005538515.localdomain systemd-logind[763]: New session 41 of user zuul.
Nov 28 09:18:03 np0005538515.localdomain systemd[1]: Started Session 41 of User zuul.
Nov 28 09:18:03 np0005538515.localdomain sshd[130645]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:18:03 np0005538515.localdomain python3.9[130738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2395 DF PROTO=TCP SPT=50554 DPT=9105 SEQ=3377560418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4D2BA0000000001030307) 
Nov 28 09:18:05 np0005538515.localdomain python3.9[130832]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:06 np0005538515.localdomain python3.9[130977]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:18:06 np0005538515.localdomain sshd[130645]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:18:06 np0005538515.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Nov 28 09:18:06 np0005538515.localdomain systemd[1]: session-41.scope: Consumed 2.245s CPU time.
Nov 28 09:18:06 np0005538515.localdomain systemd-logind[763]: Session 41 logged out. Waiting for processes to exit.
Nov 28 09:18:06 np0005538515.localdomain systemd-logind[763]: Removed session 41.
Nov 28 09:18:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41137 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=331650185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4E43A0000000001030307) 
Nov 28 09:18:11 np0005538515.localdomain sshd[130993]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:18:12 np0005538515.localdomain sshd[130993]: Accepted publickey for zuul from 192.168.122.31 port 43730 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:18:12 np0005538515.localdomain systemd-logind[763]: New session 42 of user zuul.
Nov 28 09:18:12 np0005538515.localdomain systemd[1]: Started Session 42 of User zuul.
Nov 28 09:18:12 np0005538515.localdomain sshd[130993]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:18:12 np0005538515.localdomain sudo[131043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:18:12 np0005538515.localdomain sudo[131043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:12 np0005538515.localdomain sudo[131043]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:12 np0005538515.localdomain sudo[131071]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:18:12 np0005538515.localdomain sudo[131071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2396 DF PROTO=TCP SPT=50554 DPT=9105 SEQ=3377560418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4F2FA0000000001030307) 
Nov 28 09:18:13 np0005538515.localdomain python3.9[131116]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:13 np0005538515.localdomain sudo[131071]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:13 np0005538515.localdomain sudo[131229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:18:13 np0005538515.localdomain sudo[131229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:13 np0005538515.localdomain sudo[131229]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:13 np0005538515.localdomain sudo[131257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:18:13 np0005538515.localdomain sudo[131257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:13 np0005538515.localdomain python3.9[131254]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:14 np0005538515.localdomain sudo[131257]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22422 DF PROTO=TCP SPT=44324 DPT=9101 SEQ=264599625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC4F7E10000000001030307) 
Nov 28 09:18:14 np0005538515.localdomain sudo[131384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgsxlcrqaxjubmtnuhzqoweprkiptqql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321494.4414418-82-99044551012852/AnsiballZ_setup.py
Nov 28 09:18:14 np0005538515.localdomain sudo[131384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:15 np0005538515.localdomain python3.9[131386]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:15 np0005538515.localdomain sudo[131384]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:15 np0005538515.localdomain sudo[131438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ercsfpbfodzqtatlplajdgjwctxqpzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321494.4414418-82-99044551012852/AnsiballZ_dnf.py
Nov 28 09:18:15 np0005538515.localdomain sudo[131438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:15 np0005538515.localdomain python3.9[131440]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:18:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27699 DF PROTO=TCP SPT=42218 DPT=9100 SEQ=2855922814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC500FA0000000001030307) 
Nov 28 09:18:17 np0005538515.localdomain sudo[131443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:18:17 np0005538515.localdomain sudo[131443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:18:17 np0005538515.localdomain sudo[131443]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:18 np0005538515.localdomain sudo[131438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:19 np0005538515.localdomain sudo[131547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqtoelruhxewobuoajahucapelpdzbpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321499.3470194-118-182733617289408/AnsiballZ_setup.py
Nov 28 09:18:19 np0005538515.localdomain sudo[131547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:19 np0005538515.localdomain python3.9[131549]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:20 np0005538515.localdomain sudo[131547]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27700 DF PROTO=TCP SPT=42218 DPT=9100 SEQ=2855922814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC510BB0000000001030307) 
Nov 28 09:18:20 np0005538515.localdomain sudo[131694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abgpmznothxlnjsipgwkmnxqdyyouoct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321500.588487-151-75367106384042/AnsiballZ_file.py
Nov 28 09:18:20 np0005538515.localdomain sudo[131694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:21 np0005538515.localdomain python3.9[131696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:21 np0005538515.localdomain sudo[131694]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:21 np0005538515.localdomain sudo[131786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzewvrvnmbbshcmasfkqqlyyqlshtoex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321501.4156122-175-166279927839726/AnsiballZ_command.py
Nov 28 09:18:21 np0005538515.localdomain sudo[131786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:22 np0005538515.localdomain python3.9[131788]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:18:22 np0005538515.localdomain sudo[131786]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:22 np0005538515.localdomain sudo[131889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vadpxtjytzugqyqsbaqwubppvxrnirzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321502.318931-200-13940246392499/AnsiballZ_stat.py
Nov 28 09:18:22 np0005538515.localdomain sudo[131889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:22 np0005538515.localdomain python3.9[131891]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:22 np0005538515.localdomain sudo[131889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:23 np0005538515.localdomain sudo[131937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lokauzgxvxgwngebrvrmhfxmboydufmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321502.318931-200-13940246392499/AnsiballZ_file.py
Nov 28 09:18:23 np0005538515.localdomain sudo[131937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:23 np0005538515.localdomain python3.9[131939]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:23 np0005538515.localdomain sudo[131937]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:23 np0005538515.localdomain sudo[132029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfhegxcpblqgbsbovpzzsfqzjvqgzbbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321503.535913-236-232302530581099/AnsiballZ_stat.py
Nov 28 09:18:23 np0005538515.localdomain sudo[132029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:24 np0005538515.localdomain python3.9[132031]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:24 np0005538515.localdomain sudo[132029]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:24 np0005538515.localdomain sudo[132077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwcateiqdinahefnizxznysfzyzzalit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321503.535913-236-232302530581099/AnsiballZ_file.py
Nov 28 09:18:24 np0005538515.localdomain sudo[132077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:24 np0005538515.localdomain python3.9[132079]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:24 np0005538515.localdomain sudo[132077]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:25 np0005538515.localdomain sudo[132169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqubzobrazbwenahvnjgtcxrkqphwwjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321504.8821602-274-211000020057676/AnsiballZ_ini_file.py
Nov 28 09:18:25 np0005538515.localdomain sudo[132169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:25 np0005538515.localdomain python3.9[132171]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:25 np0005538515.localdomain sudo[132169]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:25 np0005538515.localdomain sudo[132261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjqtftswgpfvcssgfgtrqbicasgzqiid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321505.6438315-274-129027731531518/AnsiballZ_ini_file.py
Nov 28 09:18:25 np0005538515.localdomain sudo[132261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:26 np0005538515.localdomain python3.9[132263]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:26 np0005538515.localdomain sudo[132261]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:26 np0005538515.localdomain sudo[132353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnfkttrelietnemufuzugokcsszxbsqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321506.2270584-274-238541109876034/AnsiballZ_ini_file.py
Nov 28 09:18:26 np0005538515.localdomain sudo[132353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:26 np0005538515.localdomain python3.9[132355]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:26 np0005538515.localdomain sudo[132353]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:27 np0005538515.localdomain sudo[132445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgnsuvicaazxuiebnhbjzgbodcdkonuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321506.8027415-274-203019482513908/AnsiballZ_ini_file.py
Nov 28 09:18:27 np0005538515.localdomain sudo[132445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:27 np0005538515.localdomain python3.9[132447]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:18:27 np0005538515.localdomain sudo[132445]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52032 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=992114278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC52C130000000001030307) 
Nov 28 09:18:27 np0005538515.localdomain sudo[132537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhdrfvnokvopricbzhqzqmnxfjkwqaea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321507.703024-367-211638767235720/AnsiballZ_dnf.py
Nov 28 09:18:27 np0005538515.localdomain sudo[132537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:28 np0005538515.localdomain python3.9[132539]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:18:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52033 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=992114278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5303A0000000001030307) 
Nov 28 09:18:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27701 DF PROTO=TCP SPT=42218 DPT=9100 SEQ=2855922814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC530FA0000000001030307) 
Nov 28 09:18:31 np0005538515.localdomain sudo[132537]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6537 DF PROTO=TCP SPT=35140 DPT=9882 SEQ=3504925525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC53D3A0000000001030307) 
Nov 28 09:18:32 np0005538515.localdomain sudo[132631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlsazcldtyxhpzfospxaramatsjyesjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321511.9679315-400-148591044760977/AnsiballZ_setup.py
Nov 28 09:18:32 np0005538515.localdomain sudo[132631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:32 np0005538515.localdomain python3.9[132633]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:18:32 np0005538515.localdomain sudo[132631]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:33 np0005538515.localdomain sudo[132725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecyzdcmabakwkvvtpwpfxqgmcarzvfgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321513.2090647-424-127620008464537/AnsiballZ_stat.py
Nov 28 09:18:33 np0005538515.localdomain sudo[132725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:33 np0005538515.localdomain python3.9[132727]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:18:33 np0005538515.localdomain sudo[132725]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:18:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:18:34 np0005538515.localdomain sudo[132817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbaachslzcscztkbrzbjvrgukbvfkppf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321513.972109-451-60510865104534/AnsiballZ_stat.py
Nov 28 09:18:34 np0005538515.localdomain sudo[132817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:34 np0005538515.localdomain python3.9[132819]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:18:34 np0005538515.localdomain sudo[132817]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52035 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=992114278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC547FA0000000001030307) 
Nov 28 09:18:35 np0005538515.localdomain sudo[132909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiyjzakmoygqpmnashxaaobhmpnzgief ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321515.4834332-482-201214658958258/AnsiballZ_command.py
Nov 28 09:18:35 np0005538515.localdomain sudo[132909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:35 np0005538515.localdomain python3.9[132911]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:18:35 np0005538515.localdomain sudo[132909]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:36 np0005538515.localdomain sudo[133002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwnitalmumysatlxceqjenqdxpgoprqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321516.2803314-512-248535065818881/AnsiballZ_service_facts.py
Nov 28 09:18:36 np0005538515.localdomain sudo[133002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:36 np0005538515.localdomain python3.9[133004]: ansible-service_facts Invoked
Nov 28 09:18:36 np0005538515.localdomain network[133021]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:18:36 np0005538515.localdomain network[133022]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:18:36 np0005538515.localdomain network[133023]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:18:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:18:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:18:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48615 DF PROTO=TCP SPT=35222 DPT=9102 SEQ=462316261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5593A0000000001030307) 
Nov 28 09:18:39 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:18:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52036 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=992114278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC568FB0000000001030307) 
Nov 28 09:18:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26912 DF PROTO=TCP SPT=60128 DPT=9100 SEQ=1427177641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC56A430000000001030307) 
Nov 28 09:18:43 np0005538515.localdomain sudo[133002]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:44 np0005538515.localdomain sudo[133236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbjeuooxiiresidziscmflixtmuvpkyn ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764321524.4237368-557-25219341912766/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764321524.4237368-557-25219341912766/args
Nov 28 09:18:44 np0005538515.localdomain sudo[133236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:44 np0005538515.localdomain sudo[133236]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:45 np0005538515.localdomain sudo[133343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcvnkfszwmnzsiediyfvyciwlbkxcvam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321525.1121361-590-113769326667866/AnsiballZ_dnf.py
Nov 28 09:18:45 np0005538515.localdomain sudo[133343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:45 np0005538515.localdomain python3.9[133345]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:18:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26914 DF PROTO=TCP SPT=60128 DPT=9100 SEQ=1427177641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5763A0000000001030307) 
Nov 28 09:18:48 np0005538515.localdomain sudo[133343]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:49 np0005538515.localdomain sudo[133437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fowqkprczxfczheqozmezibaxzijlxvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321529.259901-629-157375397417134/AnsiballZ_package_facts.py
Nov 28 09:18:49 np0005538515.localdomain sudo[133437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:50 np0005538515.localdomain python3.9[133439]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 28 09:18:50 np0005538515.localdomain sudo[133437]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26915 DF PROTO=TCP SPT=60128 DPT=9100 SEQ=1427177641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC585FA0000000001030307) 
Nov 28 09:18:51 np0005538515.localdomain sudo[133529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqhmlrywxqminlqabpyvmxanxfcvwljd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321531.0919702-660-118800461441946/AnsiballZ_stat.py
Nov 28 09:18:51 np0005538515.localdomain sudo[133529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:51 np0005538515.localdomain python3.9[133531]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:51 np0005538515.localdomain sudo[133529]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:52 np0005538515.localdomain sudo[133604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfbvpjrqrrtdxhdbkcqaoizljsrerykz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321531.0919702-660-118800461441946/AnsiballZ_copy.py
Nov 28 09:18:52 np0005538515.localdomain sudo[133604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:52 np0005538515.localdomain python3.9[133606]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321531.0919702-660-118800461441946/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:52 np0005538515.localdomain sudo[133604]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:52 np0005538515.localdomain sudo[133698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvvxababomomzhodownpeudizzuxxumg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321532.596823-704-84088194372708/AnsiballZ_stat.py
Nov 28 09:18:52 np0005538515.localdomain sudo[133698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:53 np0005538515.localdomain python3.9[133700]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:18:53 np0005538515.localdomain sudo[133698]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:53 np0005538515.localdomain sudo[133773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amdzhxyayrdlxqfpkadywqbujsnvqqsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321532.596823-704-84088194372708/AnsiballZ_copy.py
Nov 28 09:18:53 np0005538515.localdomain sudo[133773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:53 np0005538515.localdomain python3.9[133775]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321532.596823-704-84088194372708/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:53 np0005538515.localdomain sudo[133773]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:55 np0005538515.localdomain sudo[133867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oahgqdxxwzondmmjtygjjvolexskacpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321534.624042-767-124906602375489/AnsiballZ_lineinfile.py
Nov 28 09:18:55 np0005538515.localdomain sudo[133867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:55 np0005538515.localdomain python3.9[133869]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:18:55 np0005538515.localdomain sudo[133867]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:56 np0005538515.localdomain sudo[133961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygeezwnduxmlpdqyqktzzzvmumnyiacs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321536.3471959-812-266274581110533/AnsiballZ_setup.py
Nov 28 09:18:56 np0005538515.localdomain sudo[133961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:56 np0005538515.localdomain python3.9[133963]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:18:57 np0005538515.localdomain sudo[133961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:18:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37810 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=1448376460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5A1430000000001030307) 
Nov 28 09:18:57 np0005538515.localdomain sudo[134015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znttdlcntknkwqznywfrabwxzirvkwsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321536.3471959-812-266274581110533/AnsiballZ_systemd.py
Nov 28 09:18:57 np0005538515.localdomain sudo[134015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:18:58 np0005538515.localdomain python3.9[134017]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:18:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37811 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=1448376460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5A53A0000000001030307) 
Nov 28 09:18:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7316 DF PROTO=TCP SPT=48208 DPT=9882 SEQ=3142136880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5A66C0000000001030307) 
Nov 28 09:18:59 np0005538515.localdomain sudo[134015]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:00 np0005538515.localdomain sudo[134109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glryszlbfccuukvzxiwagnikzkzwbgep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321540.0009687-862-241687962383167/AnsiballZ_setup.py
Nov 28 09:19:00 np0005538515.localdomain sudo[134109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:00 np0005538515.localdomain python3.9[134111]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:19:00 np0005538515.localdomain sudo[134109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:01 np0005538515.localdomain sudo[134163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyauigmohcdkvcvtoawoarzbceunperp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321540.0009687-862-241687962383167/AnsiballZ_systemd.py
Nov 28 09:19:01 np0005538515.localdomain sudo[134163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:01 np0005538515.localdomain python3.9[134165]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:19:01 np0005538515.localdomain chronyd[26579]: chronyd exiting
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: Stopping NTP client/server...
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: Stopped NTP client/server.
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: Starting NTP client/server...
Nov 28 09:19:01 np0005538515.localdomain chronyd[134173]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 09:19:01 np0005538515.localdomain chronyd[134173]: Frequency -30.412 +/- 0.251 ppm read from /var/lib/chrony/drift
Nov 28 09:19:01 np0005538515.localdomain chronyd[134173]: Loaded seccomp filter (level 2)
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: Started NTP client/server.
Nov 28 09:19:01 np0005538515.localdomain sudo[134163]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2398 DF PROTO=TCP SPT=50554 DPT=9105 SEQ=3377560418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5B0FB0000000001030307) 
Nov 28 09:19:01 np0005538515.localdomain sshd[130993]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Nov 28 09:19:01 np0005538515.localdomain systemd[1]: session-42.scope: Consumed 27.754s CPU time.
Nov 28 09:19:01 np0005538515.localdomain systemd-logind[763]: Session 42 logged out. Waiting for processes to exit.
Nov 28 09:19:01 np0005538515.localdomain systemd-logind[763]: Removed session 42.
Nov 28 09:19:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37813 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=1448376460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5BCFA0000000001030307) 
Nov 28 09:19:07 np0005538515.localdomain sshd[134189]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:19:07 np0005538515.localdomain sshd[134189]: Accepted publickey for zuul from 192.168.122.31 port 40846 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:19:07 np0005538515.localdomain systemd-logind[763]: New session 43 of user zuul.
Nov 28 09:19:07 np0005538515.localdomain systemd[1]: Started Session 43 of User zuul.
Nov 28 09:19:07 np0005538515.localdomain sshd[134189]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:19:08 np0005538515.localdomain python3.9[134282]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:19:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23043 DF PROTO=TCP SPT=34794 DPT=9102 SEQ=1845156581 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5CE7A0000000001030307) 
Nov 28 09:19:09 np0005538515.localdomain sudo[134376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckedpbwfcsexzcmqecjfljovifhvvgrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321549.4480617-61-267207777960504/AnsiballZ_file.py
Nov 28 09:19:09 np0005538515.localdomain sudo[134376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:10 np0005538515.localdomain python3.9[134378]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:10 np0005538515.localdomain sudo[134376]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:10 np0005538515.localdomain sudo[134481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksidejxgkogxegiyyhxnzogebdigevav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321550.275163-85-125854129584851/AnsiballZ_stat.py
Nov 28 09:19:10 np0005538515.localdomain sudo[134481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:10 np0005538515.localdomain python3.9[134483]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:10 np0005538515.localdomain sudo[134481]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:11 np0005538515.localdomain sudo[134529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zimnnmkaqaprulpkmgopavlimtwlssdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321550.275163-85-125854129584851/AnsiballZ_file.py
Nov 28 09:19:11 np0005538515.localdomain sudo[134529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:11 np0005538515.localdomain python3.9[134531]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.hs4czwh6 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:11 np0005538515.localdomain sudo[134529]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:12 np0005538515.localdomain sudo[134621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmzgyewebauvexpvocntgiwqrpkchwle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321552.157387-145-25053830111139/AnsiballZ_stat.py
Nov 28 09:19:12 np0005538515.localdomain sudo[134621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:12 np0005538515.localdomain python3.9[134623]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:12 np0005538515.localdomain sudo[134621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37814 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=1448376460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5DD1A0000000001030307) 
Nov 28 09:19:13 np0005538515.localdomain sudo[134696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edoipqwammvbuunkwrwnenahhupozyig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321552.157387-145-25053830111139/AnsiballZ_copy.py
Nov 28 09:19:13 np0005538515.localdomain sudo[134696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:13 np0005538515.localdomain python3.9[134698]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321552.157387-145-25053830111139/.source _original_basename=.zj4whxlu follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:13 np0005538515.localdomain sudo[134696]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:13 np0005538515.localdomain sudo[134788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzzugdgnkekfbdvbhzpaxfctdupvnngl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321553.6449854-193-202840917138912/AnsiballZ_file.py
Nov 28 09:19:13 np0005538515.localdomain sudo[134788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:14 np0005538515.localdomain python3.9[134790]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:19:14 np0005538515.localdomain sudo[134788]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18770 DF PROTO=TCP SPT=34950 DPT=9101 SEQ=3969529837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5E2400000000001030307) 
Nov 28 09:19:14 np0005538515.localdomain sudo[134880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkhtwvuxahryqxryvbhlmoocjdsnpuci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321554.387282-218-72689948104764/AnsiballZ_stat.py
Nov 28 09:19:14 np0005538515.localdomain sudo[134880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:14 np0005538515.localdomain python3.9[134882]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:14 np0005538515.localdomain sudo[134880]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:15 np0005538515.localdomain sudo[134953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khqpznkojewfefivqnryynkrrmhwzxhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321554.387282-218-72689948104764/AnsiballZ_copy.py
Nov 28 09:19:15 np0005538515.localdomain sudo[134953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:15 np0005538515.localdomain python3.9[134955]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321554.387282-218-72689948104764/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:19:15 np0005538515.localdomain sudo[134953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:15 np0005538515.localdomain sudo[135045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llbdptybxycdafqzgddipymlqryzqwei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321555.4753401-218-87334635381486/AnsiballZ_stat.py
Nov 28 09:19:15 np0005538515.localdomain sudo[135045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:15 np0005538515.localdomain python3.9[135047]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:16 np0005538515.localdomain sudo[135045]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:16 np0005538515.localdomain sudo[135118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdnnvegguvtcdhfljcczmophpvblnvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321555.4753401-218-87334635381486/AnsiballZ_copy.py
Nov 28 09:19:16 np0005538515.localdomain sudo[135118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:16 np0005538515.localdomain python3.9[135120]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321555.4753401-218-87334635381486/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:19:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27201 DF PROTO=TCP SPT=48680 DPT=9100 SEQ=234430106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5EB7A0000000001030307) 
Nov 28 09:19:16 np0005538515.localdomain sudo[135118]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:17 np0005538515.localdomain sudo[135210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olodplikbfkzgwexxtafakhrdoahopfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321556.7702746-304-193730451068488/AnsiballZ_file.py
Nov 28 09:19:17 np0005538515.localdomain sudo[135210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:17 np0005538515.localdomain python3.9[135212]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:17 np0005538515.localdomain sudo[135210]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:17 np0005538515.localdomain sudo[135213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:19:17 np0005538515.localdomain sudo[135213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:17 np0005538515.localdomain sudo[135213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:17 np0005538515.localdomain sudo[135242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:19:17 np0005538515.localdomain sudo[135242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:17 np0005538515.localdomain sudo[135346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blzksfbkiicylzfdtmreivyehsjfwrts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321557.6448386-328-72131090457722/AnsiballZ_stat.py
Nov 28 09:19:17 np0005538515.localdomain sudo[135346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:18 np0005538515.localdomain python3.9[135348]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:18 np0005538515.localdomain sudo[135346]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:18 np0005538515.localdomain sudo[135242]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:18 np0005538515.localdomain sudo[135436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iawdvrqxbsbzainkuwimfefeyqbtqkxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321557.6448386-328-72131090457722/AnsiballZ_copy.py
Nov 28 09:19:18 np0005538515.localdomain sudo[135436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:18 np0005538515.localdomain sudo[135439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:19:18 np0005538515.localdomain sudo[135439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:18 np0005538515.localdomain sudo[135439]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:18 np0005538515.localdomain sudo[135454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 09:19:18 np0005538515.localdomain sudo[135454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:18 np0005538515.localdomain python3.9[135438]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321557.6448386-328-72131090457722/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:18 np0005538515.localdomain sudo[135436]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 2025-11-28 09:19:19.207321098 +0000 UTC m=+0.079643872 container create 6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_spence, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main)
Nov 28 09:19:19 np0005538515.localdomain systemd[1]: Started libpod-conmon-6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991.scope.
Nov 28 09:19:19 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 2025-11-28 09:19:19.173779876 +0000 UTC m=+0.046102660 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 2025-11-28 09:19:19.288172106 +0000 UTC m=+0.160494840 container init 6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_spence, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=)
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 2025-11-28 09:19:19.300134563 +0000 UTC m=+0.172457307 container start 6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_spence, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 2025-11-28 09:19:19.301157494 +0000 UTC m=+0.173480268 container attach 6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_spence, release=553, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:19:19 np0005538515.localdomain friendly_spence[135583]: 167 167
Nov 28 09:19:19 np0005538515.localdomain systemd[1]: libpod-6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991.scope: Deactivated successfully.
Nov 28 09:19:19 np0005538515.localdomain podman[135545]: 2025-11-28 09:19:19.307465188 +0000 UTC m=+0.179787972 container died 6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_spence, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, distribution-scope=public, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7)
Nov 28 09:19:19 np0005538515.localdomain podman[135602]: 2025-11-28 09:19:19.400993767 +0000 UTC m=+0.086340418 container remove 6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_spence, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=553, architecture=x86_64, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Nov 28 09:19:19 np0005538515.localdomain systemd[1]: libpod-conmon-6185c59ccc6b361817e96770c2f80db049cbf66d4b4aae6b455916496b16c991.scope: Deactivated successfully.
Nov 28 09:19:19 np0005538515.localdomain sudo[135627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhksjzmoeuikcbrrcnowjufarmgtjvow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321559.0844204-373-108617403463863/AnsiballZ_stat.py
Nov 28 09:19:19 np0005538515.localdomain sudo[135627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:19 np0005538515.localdomain podman[135641]: 
Nov 28 09:19:19 np0005538515.localdomain python3.9[135633]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:19 np0005538515.localdomain podman[135641]: 2025-11-28 09:19:19.61237214 +0000 UTC m=+0.070035226 container create 732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dhawan, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, release=553, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Nov 28 09:19:19 np0005538515.localdomain sudo[135627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:19 np0005538515.localdomain podman[135641]: 2025-11-28 09:19:19.574140914 +0000 UTC m=+0.031804040 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:19:19 np0005538515.localdomain systemd[1]: Started libpod-conmon-732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da.scope.
Nov 28 09:19:19 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:19:19 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a16a45510d8a0af781d2a925b6188ddea1809f93b477f1e6800b24e01d484c50/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 09:19:19 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a16a45510d8a0af781d2a925b6188ddea1809f93b477f1e6800b24e01d484c50/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:19:19 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a16a45510d8a0af781d2a925b6188ddea1809f93b477f1e6800b24e01d484c50/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:19:19 np0005538515.localdomain podman[135641]: 2025-11-28 09:19:19.70597518 +0000 UTC m=+0.163638246 container init 732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dhawan, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True)
Nov 28 09:19:19 np0005538515.localdomain podman[135641]: 2025-11-28 09:19:19.716634509 +0000 UTC m=+0.174297575 container start 732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dhawan, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, RELEASE=main)
Nov 28 09:19:19 np0005538515.localdomain podman[135641]: 2025-11-28 09:19:19.716776283 +0000 UTC m=+0.174439339 container attach 732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dhawan, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Nov 28 09:19:19 np0005538515.localdomain sudo[135732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chzunxnecxnfrgktrqyynodzfhmrnipg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321559.0844204-373-108617403463863/AnsiballZ_copy.py
Nov 28 09:19:19 np0005538515.localdomain sudo[135732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:20 np0005538515.localdomain python3.9[135736]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321559.0844204-373-108617403463863/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:20 np0005538515.localdomain sudo[135732]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-6964407b0d14846ddd213b8ca0f4ccbf7bc1e50105065c9c28f341fa01c0aa42-merged.mount: Deactivated successfully.
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]: [
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:     {
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "available": false,
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "ceph_device": false,
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "lsm_data": {},
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "lvs": [],
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "path": "/dev/sr0",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "rejected_reasons": [
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "Insufficient space (<5GB)",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "Has a FileSystem"
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         ],
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         "sys_api": {
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "actuators": null,
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "device_nodes": "sr0",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "human_readable_size": "482.00 KB",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "id_bus": "ata",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "model": "QEMU DVD-ROM",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "nr_requests": "2",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "partitions": {},
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "path": "/dev/sr0",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "removable": "1",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "rev": "2.5+",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "ro": "0",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "rotational": "1",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "sas_address": "",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "sas_device_handle": "",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "scheduler_mode": "mq-deadline",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "sectors": 0,
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "sectorsize": "2048",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "size": 493568.0,
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "support_discard": "0",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "type": "disk",
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:             "vendor": "QEMU"
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:         }
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]:     }
Nov 28 09:19:20 np0005538515.localdomain eloquent_dhawan[135670]: ]
Nov 28 09:19:20 np0005538515.localdomain systemd[1]: libpod-732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da.scope: Deactivated successfully.
Nov 28 09:19:20 np0005538515.localdomain podman[135641]: 2025-11-28 09:19:20.564040061 +0000 UTC m=+1.021703197 container died 732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dhawan, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, release=553, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:19:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27202 DF PROTO=TCP SPT=48680 DPT=9100 SEQ=234430106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC5FB3A0000000001030307) 
Nov 28 09:19:20 np0005538515.localdomain systemd[1]: tmp-crun.yo5Dvj.mount: Deactivated successfully.
Nov 28 09:19:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a16a45510d8a0af781d2a925b6188ddea1809f93b477f1e6800b24e01d484c50-merged.mount: Deactivated successfully.
Nov 28 09:19:20 np0005538515.localdomain podman[137201]: 2025-11-28 09:19:20.646721106 +0000 UTC m=+0.076074213 container remove 732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dhawan, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, release=553, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:19:20 np0005538515.localdomain systemd[1]: libpod-conmon-732e0f4686b57e1561c9fbb72b502d15d89c1bffac5406b83f7a3de2b2a3c0da.scope: Deactivated successfully.
Nov 28 09:19:20 np0005538515.localdomain sudo[135454]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:20 np0005538515.localdomain sudo[137258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boeszulbdgfgzfzobhwdcjekivcpcrma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321560.3593814-419-183632538284519/AnsiballZ_systemd.py
Nov 28 09:19:20 np0005538515.localdomain sudo[137258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:21 np0005538515.localdomain python3.9[137260]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:19:21 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:19:21 np0005538515.localdomain systemd-sysv-generator[137288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:19:21 np0005538515.localdomain systemd-rc-local-generator[137285]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:19:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:21 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:19:21 np0005538515.localdomain sudo[137298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:19:21 np0005538515.localdomain systemd-sysv-generator[137342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:19:21 np0005538515.localdomain systemd-rc-local-generator[137336]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:19:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:21 np0005538515.localdomain sudo[137298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:19:21 np0005538515.localdomain sudo[137298]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:21 np0005538515.localdomain systemd[1]: Starting EDPM Container Shutdown...
Nov 28 09:19:21 np0005538515.localdomain systemd[1]: Finished EDPM Container Shutdown.
Nov 28 09:19:21 np0005538515.localdomain sudo[137258]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:22 np0005538515.localdomain sudo[137442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdbqhfxogvkifuuimgvtxwdywaiwylfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321562.0514174-442-253686083774015/AnsiballZ_stat.py
Nov 28 09:19:22 np0005538515.localdomain sudo[137442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:22 np0005538515.localdomain python3.9[137444]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:22 np0005538515.localdomain sudo[137442]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:22 np0005538515.localdomain sudo[137515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boeorzddosaxqhlndnxfhexpstxvyrnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321562.0514174-442-253686083774015/AnsiballZ_copy.py
Nov 28 09:19:22 np0005538515.localdomain sudo[137515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:23 np0005538515.localdomain python3.9[137517]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321562.0514174-442-253686083774015/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:23 np0005538515.localdomain sudo[137515]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:23 np0005538515.localdomain sudo[137607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbvixxqpyrqekezdhvpignlasvjkjgfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321563.3922908-488-70195843921004/AnsiballZ_stat.py
Nov 28 09:19:23 np0005538515.localdomain sudo[137607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:23 np0005538515.localdomain python3.9[137609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:23 np0005538515.localdomain sudo[137607]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:24 np0005538515.localdomain sudo[137680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilcoaqkxnqthohzvyrkiqlvmryxjemdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321563.3922908-488-70195843921004/AnsiballZ_copy.py
Nov 28 09:19:24 np0005538515.localdomain sudo[137680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:24 np0005538515.localdomain python3.9[137682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321563.3922908-488-70195843921004/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:24 np0005538515.localdomain sudo[137680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:24 np0005538515.localdomain sudo[137772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hetsrizkswduqgugeviugzyhsmlnkxrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321564.615036-532-89165624908295/AnsiballZ_systemd.py
Nov 28 09:19:24 np0005538515.localdomain sudo[137772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:25 np0005538515.localdomain python3.9[137774]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:19:25 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:19:25 np0005538515.localdomain systemd-sysv-generator[137802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:19:25 np0005538515.localdomain systemd-rc-local-generator[137797]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:19:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:25 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:19:25 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:19:25 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:19:25 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:19:25 np0005538515.localdomain sudo[137772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:26 np0005538515.localdomain python3.9[137906]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:19:26 np0005538515.localdomain network[137923]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:19:26 np0005538515.localdomain network[137924]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:19:26 np0005538515.localdomain network[137925]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:19:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38718 DF PROTO=TCP SPT=54040 DPT=9105 SEQ=3331519835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC616720000000001030307) 
Nov 28 09:19:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:19:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38719 DF PROTO=TCP SPT=54040 DPT=9105 SEQ=3331519835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC61A7A0000000001030307) 
Nov 28 09:19:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27203 DF PROTO=TCP SPT=48680 DPT=9100 SEQ=234430106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC61AFA0000000001030307) 
Nov 28 09:19:30 np0005538515.localdomain sudo[138125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrjzqzyobkyihdbzmssmpygcjhiedjmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321570.1283126-610-265402222506364/AnsiballZ_stat.py
Nov 28 09:19:30 np0005538515.localdomain sudo[138125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:30 np0005538515.localdomain python3.9[138127]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:30 np0005538515.localdomain sudo[138125]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:31 np0005538515.localdomain sudo[138200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbtlhbqjxgifvmcwxnmkrwbfobrpuwig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321570.1283126-610-265402222506364/AnsiballZ_copy.py
Nov 28 09:19:31 np0005538515.localdomain sudo[138200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:31 np0005538515.localdomain python3.9[138202]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321570.1283126-610-265402222506364/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:31 np0005538515.localdomain sudo[138200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:31 np0005538515.localdomain sudo[138293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dekavtsygvthgruamskvnnnzczxhmoqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321571.484312-656-63110359379016/AnsiballZ_systemd.py
Nov 28 09:19:31 np0005538515.localdomain sudo[138293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52038 DF PROTO=TCP SPT=51260 DPT=9105 SEQ=992114278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC626FB0000000001030307) 
Nov 28 09:19:32 np0005538515.localdomain python3.9[138295]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:19:32 np0005538515.localdomain systemd[1]: Reloading OpenSSH server daemon...
Nov 28 09:19:32 np0005538515.localdomain sshd[117864]: Received SIGHUP; restarting.
Nov 28 09:19:32 np0005538515.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Nov 28 09:19:32 np0005538515.localdomain sshd[117864]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:19:32 np0005538515.localdomain sshd[117864]: Server listening on 0.0.0.0 port 22.
Nov 28 09:19:32 np0005538515.localdomain sshd[117864]: Server listening on :: port 22.
Nov 28 09:19:32 np0005538515.localdomain sudo[138293]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:32 np0005538515.localdomain sudo[138389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmtzidwcslzcxvoogkawvwostszqgcyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321572.364644-679-259915686456667/AnsiballZ_file.py
Nov 28 09:19:32 np0005538515.localdomain sudo[138389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:32 np0005538515.localdomain python3.9[138391]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:32 np0005538515.localdomain sudo[138389]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:33 np0005538515.localdomain sudo[138481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppuumqcvvldzyjpzulyvdjnofvcnrqym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321573.2163177-705-3039262264424/AnsiballZ_stat.py
Nov 28 09:19:33 np0005538515.localdomain sudo[138481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:33 np0005538515.localdomain python3.9[138483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:33 np0005538515.localdomain sudo[138481]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:34 np0005538515.localdomain sudo[138554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbfyehzwesgzkjfkswekxxbuaqarbljw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321573.2163177-705-3039262264424/AnsiballZ_copy.py
Nov 28 09:19:34 np0005538515.localdomain sudo[138554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:34 np0005538515.localdomain python3.9[138556]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321573.2163177-705-3039262264424/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:34 np0005538515.localdomain sudo[138554]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38721 DF PROTO=TCP SPT=54040 DPT=9105 SEQ=3331519835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC6323A0000000001030307) 
Nov 28 09:19:35 np0005538515.localdomain sudo[138646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyaldbmmrwqjzjpbdkkjkafliadnnlur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321575.016702-758-115172268234199/AnsiballZ_timezone.py
Nov 28 09:19:35 np0005538515.localdomain sudo[138646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:35 np0005538515.localdomain python3.9[138648]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 09:19:35 np0005538515.localdomain systemd[1]: Starting Time & Date Service...
Nov 28 09:19:35 np0005538515.localdomain systemd[1]: Started Time & Date Service.
Nov 28 09:19:35 np0005538515.localdomain sudo[138646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:37 np0005538515.localdomain sudo[138742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hajyovouwhumgumbrckppuhxtmuzaqld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321577.0773995-785-241440806583749/AnsiballZ_file.py
Nov 28 09:19:37 np0005538515.localdomain sudo[138742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:37 np0005538515.localdomain python3.9[138744]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:37 np0005538515.localdomain sudo[138742]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:38 np0005538515.localdomain sudo[138834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adpccozcfnvbhprojrmfoihiwfkexeat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321577.7613251-810-155713258236950/AnsiballZ_stat.py
Nov 28 09:19:38 np0005538515.localdomain sudo[138834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:38 np0005538515.localdomain python3.9[138836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:38 np0005538515.localdomain sudo[138834]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:38 np0005538515.localdomain sudo[138907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqqlrysunndactfqgmlautkicaovnnzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321577.7613251-810-155713258236950/AnsiballZ_copy.py
Nov 28 09:19:38 np0005538515.localdomain sudo[138907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:38 np0005538515.localdomain python3.9[138909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321577.7613251-810-155713258236950/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:38 np0005538515.localdomain sudo[138907]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61991 DF PROTO=TCP SPT=59476 DPT=9102 SEQ=1768630148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC643BA0000000001030307) 
Nov 28 09:19:39 np0005538515.localdomain sudo[138999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dojkbhuhmfywannutylsknefnskmfwdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321579.04462-854-185558833826055/AnsiballZ_stat.py
Nov 28 09:19:39 np0005538515.localdomain sudo[138999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:39 np0005538515.localdomain python3.9[139001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:39 np0005538515.localdomain sudo[138999]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:39 np0005538515.localdomain sudo[139072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkbvjcwtakxvyuyzsjxmcjsbxrgjpami ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321579.04462-854-185558833826055/AnsiballZ_copy.py
Nov 28 09:19:39 np0005538515.localdomain sudo[139072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:40 np0005538515.localdomain python3.9[139074]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321579.04462-854-185558833826055/.source.yaml _original_basename=.g6c8i0jw follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:40 np0005538515.localdomain sudo[139072]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:40 np0005538515.localdomain sudo[139164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dezpmjufnprpnkjvamjsmixboomipafg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321580.2987318-898-34897107821847/AnsiballZ_stat.py
Nov 28 09:19:40 np0005538515.localdomain sudo[139164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:40 np0005538515.localdomain python3.9[139166]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:40 np0005538515.localdomain sudo[139164]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:41 np0005538515.localdomain sudo[139239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgszuzlerecnokaeajipktppnhikilzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321580.2987318-898-34897107821847/AnsiballZ_copy.py
Nov 28 09:19:41 np0005538515.localdomain sudo[139239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:41 np0005538515.localdomain python3.9[139241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321580.2987318-898-34897107821847/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:41 np0005538515.localdomain sudo[139239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:42 np0005538515.localdomain sudo[139331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geqbhdayrvwqjcyxftogseokssaxcgrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321581.6372442-944-159835161799355/AnsiballZ_command.py
Nov 28 09:19:42 np0005538515.localdomain sudo[139331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:42 np0005538515.localdomain python3.9[139333]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:19:42 np0005538515.localdomain sudo[139331]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:42 np0005538515.localdomain sudo[139424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyyywuydqllokstoprydrqirjhejzflv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321582.503742-968-81690876106482/AnsiballZ_command.py
Nov 28 09:19:42 np0005538515.localdomain sudo[139424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:42 np0005538515.localdomain python3.9[139426]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:19:43 np0005538515.localdomain sudo[139424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38722 DF PROTO=TCP SPT=54040 DPT=9105 SEQ=3331519835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC652FA0000000001030307) 
Nov 28 09:19:43 np0005538515.localdomain sudo[139517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eensanwewbbcibrhtdkvhllfbkvnlgtf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321583.2287889-991-221288559160803/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:19:43 np0005538515.localdomain sudo[139517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:43 np0005538515.localdomain python3[139519]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:19:43 np0005538515.localdomain sudo[139517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51114 DF PROTO=TCP SPT=35212 DPT=9882 SEQ=1137333229 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC656FA0000000001030307) 
Nov 28 09:19:44 np0005538515.localdomain sudo[139609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnqfzlnapnqizhmfnykfihxfvxlciuhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321584.2228937-1016-153505989421412/AnsiballZ_stat.py
Nov 28 09:19:44 np0005538515.localdomain sudo[139609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:44 np0005538515.localdomain python3.9[139611]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:44 np0005538515.localdomain sudo[139609]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:45 np0005538515.localdomain sudo[139682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nflsztqepcomlobpxfgmhhndihabixoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321584.2228937-1016-153505989421412/AnsiballZ_copy.py
Nov 28 09:19:45 np0005538515.localdomain sudo[139682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:45 np0005538515.localdomain python3.9[139684]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321584.2228937-1016-153505989421412/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:45 np0005538515.localdomain sudo[139682]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:45 np0005538515.localdomain sudo[139774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhithtiqagvygkisdkkjfwpfkfeqobcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321585.5609047-1060-186373686427954/AnsiballZ_stat.py
Nov 28 09:19:45 np0005538515.localdomain sudo[139774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:46 np0005538515.localdomain python3.9[139776]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:46 np0005538515.localdomain sudo[139774]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:46 np0005538515.localdomain sudo[139847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwvwkcdsajxuqawuqokbacpdpsqovwbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321585.5609047-1060-186373686427954/AnsiballZ_copy.py
Nov 28 09:19:46 np0005538515.localdomain sudo[139847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:46 np0005538515.localdomain python3.9[139849]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321585.5609047-1060-186373686427954/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:46 np0005538515.localdomain sudo[139847]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:47 np0005538515.localdomain sudo[139939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-konqulyepsljtjdvytkqllwuvfwgevhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321586.9611557-1106-64615816316335/AnsiballZ_stat.py
Nov 28 09:19:47 np0005538515.localdomain sudo[139939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:47 np0005538515.localdomain python3.9[139941]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:47 np0005538515.localdomain sudo[139939]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61992 DF PROTO=TCP SPT=59476 DPT=9102 SEQ=1768630148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC664FA0000000001030307) 
Nov 28 09:19:47 np0005538515.localdomain sudo[140012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwrlrzgneoxiammnpydnazlkauzhpmkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321586.9611557-1106-64615816316335/AnsiballZ_copy.py
Nov 28 09:19:47 np0005538515.localdomain sudo[140012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:48 np0005538515.localdomain python3.9[140014]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321586.9611557-1106-64615816316335/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:48 np0005538515.localdomain sudo[140012]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:48 np0005538515.localdomain sudo[140104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhpamaqathspqbleuzfqaeszxmbgqjyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321588.2086613-1151-279271851122672/AnsiballZ_stat.py
Nov 28 09:19:48 np0005538515.localdomain sudo[140104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:48 np0005538515.localdomain python3.9[140106]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:48 np0005538515.localdomain sudo[140104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:49 np0005538515.localdomain sudo[140177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkairvftriwgaisakvkzyfijbwvstfxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321588.2086613-1151-279271851122672/AnsiballZ_copy.py
Nov 28 09:19:49 np0005538515.localdomain sudo[140177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:49 np0005538515.localdomain python3.9[140179]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321588.2086613-1151-279271851122672/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:49 np0005538515.localdomain sudo[140177]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:50 np0005538515.localdomain sudo[140269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdxjalvbdqghrluhpgdldtlzwyoeilvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321589.5293438-1196-12187398475216/AnsiballZ_stat.py
Nov 28 09:19:50 np0005538515.localdomain sudo[140269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:50 np0005538515.localdomain python3.9[140271]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:19:50 np0005538515.localdomain sudo[140269]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:50 np0005538515.localdomain sudo[140342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpnmxyyqgujtlhgiraekmjqiefykzdep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321589.5293438-1196-12187398475216/AnsiballZ_copy.py
Nov 28 09:19:50 np0005538515.localdomain sudo[140342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:50 np0005538515.localdomain python3.9[140344]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321589.5293438-1196-12187398475216/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:50 np0005538515.localdomain sudo[140342]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:51 np0005538515.localdomain sudo[140434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbofwfucvymcolsgoejshfwkvctomvjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321591.0249352-1240-89002356720706/AnsiballZ_file.py
Nov 28 09:19:51 np0005538515.localdomain sudo[140434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:51 np0005538515.localdomain python3.9[140436]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:51 np0005538515.localdomain sudo[140434]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:51 np0005538515.localdomain sudo[140526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihatwmzjhkdvvazkckychjuruiaqotzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321591.6919699-1265-184295227354432/AnsiballZ_command.py
Nov 28 09:19:51 np0005538515.localdomain sudo[140526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:52 np0005538515.localdomain python3.9[140528]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:19:52 np0005538515.localdomain sudo[140526]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:52 np0005538515.localdomain sudo[140621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqnodyhdncoqnaymninrcgvzugilztrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321592.3922844-1289-121751470153459/AnsiballZ_blockinfile.py
Nov 28 09:19:52 np0005538515.localdomain sudo[140621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:52 np0005538515.localdomain python3.9[140623]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:53 np0005538515.localdomain sudo[140621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:53 np0005538515.localdomain sudo[140714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-volunsjpayrqjtmilvvmcwnxinrwmqas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321593.3032243-1315-147312196746647/AnsiballZ_file.py
Nov 28 09:19:53 np0005538515.localdomain sudo[140714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:53 np0005538515.localdomain python3.9[140716]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:53 np0005538515.localdomain sudo[140714]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:54 np0005538515.localdomain sudo[140806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adenpvzqbfidljviywflxutypwbjtylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321593.918695-1315-254208238971634/AnsiballZ_file.py
Nov 28 09:19:54 np0005538515.localdomain sudo[140806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:54 np0005538515.localdomain python3.9[140808]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:19:54 np0005538515.localdomain sudo[140806]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:55 np0005538515.localdomain sudo[140898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsejgwecvrldrduxtmjpvbevafzjhjou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321594.585158-1361-815875385289/AnsiballZ_mount.py
Nov 28 09:19:55 np0005538515.localdomain sudo[140898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:55 np0005538515.localdomain python3.9[140900]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 09:19:55 np0005538515.localdomain sudo[140898]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:55 np0005538515.localdomain sudo[140991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-derovybuxkluckreijmfksxtfugafnck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321595.4020865-1361-253915985458779/AnsiballZ_mount.py
Nov 28 09:19:55 np0005538515.localdomain sudo[140991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:19:55 np0005538515.localdomain python3.9[140993]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 09:19:55 np0005538515.localdomain sudo[140991]: pam_unix(sudo:session): session closed for user root
Nov 28 09:19:56 np0005538515.localdomain sshd[134189]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:19:56 np0005538515.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Nov 28 09:19:56 np0005538515.localdomain systemd[1]: session-43.scope: Consumed 28.826s CPU time.
Nov 28 09:19:56 np0005538515.localdomain systemd-logind[763]: Session 43 logged out. Waiting for processes to exit.
Nov 28 09:19:56 np0005538515.localdomain systemd-logind[763]: Removed session 43.
Nov 28 09:19:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37742 DF PROTO=TCP SPT=38210 DPT=9105 SEQ=3067797486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC68BA60000000001030307) 
Nov 28 09:19:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52081 DF PROTO=TCP SPT=58050 DPT=9882 SEQ=2195339610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC690CC0000000001030307) 
Nov 28 09:20:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37816 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=1448376460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC69AFA0000000001030307) 
Nov 28 09:20:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35545 DF PROTO=TCP SPT=58008 DPT=9102 SEQ=2439919783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC69D190000000001030307) 
Nov 28 09:20:02 np0005538515.localdomain sshd[141009]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:02 np0005538515.localdomain sshd[141009]: Accepted publickey for zuul from 192.168.122.30 port 58204 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:02 np0005538515.localdomain systemd-logind[763]: New session 44 of user zuul.
Nov 28 09:20:03 np0005538515.localdomain systemd[1]: Started Session 44 of User zuul.
Nov 28 09:20:03 np0005538515.localdomain sshd[141009]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7322 DF PROTO=TCP SPT=48208 DPT=9882 SEQ=3142136880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC6A0FB0000000001030307) 
Nov 28 09:20:03 np0005538515.localdomain sudo[141102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdxvndslaqihlxojurtjbexjnamybbyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321603.1062315-23-171465775124498/AnsiballZ_tempfile.py
Nov 28 09:20:03 np0005538515.localdomain sudo[141102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:03 np0005538515.localdomain python3.9[141104]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 09:20:03 np0005538515.localdomain sudo[141102]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:05 np0005538515.localdomain sudo[141194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyejeiljhhqotjsqhmwgumdgdrwbfxvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321604.843705-95-86026837038515/AnsiballZ_stat.py
Nov 28 09:20:05 np0005538515.localdomain sudo[141194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:05 np0005538515.localdomain python3.9[141196]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:05 np0005538515.localdomain sudo[141194]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:05 np0005538515.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 09:20:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23046 DF PROTO=TCP SPT=34794 DPT=9102 SEQ=1845156581 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC6ACFB0000000001030307) 
Nov 28 09:20:06 np0005538515.localdomain sudo[141290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eujvrbwwggfpmhueaujoqsfrdnonyhzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321606.2753694-144-94536931298675/AnsiballZ_slurp.py
Nov 28 09:20:06 np0005538515.localdomain sudo[141290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:06 np0005538515.localdomain python3.9[141292]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 28 09:20:06 np0005538515.localdomain sudo[141290]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:08 np0005538515.localdomain sudo[141382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eedsrouacuzisngkldyrckmxbedmvsix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321607.794715-192-227701633114446/AnsiballZ_stat.py
Nov 28 09:20:08 np0005538515.localdomain sudo[141382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:08 np0005538515.localdomain python3.9[141384]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.nrwpmgem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:20:08 np0005538515.localdomain sudo[141382]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:08 np0005538515.localdomain sudo[141457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plsmxsbyqeucyqhdisecbgkgcqvoqxph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321607.794715-192-227701633114446/AnsiballZ_copy.py
Nov 28 09:20:08 np0005538515.localdomain sudo[141457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:08 np0005538515.localdomain python3.9[141459]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.nrwpmgem mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321607.794715-192-227701633114446/.source.nrwpmgem _original_basename=.ole86qdk follow=False checksum=37b6ce2b006ecd64876d6796769d1ed663c9f074 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:08 np0005538515.localdomain sudo[141457]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:10 np0005538515.localdomain sudo[141549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybxfgoztdgwtaxgwcwrlzponqbzwqqtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321610.3007936-281-96339769463045/AnsiballZ_setup.py
Nov 28 09:20:10 np0005538515.localdomain sudo[141549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:11 np0005538515.localdomain python3.9[141551]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:11 np0005538515.localdomain sudo[141549]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:12 np0005538515.localdomain sudo[141641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fctpkcpdtthbdwkquzskyydfyedqygth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321612.0827267-330-146301769509288/AnsiballZ_blockinfile.py
Nov 28 09:20:12 np0005538515.localdomain sudo[141641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:12 np0005538515.localdomain python3.9[141643]: ansible-ansible.builtin.blockinfile Invoked with block=np0005538513.localdomain,192.168.122.106,np0005538513* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCToHi/c1OL/UxMWy2v/t0tcvSlMeoKa6EPBYbcu51p2Gn2UxEPgCRLM9+84Smh2pxAR4Y/5LVm2lbZ9Gf4okHGg5GLIyqzxxqbQHyR+YRljujVEOvksUPuKCptzx9fQj2Ij2t9GPGHc5klgGPIKjx0pza8T37vdz+G9y7zuK5wWI66AeN8y/6dD2hvi1Lp94VRSvTTEo+nUOFSIgsOwqQO+ZSwTgjG1pmtESBe8nkhW0I0BQPX46v9f1PN1LXDg8cN2FSVjQ91RI0uCvTaBYJ3soFBFspgiJ113zapbQCaNwg7lK7ofS0QT5WONP3QIsDAq1gSpWuOdS2DRY4NU3WMd4m5tLbj+ubiWr39rNU/zQiEl8r38aiM0OwOfuQ9S8wxO7phpVCQrbOkYCLLijdy/xTODvP+jYohTMWX8Gh6IVeVtm6SB2Tw3lDBCjpqlclCSs905Xe+mTJ6WYTaz+Q1xgflKEeemzJ0+rt+QZbrmL7u5MUdf/l/yOLAgACNsws=
                                                            np0005538513.localdomain,192.168.122.106,np0005538513* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAcFP+DjLmcEEAm8Lwvxl6FPIO6oOWnH/RhIcXcMqT1F
                                                            np0005538513.localdomain,192.168.122.106,np0005538513* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBCKBYRInRUdTiZ6KYKN+DMW+w3dTbv2b2ZRO5doLdo2BjNWxCzSevWq4Ptdwg4i7AwfVsH37MVU5ijvc8yJB7o=
                                                            np0005538512.localdomain,192.168.122.105,np0005538512* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCy9/gxqH+eMqafXwUuPf+1Clpw4qsugdFefisnCDhJ5U7Pc+eWMUQVMS0ErxabBJhneDOyPwXwIbv72cEAtmgfvHDlSuS3mt8LRzKqsv1dXTy4Zqb3JGVzrvxo0iczGRsn2MIDJUv/Zjq9YqVeCnDj2HOwV+qx+EFecEFXS797FxsnMmTw0A5z8yUtBuJEGAKQX96LpZc4k5ltq+Uy0rK85Kk7cGR4A+wrIChLC8wggxvA99NdPEBtne6Chb+3PcbYUcTGhGtV6FGzpgbWmuWT/gcANb+fJE5/4n87loLmBMsmvGhvQuN9kuJ20g6nwPJbPTpIbV6XALx4tbma68bL3RL+lcGlh3jf0pEXPfolrB/MRmJn5ggMLjRv50FrowQalnCEgWE0gtd9IGjmqFz3jP008bGotn9rcacbjC2AvE+5NEjp7TzXGnFcD6jW8+9AWiusCww4ULs/oWbi0GLkmhwU5EifitDYF2+r1CigAdlEjb6sa0wAQSmclWk6guM=
                                                            np0005538512.localdomain,192.168.122.105,np0005538512* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIK28mLCPVbXy0OXsvv/yFemdmkq0TouDg2F8iIBtrFNP
                                                            np0005538512.localdomain,192.168.122.105,np0005538512* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOAaBZ7v0nx9ZqEqgPbFZS0ak6RTWK6bkXL/jWgEJnhpVMoiRYOxmcwlW3qCW0ftaWYgMItu1j7anWibS+umVXI=
                                                            np0005538511.localdomain,192.168.122.104,np0005538511* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDqtmgm0KAOOIJ7a8whlZPfasnwJpfcm6zVmQjiKHZZrcojE/a6oALfufKXbfWWiLjJ2VzyK9v7QPNXhIWxgAKT9J40A1lSpSAmmxMaWvy+hzzvePs0Z4Fc4bFX7V4zBGI+dAJ+eAu73z6OKNuMhxBrL46ejpRFbqjwBP3veWRiLOMbyPn+Wc+amop0p1eEzV2QHMAIC5Dwm6/tYNLixNSa/Ea0ciaY3jWii+IGhYy+wqQP+9qkoVf9bZ4Ewa+7UfXI/q4zvvic/Znb8ZpCpezLnH4ilBORLyV9r/wkkkVGY7UVgUdSoLVjzTGQAtHl2ZgA3zJ2F2ES9QcBEvrHygT4vGgtEaxQn8XFhBwhzCpPaLyXti/6d+8M36cJx+7gv1eEfDgLz3MNR+tcnFSew9N6dIN4afV0DvA/9FsWk8PTqddN4iHcZzRo0GiDJWNtB+gYVZOytTYMZm2Cyv59IthEzxaB+wTZoSdCeuEeTM0ohYspOKirIPqMPuCbGbtrJFE=
                                                            np0005538511.localdomain,192.168.122.104,np0005538511* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOw3UAOk5rmRZZUABN/csr2bxG0kPuwFOfnLWM0dbphK
                                                            np0005538511.localdomain,192.168.122.104,np0005538511* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOporAXIBWakUq++II8S8bptvpP8um9hXQ1t0EGSEC6CKLIa5aENxiSz3hPWhpfOMIda2pAiC8tHJ/ctg1cA7bI=
                                                            np0005538515.localdomain,192.168.122.108,np0005538515* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbwc/gBZF5hmsFU6BSK1/DT5hduj2+3ukzoCGLU6mgpBv7BiInN7vVOqXilL+QUAWOvfKTekaQe1Vv/2jpygQnlu6MEMopmac/36IfVjgt39zxCULfSWv3Gp8tLP0ATF2LfhHBWFrGX7G3Bg3AiNfIUnQIQadBaKIByl+FfA7nJ7phwBAwJaQxvByGDeMwC2CWIUPgVqKclcw1WmldPnNmwquLlCbAeMV2hHlBfnVk8BI6fsOUcBB6a05zRpJpbrl584F+qkiQX0RpZYJQdZCoLiJStJv39lYhgiAWChUOVJsCbeNQnC9/Xgs5JhmRESgXh7Tm+8UNW9DxSHN7BS5qKYPUULdjobSp2v9pFOx30MLMsNd5r3JE07pgm5PpjuviSGEvJ8DIAPTF3kUXM43wax1q9rGV4ZfoJiLAwS9CmWWDWZDg17cnC5z+3qi+K8HUKz8LxQCHI+yEtTFzUEYyXTQfQbNvkauEHI/PwFA1iC+4/2g/0UhtjkM+FO2Czwk=
                                                            np0005538515.localdomain,192.168.122.108,np0005538515* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPVAkJQTOfLnB4ufl+yfJWTOwj/+yeZMYj9KPcqQhG41
                                                            np0005538515.localdomain,192.168.122.108,np0005538515* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJxSQcYu8iH02KDWynHrNs+wu90XfG3ktCJ/ydvMFl7Khrh5CImI23f+XeJr4A7okpxJw7hhtVd+bcWjM/VGibU=
                                                            np0005538510.localdomain,192.168.122.103,np0005538510* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxqgPHnyGChl6yd1/HRo8ox+w8llSVhIj8iYUdDG7IquyLr4/CZguZzRkngbXi/Dq544iKS4kFL/zPKi+yuxeFs4b6fgo4vGoV8wwKNSJXx0d0hOQa9651VqB6k/trENRTgLa2fHkXgF+/g0f7HvloQfhr7qjhTBRV4l4UfJiOEpMvMxN6map/D0JuHlAZGZ5mGUoBTEMuPGEPvMWqe0kc/I8WIgsMsvijOGM2xDxsOqAYlV9a8faoyMdacWUNkeQTfPF6h+z8xdvP8qWPtrPKWHMpcGicTI6pFZ2JxOjWnMaBXs2j/CN7HFLbyOCwuAvAu9efAbxJvgtZlO++6kSlq7SHMzwv7PLP69GaQJHR+jANJ/O2BchbxL09mIkpFSzLSS0k7xXJlwqnAMciIlTaud2n5Hqnnb06WgtvD6O0nnuCLH5am7F1YDGJJgUmNbbgF1PuwzOZqQy+tA2igji/n2z87KkGZdIbrHdPU1PPIlzVGPO6aO02RhvtD+/iQM=
                                                            np0005538510.localdomain,192.168.122.103,np0005538510* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO316T0CvGWuEUZtluJgtZ9ZZEUIgwqLNzmYcEgwx90d
                                                            np0005538510.localdomain,192.168.122.103,np0005538510* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFs0shW57fSaFIES4CjKi1hUQjnXLq99+vhyRfpt8xn5+tcCwnrhlVxDAoMMHaxjmVGblslVcZ1lb3oEH51GZuE=
                                                            np0005538514.localdomain,192.168.122.107,np0005538514* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLIqwhlOevSQuHXF0nrkLOzRoSQqnWWb/cXzK4um93clqGujVOE9PUyL6ONBo/qlr4Pp+QMzSsIFwjW1T/6G+Ce2CS/TGphIUxvvB9NhBt+OJl/zDUEmjAU6bwVIx6ApqtimsXWWIap9GEtVWA5P9pcqPMyGzq1mCzwCS252Ylioij0zZxfMrxTt3RSsWrDED61vRes0ZKd8HERTLN+Lzis5t8f74zfwTesOea6CRkIHth4cUP7ua3q+KhhbhIPj+fXWN5w+qVbcTMJSYyUPsZ2ymPhR4x4db1oPk1Jg14dw1BnmAZZl3v8o4l7bUQ2Fj/PE1JbSiApxbK+V0KdZGMrG4iVbnMmzwBXPXHa6lNQGneflVd3MNEepnTnXx4hAVpaJHc8EtIREq8aPe07DW0wL9clpTKaSGU2Ma+BLXmSDPkuPh6JWLxn3iM1yybL574NnGt2MgBj6z2tiSb4NkNmaBkoG8PMHw8YUSabKBBZNiMEO2GKBpHZldSrYvOZHU=
                                                            np0005538514.localdomain,192.168.122.107,np0005538514* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINhivqz2RYo1kKlRUCCEwVKn/fRbUXKh+9HKcoRBbRik
                                                            np0005538514.localdomain,192.168.122.107,np0005538514* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEj7Mfl3DOkiBgUjao8Ey8r/pUITSMDHIaEViUpgeShgnNz3/omNuAseQqHK6/tA9gN/Uo8Pq1wRSxeBtUVD++U=
                                                             create=True mode=0644 path=/tmp/ansible.nrwpmgem state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:12 np0005538515.localdomain sudo[141641]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22627 DF PROTO=TCP SPT=44208 DPT=9100 SEQ=3514915242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC6C9D20000000001030307) 
Nov 28 09:20:13 np0005538515.localdomain sudo[141733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvtwqwhulgmsqkxpszwjyejjdqjuhapp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321613.470766-378-122394318604977/AnsiballZ_command.py
Nov 28 09:20:13 np0005538515.localdomain sudo[141733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:14 np0005538515.localdomain python3.9[141735]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.nrwpmgem' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:14 np0005538515.localdomain sudo[141733]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61585 DF PROTO=TCP SPT=38078 DPT=9101 SEQ=1927108180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC6CCA00000000001030307) 
Nov 28 09:20:16 np0005538515.localdomain sudo[141827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugqvtpntcajkkqxiknndicoihsysqlsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321615.5971196-427-239172705126326/AnsiballZ_file.py
Nov 28 09:20:16 np0005538515.localdomain sudo[141827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:16 np0005538515.localdomain python3.9[141829]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.nrwpmgem state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:16 np0005538515.localdomain sudo[141827]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:17 np0005538515.localdomain sshd[141009]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:20:17 np0005538515.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Nov 28 09:20:17 np0005538515.localdomain systemd[1]: session-44.scope: Consumed 4.157s CPU time.
Nov 28 09:20:17 np0005538515.localdomain systemd-logind[763]: Session 44 logged out. Waiting for processes to exit.
Nov 28 09:20:17 np0005538515.localdomain systemd-logind[763]: Removed session 44.
Nov 28 09:20:21 np0005538515.localdomain sudo[141844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:20:21 np0005538515.localdomain sudo[141844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:20:21 np0005538515.localdomain sudo[141844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:21 np0005538515.localdomain sudo[141859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:20:21 np0005538515.localdomain sudo[141859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:20:22 np0005538515.localdomain sudo[141859]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:24 np0005538515.localdomain sshd[141907]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:24 np0005538515.localdomain sshd[141907]: Accepted publickey for zuul from 192.168.122.30 port 41032 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:24 np0005538515.localdomain systemd-logind[763]: New session 45 of user zuul.
Nov 28 09:20:24 np0005538515.localdomain systemd[1]: Started Session 45 of User zuul.
Nov 28 09:20:24 np0005538515.localdomain sshd[141907]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:25 np0005538515.localdomain sudo[141970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:20:25 np0005538515.localdomain sudo[141970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:20:25 np0005538515.localdomain sudo[141970]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:25 np0005538515.localdomain python3.9[142015]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:26 np0005538515.localdomain sudo[142109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqmcytjroaqlkpcesfprbhwtyberpmyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321626.3272152-58-118307143432718/AnsiballZ_systemd.py
Nov 28 09:20:26 np0005538515.localdomain sudo[142109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:27 np0005538515.localdomain python3.9[142111]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 09:20:27 np0005538515.localdomain sudo[142109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43824 DF PROTO=TCP SPT=43266 DPT=9105 SEQ=3828650478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC700D30000000001030307) 
Nov 28 09:20:28 np0005538515.localdomain sudo[142203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vexgjzalgbkcdnmrcarjvgwwxexmdgun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321628.5509353-82-59431132884467/AnsiballZ_systemd.py
Nov 28 09:20:28 np0005538515.localdomain sudo[142203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33956 DF PROTO=TCP SPT=60000 DPT=9882 SEQ=3871908273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC705FC0000000001030307) 
Nov 28 09:20:29 np0005538515.localdomain python3.9[142205]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:20:29 np0005538515.localdomain sudo[142203]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:29 np0005538515.localdomain sudo[142296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-almbpuhwdldmkieatyueruhvsgocycmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321629.462187-109-145669901385461/AnsiballZ_command.py
Nov 28 09:20:29 np0005538515.localdomain sudo[142296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:30 np0005538515.localdomain python3.9[142298]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:30 np0005538515.localdomain sudo[142296]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:30 np0005538515.localdomain sudo[142389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etujbgzjesaygpbbgwodhaaokccgdfyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321630.2832134-133-95254172735910/AnsiballZ_stat.py
Nov 28 09:20:30 np0005538515.localdomain sudo[142389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:30 np0005538515.localdomain python3.9[142391]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:30 np0005538515.localdomain sudo[142389]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:31 np0005538515.localdomain sudo[142483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljpgsoycpqkxrafmambzbkvbecjrrtsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321631.1043372-157-73141836636506/AnsiballZ_command.py
Nov 28 09:20:31 np0005538515.localdomain sudo[142483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:31 np0005538515.localdomain python3.9[142485]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:31 np0005538515.localdomain sudo[142483]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42022 DF PROTO=TCP SPT=49360 DPT=9102 SEQ=3740878924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC712490000000001030307) 
Nov 28 09:20:32 np0005538515.localdomain sudo[142578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iooqepyruwdikmgmesuclqlgqluxeosg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321631.8134212-182-278592652118099/AnsiballZ_file.py
Nov 28 09:20:32 np0005538515.localdomain sudo[142578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:32 np0005538515.localdomain python3.9[142580]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:32 np0005538515.localdomain sudo[142578]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:32 np0005538515.localdomain sshd[141907]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:20:32 np0005538515.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Nov 28 09:20:32 np0005538515.localdomain systemd[1]: session-45.scope: Consumed 3.831s CPU time.
Nov 28 09:20:32 np0005538515.localdomain systemd-logind[763]: Session 45 logged out. Waiting for processes to exit.
Nov 28 09:20:32 np0005538515.localdomain systemd-logind[763]: Removed session 45.
Nov 28 09:20:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42023 DF PROTO=TCP SPT=49360 DPT=9102 SEQ=3740878924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7163B0000000001030307) 
Nov 28 09:20:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42024 DF PROTO=TCP SPT=49360 DPT=9102 SEQ=3740878924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC71E3A0000000001030307) 
Nov 28 09:20:38 np0005538515.localdomain sshd[142595]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:38 np0005538515.localdomain sshd[142595]: Accepted publickey for zuul from 192.168.122.30 port 48132 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:38 np0005538515.localdomain systemd-logind[763]: New session 46 of user zuul.
Nov 28 09:20:38 np0005538515.localdomain systemd[1]: Started Session 46 of User zuul.
Nov 28 09:20:38 np0005538515.localdomain sshd[142595]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42025 DF PROTO=TCP SPT=49360 DPT=9102 SEQ=3740878924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC72DFA0000000001030307) 
Nov 28 09:20:39 np0005538515.localdomain python3.9[142688]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:40 np0005538515.localdomain sudo[142782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umdorfgsvfmrcoizddpxlfohhcrtqllr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321640.3293667-64-179324081037798/AnsiballZ_setup.py
Nov 28 09:20:40 np0005538515.localdomain sudo[142782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:40 np0005538515.localdomain python3.9[142784]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:20:41 np0005538515.localdomain sudo[142782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:41 np0005538515.localdomain sudo[142836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tshwwqltrimnumifaxmnhzxiqlfighav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321640.3293667-64-179324081037798/AnsiballZ_dnf.py
Nov 28 09:20:41 np0005538515.localdomain sudo[142836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:41 np0005538515.localdomain python3.9[142838]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 09:20:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15236 DF PROTO=TCP SPT=44444 DPT=9100 SEQ=2418341388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC73F010000000001030307) 
Nov 28 09:20:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61001 DF PROTO=TCP SPT=37704 DPT=9101 SEQ=1634436327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC741D00000000001030307) 
Nov 28 09:20:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15237 DF PROTO=TCP SPT=44444 DPT=9100 SEQ=2418341388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC742FA0000000001030307) 
Nov 28 09:20:44 np0005538515.localdomain sudo[142836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:45 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61002 DF PROTO=TCP SPT=37704 DPT=9101 SEQ=1634436327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC745BA0000000001030307) 
Nov 28 09:20:45 np0005538515.localdomain python3.9[142930]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:20:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15238 DF PROTO=TCP SPT=44444 DPT=9100 SEQ=2418341388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC74AFA0000000001030307) 
Nov 28 09:20:46 np0005538515.localdomain sudo[143021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrkddjlclaerugodsecrqlnauceybvsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321646.5882416-128-137299919281288/AnsiballZ_file.py
Nov 28 09:20:46 np0005538515.localdomain sudo[143021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:47 np0005538515.localdomain python3.9[143023]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:47 np0005538515.localdomain sudo[143021]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61003 DF PROTO=TCP SPT=37704 DPT=9101 SEQ=1634436327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC74DBB0000000001030307) 
Nov 28 09:20:47 np0005538515.localdomain sudo[143113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zljzujkmoqfgiqozwhasgkbkwzohcaiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321647.410572-151-214490585462573/AnsiballZ_file.py
Nov 28 09:20:47 np0005538515.localdomain sudo[143113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:47 np0005538515.localdomain python3.9[143115]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:47 np0005538515.localdomain sudo[143113]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:48 np0005538515.localdomain sudo[143205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqpcdbbxdgzapyowwjvogccclalxrazm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321648.0925574-176-207537578614692/AnsiballZ_lineinfile.py
Nov 28 09:20:48 np0005538515.localdomain sudo[143205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:20:48 np0005538515.localdomain python3.9[143207]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:20:48 np0005538515.localdomain sudo[143205]: pam_unix(sudo:session): session closed for user root
Nov 28 09:20:49 np0005538515.localdomain python3.9[143297]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:20:50 np0005538515.localdomain python3.9[143387]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15239 DF PROTO=TCP SPT=44444 DPT=9100 SEQ=2418341388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC75ABB0000000001030307) 
Nov 28 09:20:50 np0005538515.localdomain python3.9[143479]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:20:51 np0005538515.localdomain sshd[142595]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:20:51 np0005538515.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Nov 28 09:20:51 np0005538515.localdomain systemd[1]: session-46.scope: Consumed 8.693s CPU time.
Nov 28 09:20:51 np0005538515.localdomain systemd-logind[763]: Session 46 logged out. Waiting for processes to exit.
Nov 28 09:20:51 np0005538515.localdomain systemd-logind[763]: Removed session 46.
Nov 28 09:20:56 np0005538515.localdomain sshd[143494]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:20:57 np0005538515.localdomain sshd[143494]: Accepted publickey for zuul from 192.168.122.30 port 43494 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:20:57 np0005538515.localdomain systemd-logind[763]: New session 47 of user zuul.
Nov 28 09:20:57 np0005538515.localdomain systemd[1]: Started Session 47 of User zuul.
Nov 28 09:20:57 np0005538515.localdomain sshd[143494]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:20:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41174 DF PROTO=TCP SPT=60178 DPT=9105 SEQ=1447876135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC776030000000001030307) 
Nov 28 09:20:58 np0005538515.localdomain python3.9[143587]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:20:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41175 DF PROTO=TCP SPT=60178 DPT=9105 SEQ=1447876135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC779FA0000000001030307) 
Nov 28 09:20:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15240 DF PROTO=TCP SPT=44444 DPT=9100 SEQ=2418341388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC77AFA0000000001030307) 
Nov 28 09:21:00 np0005538515.localdomain sudo[143681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cireodengutljyyfdlzysbxfgclvluvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321660.5061896-159-93240192249065/AnsiballZ_file.py
Nov 28 09:21:00 np0005538515.localdomain sudo[143681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:00 np0005538515.localdomain python3.9[143683]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:00 np0005538515.localdomain sudo[143681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:01 np0005538515.localdomain sudo[143773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-feglqynnzqcwtwhcadmtknturxgzgcuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321661.155385-182-96870790585131/AnsiballZ_stat.py
Nov 28 09:21:01 np0005538515.localdomain sudo[143773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:01 np0005538515.localdomain python3.9[143775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:01 np0005538515.localdomain sudo[143773]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31280 DF PROTO=TCP SPT=43530 DPT=9882 SEQ=4129427140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7873A0000000001030307) 
Nov 28 09:21:02 np0005538515.localdomain sudo[143846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhfkfvmtqvoibiwqwqtpnnimqrwvzksr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321661.155385-182-96870790585131/AnsiballZ_copy.py
Nov 28 09:21:02 np0005538515.localdomain sudo[143846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:02 np0005538515.localdomain python3.9[143848]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321661.155385-182-96870790585131/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:02 np0005538515.localdomain sudo[143846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:03 np0005538515.localdomain sudo[143938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfqqpnqacvmdjbxtvpdmqcxkjebguvrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321663.226793-230-247970929156271/AnsiballZ_file.py
Nov 28 09:21:03 np0005538515.localdomain sudo[143938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:03 np0005538515.localdomain python3.9[143940]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:03 np0005538515.localdomain sudo[143938]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:04 np0005538515.localdomain sudo[144030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smzyqctsbmxienzsjeufkortxdlyxkse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321663.8510077-254-246269642521288/AnsiballZ_stat.py
Nov 28 09:21:04 np0005538515.localdomain sudo[144030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:04 np0005538515.localdomain python3.9[144032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:04 np0005538515.localdomain sudo[144030]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:04 np0005538515.localdomain sudo[144103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbfahduxgqhgedjbnbrocnsvsxcftcqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321663.8510077-254-246269642521288/AnsiballZ_copy.py
Nov 28 09:21:04 np0005538515.localdomain sudo[144103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41177 DF PROTO=TCP SPT=60178 DPT=9105 SEQ=1447876135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC791BA0000000001030307) 
Nov 28 09:21:04 np0005538515.localdomain python3.9[144105]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321663.8510077-254-246269642521288/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:04 np0005538515.localdomain sudo[144103]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:05 np0005538515.localdomain sudo[144195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etzigzbqiwnatnvqldckcolrjtvfpcae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321665.057825-300-152153874817888/AnsiballZ_file.py
Nov 28 09:21:05 np0005538515.localdomain sudo[144195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:05 np0005538515.localdomain python3.9[144197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:05 np0005538515.localdomain sudo[144195]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:05 np0005538515.localdomain sudo[144287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdelzrwcfbbktxkbelopapnfpujlwbxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321665.6413279-324-186230067059145/AnsiballZ_stat.py
Nov 28 09:21:05 np0005538515.localdomain sudo[144287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:06 np0005538515.localdomain python3.9[144289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:06 np0005538515.localdomain sudo[144287]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:06 np0005538515.localdomain sudo[144360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgaljqldlditpaokggjjtthelmszznpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321665.6413279-324-186230067059145/AnsiballZ_copy.py
Nov 28 09:21:06 np0005538515.localdomain sudo[144360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:06 np0005538515.localdomain python3.9[144362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321665.6413279-324-186230067059145/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:06 np0005538515.localdomain sudo[144360]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:07 np0005538515.localdomain sudo[144452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccksiycdmwjwlnupxzyuavcytbdmsowc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321666.8521032-370-13129467953772/AnsiballZ_file.py
Nov 28 09:21:07 np0005538515.localdomain sudo[144452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:07 np0005538515.localdomain python3.9[144454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:07 np0005538515.localdomain sudo[144452]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:07 np0005538515.localdomain sudo[144544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajlgyaresksfhxgbgnatiwcehxruiycb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321667.573376-396-230623098768221/AnsiballZ_stat.py
Nov 28 09:21:07 np0005538515.localdomain sudo[144544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:08 np0005538515.localdomain python3.9[144546]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:08 np0005538515.localdomain sudo[144544]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:08 np0005538515.localdomain sudo[144617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ditsfxsofxqzzlrgchvlkywltjajbkqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321667.573376-396-230623098768221/AnsiballZ_copy.py
Nov 28 09:21:08 np0005538515.localdomain sudo[144617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:08 np0005538515.localdomain python3.9[144619]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321667.573376-396-230623098768221/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:08 np0005538515.localdomain sudo[144617]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20592 DF PROTO=TCP SPT=52212 DPT=9102 SEQ=434276050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7A33A0000000001030307) 
Nov 28 09:21:09 np0005538515.localdomain sudo[144709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vslxjdudwekacjaahkgusaxwdrswfovo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321668.932199-445-78411282135014/AnsiballZ_file.py
Nov 28 09:21:09 np0005538515.localdomain sudo[144709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:09 np0005538515.localdomain python3.9[144711]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:09 np0005538515.localdomain sudo[144709]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:09 np0005538515.localdomain sudo[144801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lozslazrnplifbxdtbwmulmybcgmzozg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321669.6399689-468-72087412119558/AnsiballZ_stat.py
Nov 28 09:21:09 np0005538515.localdomain sudo[144801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:10 np0005538515.localdomain python3.9[144803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:10 np0005538515.localdomain sudo[144801]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:10 np0005538515.localdomain sudo[144874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtchqwxxesqgjewedznpfshqdvyeyhll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321669.6399689-468-72087412119558/AnsiballZ_copy.py
Nov 28 09:21:10 np0005538515.localdomain sudo[144874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:10 np0005538515.localdomain python3.9[144876]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321669.6399689-468-72087412119558/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:10 np0005538515.localdomain sudo[144874]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:11 np0005538515.localdomain sudo[144966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jveqjqoluuscvxoedtehsuebvdlqkavg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321671.0121293-514-270473210712226/AnsiballZ_file.py
Nov 28 09:21:11 np0005538515.localdomain sudo[144966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:12 np0005538515.localdomain python3.9[144968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:12 np0005538515.localdomain sudo[144966]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:12 np0005538515.localdomain chronyd[134173]: Selected source 167.160.187.12 (pool.ntp.org)
Nov 28 09:21:12 np0005538515.localdomain sudo[145058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcfrtotehnyisxkorxypniycovctnztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321672.1535165-540-110020842775095/AnsiballZ_stat.py
Nov 28 09:21:12 np0005538515.localdomain sudo[145058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:12 np0005538515.localdomain python3.9[145060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:12 np0005538515.localdomain sudo[145058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:12 np0005538515.localdomain sudo[145131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxfnrdpmgryzeyyxsfulussgnwszxmsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321672.1535165-540-110020842775095/AnsiballZ_copy.py
Nov 28 09:21:12 np0005538515.localdomain sudo[145131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:13 np0005538515.localdomain python3.9[145133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321672.1535165-540-110020842775095/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41178 DF PROTO=TCP SPT=60178 DPT=9105 SEQ=1447876135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7B2FA0000000001030307) 
Nov 28 09:21:13 np0005538515.localdomain sudo[145131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:13 np0005538515.localdomain sudo[145223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tefnfhdnptsealvpldkxludtrqyxbxcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321673.382358-587-34094600763123/AnsiballZ_file.py
Nov 28 09:21:13 np0005538515.localdomain sudo[145223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:13 np0005538515.localdomain python3.9[145225]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:13 np0005538515.localdomain sudo[145223]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31282 DF PROTO=TCP SPT=43530 DPT=9882 SEQ=4129427140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7B6FA0000000001030307) 
Nov 28 09:21:14 np0005538515.localdomain sudo[145315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfgvgffkrcyrpqorqluiyzjlypcztebo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321674.4826872-611-39281160695468/AnsiballZ_stat.py
Nov 28 09:21:14 np0005538515.localdomain sudo[145315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:14 np0005538515.localdomain python3.9[145317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:14 np0005538515.localdomain sudo[145315]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:15 np0005538515.localdomain sudo[145388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdelkdlyiyalctwtwjjwtkkedruiutaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321674.4826872-611-39281160695468/AnsiballZ_copy.py
Nov 28 09:21:15 np0005538515.localdomain sudo[145388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:15 np0005538515.localdomain python3.9[145390]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321674.4826872-611-39281160695468/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:15 np0005538515.localdomain sudo[145388]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:15 np0005538515.localdomain sudo[145480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbnaefzihbalzgvwppgeqxsnxmzpgkwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321675.6099634-657-67653191682208/AnsiballZ_file.py
Nov 28 09:21:15 np0005538515.localdomain sudo[145480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:16 np0005538515.localdomain python3.9[145482]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:16 np0005538515.localdomain sudo[145480]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:16 np0005538515.localdomain sudo[145572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncywbbzzncsdxsroqqaxcnoihkguyokh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321676.2881389-681-121224476243509/AnsiballZ_stat.py
Nov 28 09:21:16 np0005538515.localdomain sudo[145572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38213 DF PROTO=TCP SPT=43314 DPT=9100 SEQ=3386024306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7C03B0000000001030307) 
Nov 28 09:21:16 np0005538515.localdomain python3.9[145574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:16 np0005538515.localdomain sudo[145572]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:17 np0005538515.localdomain sudo[145645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khasvvkqgsjmpbygfavjwimwpebzweic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321676.2881389-681-121224476243509/AnsiballZ_copy.py
Nov 28 09:21:17 np0005538515.localdomain sudo[145645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:17 np0005538515.localdomain python3.9[145647]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321676.2881389-681-121224476243509/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:17 np0005538515.localdomain sudo[145645]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:18 np0005538515.localdomain sshd[143494]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:21:18 np0005538515.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Nov 28 09:21:18 np0005538515.localdomain systemd[1]: session-47.scope: Consumed 11.502s CPU time.
Nov 28 09:21:18 np0005538515.localdomain systemd-logind[763]: Session 47 logged out. Waiting for processes to exit.
Nov 28 09:21:18 np0005538515.localdomain systemd-logind[763]: Removed session 47.
Nov 28 09:21:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38214 DF PROTO=TCP SPT=43314 DPT=9100 SEQ=3386024306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7CFFA0000000001030307) 
Nov 28 09:21:24 np0005538515.localdomain sshd[145662]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:21:24 np0005538515.localdomain sshd[145662]: Accepted publickey for zuul from 192.168.122.30 port 46806 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:21:24 np0005538515.localdomain systemd-logind[763]: New session 48 of user zuul.
Nov 28 09:21:24 np0005538515.localdomain systemd[1]: Started Session 48 of User zuul.
Nov 28 09:21:24 np0005538515.localdomain sshd[145662]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:21:24 np0005538515.localdomain sudo[145755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjotuupbvmhnxsboiilrtabdqjrvetil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321684.4056084-28-173382688690691/AnsiballZ_file.py
Nov 28 09:21:24 np0005538515.localdomain sudo[145755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:25 np0005538515.localdomain python3.9[145757]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:25 np0005538515.localdomain sudo[145755]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:25 np0005538515.localdomain sudo[145772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:21:25 np0005538515.localdomain sudo[145772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:25 np0005538515.localdomain sudo[145772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:25 np0005538515.localdomain sudo[145787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:21:25 np0005538515.localdomain sudo[145787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:26 np0005538515.localdomain sudo[145787]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:26 np0005538515.localdomain sudo[145836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:21:26 np0005538515.localdomain sudo[145836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:26 np0005538515.localdomain sudo[145836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:26 np0005538515.localdomain sudo[145870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:21:26 np0005538515.localdomain sudo[145870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:26 np0005538515.localdomain sudo[145930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpoicrdobjmnapovklisitdbcxvqvnoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321686.1530418-64-257268379395393/AnsiballZ_stat.py
Nov 28 09:21:26 np0005538515.localdomain sudo[145930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:26 np0005538515.localdomain python3.9[145942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:26 np0005538515.localdomain sudo[145930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:26 np0005538515.localdomain sudo[145870]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:27 np0005538515.localdomain sudo[146032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oeaphdemftzfnfmuvpkokmcatpjdkoml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321686.1530418-64-257268379395393/AnsiballZ_copy.py
Nov 28 09:21:27 np0005538515.localdomain sudo[146032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:27 np0005538515.localdomain python3.9[146034]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321686.1530418-64-257268379395393/.source.conf _original_basename=ceph.conf follow=False checksum=e86499341cc75988f759ac10cb7bf332387204b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:27 np0005538515.localdomain sudo[146032]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:27 np0005538515.localdomain sudo[146049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:21:27 np0005538515.localdomain sudo[146049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:21:27 np0005538515.localdomain sudo[146049]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26687 DF PROTO=TCP SPT=49094 DPT=9105 SEQ=3249111599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7EB330000000001030307) 
Nov 28 09:21:27 np0005538515.localdomain sudo[146139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neqckgvqmsjqqaufgwbkphyomsvfssnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321687.5908878-64-261334273689399/AnsiballZ_stat.py
Nov 28 09:21:27 np0005538515.localdomain sudo[146139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:28 np0005538515.localdomain python3.9[146141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:28 np0005538515.localdomain sudo[146139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:28 np0005538515.localdomain sudo[146212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bttqhekmqofdfpweupjmonzwggnztrkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321687.5908878-64-261334273689399/AnsiballZ_copy.py
Nov 28 09:21:28 np0005538515.localdomain sudo[146212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26688 DF PROTO=TCP SPT=49094 DPT=9105 SEQ=3249111599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7EF3A0000000001030307) 
Nov 28 09:21:28 np0005538515.localdomain python3.9[146214]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321687.5908878-64-261334273689399/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=98ffd20e3b9db1cae39a950d9da1f69e92796658 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:28 np0005538515.localdomain sudo[146212]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38338 DF PROTO=TCP SPT=57676 DPT=9882 SEQ=1964313840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7F05C0000000001030307) 
Nov 28 09:21:29 np0005538515.localdomain sshd[145662]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:21:29 np0005538515.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Nov 28 09:21:29 np0005538515.localdomain systemd[1]: session-48.scope: Consumed 2.186s CPU time.
Nov 28 09:21:29 np0005538515.localdomain systemd-logind[763]: Session 48 logged out. Waiting for processes to exit.
Nov 28 09:21:29 np0005538515.localdomain systemd-logind[763]: Removed session 48.
Nov 28 09:21:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38340 DF PROTO=TCP SPT=57676 DPT=9882 SEQ=1964313840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC7FC7A0000000001030307) 
Nov 28 09:21:34 np0005538515.localdomain sshd[146229]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:21:34 np0005538515.localdomain sshd[146229]: Accepted publickey for zuul from 192.168.122.30 port 48980 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:21:34 np0005538515.localdomain systemd-logind[763]: New session 49 of user zuul.
Nov 28 09:21:34 np0005538515.localdomain systemd[1]: Started Session 49 of User zuul.
Nov 28 09:21:34 np0005538515.localdomain sshd[146229]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:21:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26690 DF PROTO=TCP SPT=49094 DPT=9105 SEQ=3249111599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC806FA0000000001030307) 
Nov 28 09:21:35 np0005538515.localdomain python3.9[146322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:21:37 np0005538515.localdomain sudo[146416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icxaqehanzccgiihqsisetbmbecyfwid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321696.570674-64-204462445139505/AnsiballZ_file.py
Nov 28 09:21:37 np0005538515.localdomain sudo[146416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:37 np0005538515.localdomain python3.9[146418]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:37 np0005538515.localdomain sudo[146416]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:38 np0005538515.localdomain sudo[146508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sklioodcijpkqegmrexzawiadcykocem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321697.8298163-64-251581662056912/AnsiballZ_file.py
Nov 28 09:21:38 np0005538515.localdomain sudo[146508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:38 np0005538515.localdomain python3.9[146510]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:21:38 np0005538515.localdomain sudo[146508]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64523 DF PROTO=TCP SPT=36618 DPT=9102 SEQ=1488426619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8187A0000000001030307) 
Nov 28 09:21:39 np0005538515.localdomain python3.9[146600]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:21:39 np0005538515.localdomain sudo[146690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cihnrmkprfhdaptfncdahnkgyvynalrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321699.486965-133-242707187423406/AnsiballZ_seboolean.py
Nov 28 09:21:39 np0005538515.localdomain sudo[146690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:40 np0005538515.localdomain python3.9[146692]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 09:21:40 np0005538515.localdomain sudo[146690]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:40 np0005538515.localdomain sudo[146782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tctonbtbthizlynluihkuprrtpkvkkjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321700.680451-163-83799978755983/AnsiballZ_setup.py
Nov 28 09:21:40 np0005538515.localdomain sudo[146782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:41 np0005538515.localdomain python3.9[146784]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:21:41 np0005538515.localdomain sudo[146782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:42 np0005538515.localdomain sudo[146836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-debczheppvyexalflgodannzacxwvwvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321700.680451-163-83799978755983/AnsiballZ_dnf.py
Nov 28 09:21:42 np0005538515.localdomain sudo[146836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:42 np0005538515.localdomain python3.9[146838]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:21:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26691 DF PROTO=TCP SPT=49094 DPT=9105 SEQ=3249111599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC826FB0000000001030307) 
Nov 28 09:21:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40144 DF PROTO=TCP SPT=34980 DPT=9101 SEQ=2118942241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC82C300000000001030307) 
Nov 28 09:21:45 np0005538515.localdomain sudo[146836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:46 np0005538515.localdomain sudo[146930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rccthuvwirlnlnskyresshxklftwmnrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321705.4890957-199-165811186144025/AnsiballZ_systemd.py
Nov 28 09:21:46 np0005538515.localdomain sudo[146930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:46 np0005538515.localdomain python3.9[146932]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:21:46 np0005538515.localdomain sudo[146930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42715 DF PROTO=TCP SPT=50552 DPT=9100 SEQ=3352178026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8357A0000000001030307) 
Nov 28 09:21:48 np0005538515.localdomain sudo[147025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pllauptwxkwbzgvppcuqhdtaqkurubhm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321707.6824827-223-117525714117372/AnsiballZ_edpm_nftables_snippet.py
Nov 28 09:21:48 np0005538515.localdomain sudo[147025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:48 np0005538515.localdomain python3[147027]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 28 09:21:48 np0005538515.localdomain sudo[147025]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:49 np0005538515.localdomain sudo[147117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqeydxowlyulkitqceqoogtftvajqnri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321709.5366154-250-118853080383301/AnsiballZ_file.py
Nov 28 09:21:49 np0005538515.localdomain sudo[147117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:49 np0005538515.localdomain python3.9[147119]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:50 np0005538515.localdomain sudo[147117]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42716 DF PROTO=TCP SPT=50552 DPT=9100 SEQ=3352178026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8453A0000000001030307) 
Nov 28 09:21:50 np0005538515.localdomain sudo[147209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfavvvxrxwglqojmidolxaepihrmzgux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321710.2034957-274-208872148390115/AnsiballZ_stat.py
Nov 28 09:21:50 np0005538515.localdomain sudo[147209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:50 np0005538515.localdomain python3.9[147211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:50 np0005538515.localdomain sudo[147209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:51 np0005538515.localdomain sudo[147257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukfmqdfeayuqcqibsgifuhbwyvtivsma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321710.2034957-274-208872148390115/AnsiballZ_file.py
Nov 28 09:21:51 np0005538515.localdomain sudo[147257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:51 np0005538515.localdomain python3.9[147259]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:51 np0005538515.localdomain sudo[147257]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:51 np0005538515.localdomain sudo[147349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpmuvfqfqkytmcnzqnoilcxdfakgzzky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321711.4909189-310-77145166520167/AnsiballZ_stat.py
Nov 28 09:21:51 np0005538515.localdomain sudo[147349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:51 np0005538515.localdomain python3.9[147351]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:51 np0005538515.localdomain sudo[147349]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:52 np0005538515.localdomain sudo[147397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whcqeozipvovwalfyfzaqgtkysokledh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321711.4909189-310-77145166520167/AnsiballZ_file.py
Nov 28 09:21:52 np0005538515.localdomain sudo[147397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:52 np0005538515.localdomain python3.9[147399]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.3a79_geu recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:52 np0005538515.localdomain sudo[147397]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:52 np0005538515.localdomain sudo[147489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzoycngbkytkgpdlstwosqjwaeefmhya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321712.609536-347-169761355413998/AnsiballZ_stat.py
Nov 28 09:21:52 np0005538515.localdomain sudo[147489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:53 np0005538515.localdomain python3.9[147491]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:53 np0005538515.localdomain sudo[147489]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:53 np0005538515.localdomain sudo[147537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tecjxwyfoadtesmsmkmskuptjkoopsxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321712.609536-347-169761355413998/AnsiballZ_file.py
Nov 28 09:21:53 np0005538515.localdomain sudo[147537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:53 np0005538515.localdomain python3.9[147539]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:53 np0005538515.localdomain sudo[147537]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:54 np0005538515.localdomain sudo[147629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-miwfxeasjfvtunxslqvymyamsjwbpmpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321713.7808769-385-175242376301437/AnsiballZ_command.py
Nov 28 09:21:54 np0005538515.localdomain sudo[147629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:54 np0005538515.localdomain python3.9[147631]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:21:54 np0005538515.localdomain sudo[147629]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:55 np0005538515.localdomain sudo[147722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jewmsurzcymjxqjihbwnfyscygbocoyf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321714.6529741-410-7553963958637/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:21:55 np0005538515.localdomain sudo[147722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:55 np0005538515.localdomain python3[147724]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:21:55 np0005538515.localdomain sudo[147722]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:55 np0005538515.localdomain sudo[147814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvdnlgkwxajczbogthvbcgroqszmwgcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321715.437453-433-245295769080927/AnsiballZ_stat.py
Nov 28 09:21:55 np0005538515.localdomain sudo[147814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:55 np0005538515.localdomain python3.9[147816]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:55 np0005538515.localdomain sudo[147814]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:56 np0005538515.localdomain sudo[147889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygxizjaxzrwgfasaozdjwtsvnbitmawt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321715.437453-433-245295769080927/AnsiballZ_copy.py
Nov 28 09:21:56 np0005538515.localdomain sudo[147889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:56 np0005538515.localdomain python3.9[147891]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321715.437453-433-245295769080927/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:56 np0005538515.localdomain sudo[147889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:57 np0005538515.localdomain sudo[147981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrxpqmtoiwfppfncgxkwkmfhgyuipkot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321716.819206-478-92526906791120/AnsiballZ_stat.py
Nov 28 09:21:57 np0005538515.localdomain sudo[147981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:57 np0005538515.localdomain python3.9[147983]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:57 np0005538515.localdomain sudo[147981]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36276 DF PROTO=TCP SPT=56778 DPT=9105 SEQ=2406824212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC860630000000001030307) 
Nov 28 09:21:57 np0005538515.localdomain sudo[148056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqtdwxwqqccatgderdppxvpkmyimzhop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321716.819206-478-92526906791120/AnsiballZ_copy.py
Nov 28 09:21:57 np0005538515.localdomain sudo[148056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:57 np0005538515.localdomain python3.9[148058]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321716.819206-478-92526906791120/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:57 np0005538515.localdomain sudo[148056]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:58 np0005538515.localdomain sudo[148148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnsvgtznwbzisodfoclhoahwevjjfutu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321718.0207698-524-172836122936403/AnsiballZ_stat.py
Nov 28 09:21:58 np0005538515.localdomain sudo[148148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:58 np0005538515.localdomain python3.9[148150]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:21:58 np0005538515.localdomain sudo[148148]: pam_unix(sudo:session): session closed for user root
Nov 28 09:21:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36277 DF PROTO=TCP SPT=56778 DPT=9105 SEQ=2406824212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8647A0000000001030307) 
Nov 28 09:21:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42717 DF PROTO=TCP SPT=50552 DPT=9100 SEQ=3352178026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC864FA0000000001030307) 
Nov 28 09:21:58 np0005538515.localdomain sudo[148223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuvbeytmpteefkygwdznktaewfsdzxhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321718.0207698-524-172836122936403/AnsiballZ_copy.py
Nov 28 09:21:58 np0005538515.localdomain sudo[148223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:21:59 np0005538515.localdomain python3.9[148225]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321718.0207698-524-172836122936403/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:21:59 np0005538515.localdomain sudo[148223]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:00 np0005538515.localdomain sudo[148315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xibzacyzkbwepaxbrxovcphrcxscwqnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321720.0765269-568-163146176608401/AnsiballZ_stat.py
Nov 28 09:22:00 np0005538515.localdomain sudo[148315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:00 np0005538515.localdomain python3.9[148317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:00 np0005538515.localdomain sudo[148315]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:00 np0005538515.localdomain sudo[148390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msdixcarnlzwjcoknyxjyjukoirrzfix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321720.0765269-568-163146176608401/AnsiballZ_copy.py
Nov 28 09:22:00 np0005538515.localdomain sudo[148390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:01 np0005538515.localdomain python3.9[148392]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321720.0765269-568-163146176608401/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:01 np0005538515.localdomain sudo[148390]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41180 DF PROTO=TCP SPT=60178 DPT=9105 SEQ=1447876135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC870FA0000000001030307) 
Nov 28 09:22:02 np0005538515.localdomain sudo[148482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xepcqwwmxbptuvihnzizyiqkgortbbfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321722.525155-614-78977186599096/AnsiballZ_stat.py
Nov 28 09:22:02 np0005538515.localdomain sudo[148482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:03 np0005538515.localdomain python3.9[148484]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:03 np0005538515.localdomain sudo[148482]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:03 np0005538515.localdomain sudo[148557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdzvotcuwkojpwmmvwitisfingejmilc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321722.525155-614-78977186599096/AnsiballZ_copy.py
Nov 28 09:22:03 np0005538515.localdomain sudo[148557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:03 np0005538515.localdomain python3.9[148559]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321722.525155-614-78977186599096/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:03 np0005538515.localdomain sudo[148557]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:04 np0005538515.localdomain sudo[148649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvqibjquyevmlzgrezvvlkmlfwpqtzup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321723.7843628-658-90416942094914/AnsiballZ_file.py
Nov 28 09:22:04 np0005538515.localdomain sudo[148649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:04 np0005538515.localdomain python3.9[148651]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:04 np0005538515.localdomain sudo[148649]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36279 DF PROTO=TCP SPT=56778 DPT=9105 SEQ=2406824212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC87C3B0000000001030307) 
Nov 28 09:22:04 np0005538515.localdomain sudo[148741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynlzelayoupgagqnlditefuebsyeyqqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321724.472883-682-143856492247156/AnsiballZ_command.py
Nov 28 09:22:04 np0005538515.localdomain sudo[148741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:04 np0005538515.localdomain python3.9[148743]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:04 np0005538515.localdomain sudo[148741]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:05 np0005538515.localdomain sudo[148836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpbnlvorfyqjvnexjrlszdmozejwyflt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321725.1678004-707-190564095113212/AnsiballZ_blockinfile.py
Nov 28 09:22:05 np0005538515.localdomain sudo[148836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:05 np0005538515.localdomain python3.9[148838]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:05 np0005538515.localdomain sudo[148836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:06 np0005538515.localdomain sudo[148929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgntsprekyyimsgjrvaxrfgxelbsirpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321726.0281336-734-102036950782179/AnsiballZ_command.py
Nov 28 09:22:06 np0005538515.localdomain sudo[148929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:06 np0005538515.localdomain python3.9[148931]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:06 np0005538515.localdomain sudo[148929]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:06 np0005538515.localdomain sudo[149022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erqckbldafxxjnjderjxkgsstkiiaaew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321726.677094-758-138337501022890/AnsiballZ_stat.py
Nov 28 09:22:06 np0005538515.localdomain sudo[149022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:07 np0005538515.localdomain python3.9[149024]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:07 np0005538515.localdomain sudo[149022]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:07 np0005538515.localdomain sudo[149116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwevimbxrxrahyqyaqkxhhymvvpwdrqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321727.3672125-781-55651729096589/AnsiballZ_command.py
Nov 28 09:22:07 np0005538515.localdomain sudo[149116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:07 np0005538515.localdomain python3.9[149118]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:07 np0005538515.localdomain sudo[149116]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:08 np0005538515.localdomain sudo[149211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjuvzpsanzmxmuopufpejtfnaqcqfyli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321728.0567534-805-75904720725954/AnsiballZ_file.py
Nov 28 09:22:08 np0005538515.localdomain sudo[149211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:08 np0005538515.localdomain python3.9[149213]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:08 np0005538515.localdomain sudo[149211]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43190 DF PROTO=TCP SPT=47466 DPT=9102 SEQ=683390314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC88DBA0000000001030307) 
Nov 28 09:22:09 np0005538515.localdomain python3.9[149303]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:22:10 np0005538515.localdomain sudo[149394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osygthetlnezbweqzdupjuicceawazun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321730.4902256-925-185800494129852/AnsiballZ_command.py
Nov 28 09:22:10 np0005538515.localdomain sudo[149394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:10 np0005538515.localdomain python3.9[149396]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005538515.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:28:f9:1a:af" external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:10 np0005538515.localdomain ovs-vsctl[149397]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005538515.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:28:f9:1a:af external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 28 09:22:11 np0005538515.localdomain sudo[149394]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:11 np0005538515.localdomain sudo[149487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhdyqaextchvgyktsdhulkjlihkwwbyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321731.240667-953-109588929355928/AnsiballZ_command.py
Nov 28 09:22:11 np0005538515.localdomain sudo[149487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:11 np0005538515.localdomain python3.9[149489]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:11 np0005538515.localdomain sudo[149487]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:12 np0005538515.localdomain python3.9[149582]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:12 np0005538515.localdomain sudo[149674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlrqjatjascivnmwnhbanznhpfkoeqoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321732.693077-1006-221409055362906/AnsiballZ_file.py
Nov 28 09:22:12 np0005538515.localdomain sudo[149674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36280 DF PROTO=TCP SPT=56778 DPT=9105 SEQ=2406824212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC89CFA0000000001030307) 
Nov 28 09:22:13 np0005538515.localdomain python3.9[149676]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:13 np0005538515.localdomain sudo[149674]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:13 np0005538515.localdomain sudo[149766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plwveijuyqlsdrxbdnpgsiyrvejaegkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321733.380377-1030-145612137746662/AnsiballZ_stat.py
Nov 28 09:22:13 np0005538515.localdomain sudo[149766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:13 np0005538515.localdomain python3.9[149768]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:13 np0005538515.localdomain sudo[149766]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:14 np0005538515.localdomain sudo[149814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roymjmiqbzozqqwmbiackuephjrnnavf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321733.380377-1030-145612137746662/AnsiballZ_file.py
Nov 28 09:22:14 np0005538515.localdomain sudo[149814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18751 DF PROTO=TCP SPT=36858 DPT=9882 SEQ=426720246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8A0FA0000000001030307) 
Nov 28 09:22:14 np0005538515.localdomain python3.9[149816]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:14 np0005538515.localdomain sudo[149814]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:14 np0005538515.localdomain sudo[149906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbpdbijojvxeihtmvvakbhuxjkxzbaod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321734.341328-1030-111184735584268/AnsiballZ_stat.py
Nov 28 09:22:14 np0005538515.localdomain sudo[149906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:14 np0005538515.localdomain python3.9[149908]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:14 np0005538515.localdomain sudo[149906]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:15 np0005538515.localdomain sudo[149954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftrgywghgsinrxbcwfezexkpwrcwtovg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321734.341328-1030-111184735584268/AnsiballZ_file.py
Nov 28 09:22:15 np0005538515.localdomain sudo[149954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:15 np0005538515.localdomain python3.9[149956]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:15 np0005538515.localdomain sudo[149954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:15 np0005538515.localdomain sudo[150046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnvpgcytjhpsdxblxpncjkmpiltdsuze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321735.3856652-1100-98370634812462/AnsiballZ_file.py
Nov 28 09:22:15 np0005538515.localdomain sudo[150046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:15 np0005538515.localdomain python3.9[150048]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:15 np0005538515.localdomain sudo[150046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:16 np0005538515.localdomain sudo[150138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zksekvvhnjcjqpvyzdopzigpxrtdwdbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321736.0004108-1123-233074792125934/AnsiballZ_stat.py
Nov 28 09:22:16 np0005538515.localdomain sudo[150138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:16 np0005538515.localdomain python3.9[150140]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:16 np0005538515.localdomain sudo[150138]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10332 DF PROTO=TCP SPT=33336 DPT=9100 SEQ=455937120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8AA7A0000000001030307) 
Nov 28 09:22:16 np0005538515.localdomain sudo[150186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyymiyhpdnahhkctlbomhwgjvcvwukul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321736.0004108-1123-233074792125934/AnsiballZ_file.py
Nov 28 09:22:16 np0005538515.localdomain sudo[150186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:16 np0005538515.localdomain python3.9[150188]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:16 np0005538515.localdomain sudo[150186]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:17 np0005538515.localdomain sudo[150278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvcrkssukiosotzkcgumrxzucnesyuck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321737.0844378-1159-262381455531165/AnsiballZ_stat.py
Nov 28 09:22:17 np0005538515.localdomain sudo[150278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:17 np0005538515.localdomain python3.9[150280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:17 np0005538515.localdomain sudo[150278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:17 np0005538515.localdomain sudo[150326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfrvcnblvadbnlutdaeprkrpczktascf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321737.0844378-1159-262381455531165/AnsiballZ_file.py
Nov 28 09:22:17 np0005538515.localdomain sudo[150326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:17 np0005538515.localdomain python3.9[150328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:18 np0005538515.localdomain sudo[150326]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:18 np0005538515.localdomain sudo[150418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydsorotnsxejwzasadkslocpqhxwqbnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321738.2047842-1196-73325474603696/AnsiballZ_systemd.py
Nov 28 09:22:18 np0005538515.localdomain sudo[150418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:18 np0005538515.localdomain python3.9[150420]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:22:18 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:22:18 np0005538515.localdomain systemd-sysv-generator[150449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:18 np0005538515.localdomain systemd-rc-local-generator[150445]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:19 np0005538515.localdomain sudo[150418]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10333 DF PROTO=TCP SPT=33336 DPT=9100 SEQ=455937120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8BA3A0000000001030307) 
Nov 28 09:22:20 np0005538515.localdomain sudo[150548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkujxzmohnbsdxrlnorvgrdegirfyomi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321740.298129-1219-169212558160514/AnsiballZ_stat.py
Nov 28 09:22:20 np0005538515.localdomain sudo[150548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:20 np0005538515.localdomain python3.9[150550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:20 np0005538515.localdomain sudo[150548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:21 np0005538515.localdomain sudo[150596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlrrvxjvdsvanucivtktgqamwhecxshm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321740.298129-1219-169212558160514/AnsiballZ_file.py
Nov 28 09:22:21 np0005538515.localdomain sudo[150596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:21 np0005538515.localdomain python3.9[150598]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:21 np0005538515.localdomain sudo[150596]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:22 np0005538515.localdomain sudo[150689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igiasbzobzgsmqiadfzyuojppyqghgeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321742.5030828-1256-150224861898825/AnsiballZ_stat.py
Nov 28 09:22:22 np0005538515.localdomain sudo[150689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:22 np0005538515.localdomain python3.9[150691]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:22 np0005538515.localdomain sudo[150689]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:23 np0005538515.localdomain sudo[150737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nejtdwqyjetpljvuimowqqdwjsjhgmaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321742.5030828-1256-150224861898825/AnsiballZ_file.py
Nov 28 09:22:23 np0005538515.localdomain sudo[150737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:23 np0005538515.localdomain python3.9[150739]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:23 np0005538515.localdomain sudo[150737]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:23 np0005538515.localdomain sudo[150829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbcwvmadtjlhcllxetijmrjvysywzoti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321743.5947616-1292-243449509751588/AnsiballZ_systemd.py
Nov 28 09:22:23 np0005538515.localdomain sudo[150829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:24 np0005538515.localdomain python3.9[150831]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:22:24 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:22:24 np0005538515.localdomain systemd-rc-local-generator[150855]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:24 np0005538515.localdomain systemd-sysv-generator[150859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:24 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:22:24 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:22:24 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:22:24 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:22:24 np0005538515.localdomain sudo[150829]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:25 np0005538515.localdomain sudo[150965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdnfjssqiexgawlnlbhpdbgedhdjezfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321744.9810164-1321-8884379011878/AnsiballZ_file.py
Nov 28 09:22:25 np0005538515.localdomain sudo[150965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:25 np0005538515.localdomain python3.9[150967]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:25 np0005538515.localdomain sudo[150965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:25 np0005538515.localdomain sudo[151057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljezyzkqxhgbijzgsxcnviiuqaqavhja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321745.6519449-1345-133025415248799/AnsiballZ_stat.py
Nov 28 09:22:25 np0005538515.localdomain sudo[151057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:26 np0005538515.localdomain python3.9[151059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:26 np0005538515.localdomain sudo[151057]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:26 np0005538515.localdomain sudo[151130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhhvgstabiepkqxyiffjvbbmodqjiwvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321745.6519449-1345-133025415248799/AnsiballZ_copy.py
Nov 28 09:22:26 np0005538515.localdomain sudo[151130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:26 np0005538515.localdomain python3.9[151132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321745.6519449-1345-133025415248799/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:26 np0005538515.localdomain sudo[151130]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:27 np0005538515.localdomain sudo[151222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtpzamftuavrusuhresbfeyladmxtzrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321747.0827398-1397-44495166685612/AnsiballZ_file.py
Nov 28 09:22:27 np0005538515.localdomain sudo[151222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:27 np0005538515.localdomain python3.9[151224]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:27 np0005538515.localdomain sudo[151222]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64274 DF PROTO=TCP SPT=35422 DPT=9105 SEQ=1586787104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8D5920000000001030307) 
Nov 28 09:22:27 np0005538515.localdomain sudo[151239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:22:27 np0005538515.localdomain sudo[151239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:22:27 np0005538515.localdomain sudo[151239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:27 np0005538515.localdomain sudo[151254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:22:27 np0005538515.localdomain sudo[151254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:22:28 np0005538515.localdomain sudo[151344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dctkgoqgbkbvofzjyjsfjjkkzxfrakop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321747.7678537-1420-268989061163857/AnsiballZ_stat.py
Nov 28 09:22:28 np0005538515.localdomain sudo[151344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:28 np0005538515.localdomain python3.9[151346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:22:28 np0005538515.localdomain sudo[151344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:28 np0005538515.localdomain sudo[151254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:28 np0005538515.localdomain sudo[151451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqmvlzowsuipwreedxssiclkjrjibrae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321747.7678537-1420-268989061163857/AnsiballZ_copy.py
Nov 28 09:22:28 np0005538515.localdomain sudo[151451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64275 DF PROTO=TCP SPT=35422 DPT=9105 SEQ=1586787104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8D9BA0000000001030307) 
Nov 28 09:22:28 np0005538515.localdomain python3.9[151453]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321747.7678537-1420-268989061163857/.source.json _original_basename=.hy5cctwr follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:28 np0005538515.localdomain sudo[151451]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7166 DF PROTO=TCP SPT=45656 DPT=9882 SEQ=3468100896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8DABC0000000001030307) 
Nov 28 09:22:29 np0005538515.localdomain sudo[151506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:22:29 np0005538515.localdomain sudo[151506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:22:29 np0005538515.localdomain sudo[151506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:29 np0005538515.localdomain sudo[151558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcsdnlamvmqmbjauohvylimptfektrmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321748.9686608-1465-184473715588934/AnsiballZ_file.py
Nov 28 09:22:29 np0005538515.localdomain sudo[151558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:29 np0005538515.localdomain python3.9[151560]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:29 np0005538515.localdomain sudo[151558]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:29 np0005538515.localdomain sudo[151650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbkpqkiyaxpwfxcwqvqxalesjtwurzuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321749.6650074-1489-209043471935452/AnsiballZ_stat.py
Nov 28 09:22:29 np0005538515.localdomain sudo[151650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:30 np0005538515.localdomain sudo[151650]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:30 np0005538515.localdomain sudo[151723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqnwznyevotdvhxzkkaynezikzbowumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321749.6650074-1489-209043471935452/AnsiballZ_copy.py
Nov 28 09:22:30 np0005538515.localdomain sudo[151723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:30 np0005538515.localdomain sudo[151723]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26693 DF PROTO=TCP SPT=49094 DPT=9105 SEQ=3249111599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8E4FA0000000001030307) 
Nov 28 09:22:31 np0005538515.localdomain sudo[151815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwlmghmhhhabdvrbnyosnzhwobuettdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321751.1826317-1540-168025229166587/AnsiballZ_container_config_data.py
Nov 28 09:22:31 np0005538515.localdomain sudo[151815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:32 np0005538515.localdomain python3.9[151817]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 28 09:22:32 np0005538515.localdomain sudo[151815]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:32 np0005538515.localdomain sudo[151907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpoqcibbbrulgaxmxehtzrfyzunyszwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321752.4028854-1567-252381389126891/AnsiballZ_container_config_hash.py
Nov 28 09:22:32 np0005538515.localdomain sudo[151907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:32 np0005538515.localdomain python3.9[151909]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:22:33 np0005538515.localdomain sudo[151907]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:34 np0005538515.localdomain sudo[151999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abpfqvopmidpurysadnnxwymatmdffcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321754.0622635-1594-68683071204722/AnsiballZ_podman_container_info.py
Nov 28 09:22:34 np0005538515.localdomain sudo[151999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64277 DF PROTO=TCP SPT=35422 DPT=9105 SEQ=1586787104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC8F17A0000000001030307) 
Nov 28 09:22:34 np0005538515.localdomain python3.9[152001]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:22:35 np0005538515.localdomain sudo[151999]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:38 np0005538515.localdomain sudo[152118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asqcvamvthjfsifprhxukwnlhageggfp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321758.1194465-1633-40564473432650/AnsiballZ_edpm_container_manage.py
Nov 28 09:22:38 np0005538515.localdomain sudo[152118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:38 np0005538515.localdomain python3[152120]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:22:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38576 DF PROTO=TCP SPT=57142 DPT=9102 SEQ=2612554662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC902BA0000000001030307) 
Nov 28 09:22:39 np0005538515.localdomain python3[152120]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69",
                                                                    "Digest": "sha256:7ab0ee81fdc9b162df9b50eb2e264c777d08f90975a442620ec940edabe300b2",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:7ab0ee81fdc9b162df9b50eb2e264c777d08f90975a442620ec940edabe300b2"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:43:38.999472418Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 345745352,
                                                                    "VirtualSize": 345745352,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:41a433848ac42a81e513766649f77cfa09e37aae045bcbbb33be77f7cf86edc4",
                                                                              "sha256:055d9012b48b3c8064accd40b6372c79c29fedd85061a710ada00677f88b1db9"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:26.691247936Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:15:32.288422734Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:15:33.83333928Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:42:58.179075923Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:43:38.997189664Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:43:40.109412373Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:22:39 np0005538515.localdomain podman[152170]: 2025-11-28 09:22:39.286289199 +0000 UTC m=+0.088607518 container remove 9779adc100c83ac3ca149677b2b7e017b342ee203e42cfa7e38dbb3b27c3d164 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red 
Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Nov 28 09:22:39 np0005538515.localdomain python3[152120]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Nov 28 09:22:39 np0005538515.localdomain podman[152183]: 
Nov 28 09:22:39 np0005538515.localdomain podman[152183]: 2025-11-28 09:22:39.369997121 +0000 UTC m=+0.069487103 container create 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:22:39 np0005538515.localdomain podman[152183]: 2025-11-28 09:22:39.331580656 +0000 UTC m=+0.031070688 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:22:39 np0005538515.localdomain python3[152120]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 09:22:39 np0005538515.localdomain sudo[152118]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:40 np0005538515.localdomain sudo[152309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sialrgyqnbewvrzkgaqmnleomzvxfckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321759.786643-1657-130572212038052/AnsiballZ_stat.py
Nov 28 09:22:40 np0005538515.localdomain sudo[152309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:40 np0005538515.localdomain python3.9[152311]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:40 np0005538515.localdomain sudo[152309]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:40 np0005538515.localdomain sudo[152403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oghmmidvteosyvlymnualqdrdpbzlfad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321760.6008534-1684-192045651663650/AnsiballZ_file.py
Nov 28 09:22:40 np0005538515.localdomain sudo[152403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:41 np0005538515.localdomain python3.9[152405]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:41 np0005538515.localdomain sudo[152403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:41 np0005538515.localdomain sudo[152449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpvcwjksfwmalgaspggmgblbtwywrood ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321760.6008534-1684-192045651663650/AnsiballZ_stat.py
Nov 28 09:22:41 np0005538515.localdomain sudo[152449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:41 np0005538515.localdomain python3.9[152451]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:22:41 np0005538515.localdomain sudo[152449]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:41 np0005538515.localdomain sudo[152540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbvbhbjjkugselmoxyugleqrelrxqlxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321761.5549982-1684-116866121389217/AnsiballZ_copy.py
Nov 28 09:22:41 np0005538515.localdomain sudo[152540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:42 np0005538515.localdomain python3.9[152542]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764321761.5549982-1684-116866121389217/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:22:42 np0005538515.localdomain sudo[152540]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:42 np0005538515.localdomain sudo[152586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sflgwxwqxkugytlrdqjsoyzzthrekiao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321761.5549982-1684-116866121389217/AnsiballZ_systemd.py
Nov 28 09:22:42 np0005538515.localdomain sudo[152586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:42 np0005538515.localdomain python3.9[152588]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:22:42 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:22:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64278 DF PROTO=TCP SPT=35422 DPT=9105 SEQ=1586787104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC910FA0000000001030307) 
Nov 28 09:22:42 np0005538515.localdomain systemd-rc-local-generator[152607]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:42 np0005538515.localdomain systemd-sysv-generator[152614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:43 np0005538515.localdomain sudo[152586]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:43 np0005538515.localdomain sudo[152668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbbndsqthfvxuwmggzjnmxyuczkhkngk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321761.5549982-1684-116866121389217/AnsiballZ_systemd.py
Nov 28 09:22:43 np0005538515.localdomain sudo[152668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:43 np0005538515.localdomain python3.9[152670]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:22:43 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:22:43 np0005538515.localdomain systemd-rc-local-generator[152697]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:43 np0005538515.localdomain systemd-sysv-generator[152702]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Starting ovn_controller container...
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:22:44 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5f4753be574a6e4d1b818630bc8663d83b7be29d27e9a8539e5e7161ddb05a6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 09:22:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57497 DF PROTO=TCP SPT=40360 DPT=9101 SEQ=1127572714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC916900000000001030307) 
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:22:44 np0005538515.localdomain podman[152712]: 2025-11-28 09:22:44.224366321 +0000 UTC m=+0.135922216 container init 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + sudo -E kolla_set_configs
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:22:44 np0005538515.localdomain podman[152712]: 2025-11-28 09:22:44.261492677 +0000 UTC m=+0.173048542 container start 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:22:44 np0005538515.localdomain edpm-start-podman-container[152712]: ovn_controller
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 0.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Starting User Manager for UID 0...
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:22:44 np0005538515.localdomain podman[152733]: 2025-11-28 09:22:44.357878733 +0000 UTC m=+0.091462761 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Nov 28 09:22:44 np0005538515.localdomain podman[152733]: 2025-11-28 09:22:44.447908292 +0000 UTC m=+0.181492320 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:22:44 np0005538515.localdomain podman[152733]: unhealthy
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Failed with result 'exit-code'.
Nov 28 09:22:44 np0005538515.localdomain edpm-start-podman-container[152711]: Creating additional drop-in dependency for "ovn_controller" (98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9)
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Queued start job for default target Main User Target.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Created slice User Application Slice.
Nov 28 09:22:44 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 28 09:22:44 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 09:22:44 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Reached target Paths.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Reached target Timers.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Starting D-Bus User Message Bus Socket...
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Starting Create User's Volatile Files and Directories...
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Reached target Sockets.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Finished Create User's Volatile Files and Directories.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Reached target Basic System.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Reached target Main User Target.
Nov 28 09:22:44 np0005538515.localdomain systemd[152755]: Startup finished in 158ms.
Nov 28 09:22:44 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:22:44 np0005538515.localdomain systemd-rc-local-generator[152811]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:22:44 np0005538515.localdomain systemd-sysv-generator[152815]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: tmp-crun.jO1Edv.mount: Deactivated successfully.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started User Manager for UID 0.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started ovn_controller container.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started Session c11 of User root.
Nov 28 09:22:44 np0005538515.localdomain sudo[152668]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: INFO:__main__:Validating config file
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: INFO:__main__:Writing out command to execute
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: ++ cat /run_command
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + ARGS=
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + sudo kolla_copy_cacerts
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: Started Session c12 of User root.
Nov 28 09:22:44 np0005538515.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + [[ ! -n '' ]]
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + . kolla_extend_start
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + umask 0022
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00013|main|INFO|OVS feature set changed, force recompute.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00021|main|INFO|OVS feature set changed, force recompute.
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00022|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:22:44Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 09:22:45 np0005538515.localdomain sudo[152919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqetxowkckcwwsbtcemcfysagjgmytit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321765.387137-1768-73912850057889/AnsiballZ_command.py
Nov 28 09:22:45 np0005538515.localdomain sudo[152919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:45 np0005538515.localdomain python3.9[152921]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:45 np0005538515.localdomain ovs-vsctl[152922]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 28 09:22:45 np0005538515.localdomain sudo[152919]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:46 np0005538515.localdomain sudo[153012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efehqaxsaslsjjrfyidugvfdwwphltou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321766.0834522-1793-162504091201602/AnsiballZ_command.py
Nov 28 09:22:46 np0005538515.localdomain sudo[153012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:46 np0005538515.localdomain python3.9[153014]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:46 np0005538515.localdomain ovs-vsctl[153016]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 28 09:22:46 np0005538515.localdomain sudo[153012]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34378 DF PROTO=TCP SPT=45790 DPT=9100 SEQ=3977660164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC91FBA0000000001030307) 
Nov 28 09:22:47 np0005538515.localdomain sudo[153107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktjbnolxzfkckcubwhxknphlgabqxxss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321767.5684001-1834-204646742125969/AnsiballZ_command.py
Nov 28 09:22:47 np0005538515.localdomain sudo[153107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:48 np0005538515.localdomain python3.9[153109]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:22:48 np0005538515.localdomain ovs-vsctl[153110]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 28 09:22:48 np0005538515.localdomain sudo[153107]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:48 np0005538515.localdomain sshd[146229]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:22:48 np0005538515.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Nov 28 09:22:48 np0005538515.localdomain systemd[1]: session-49.scope: Consumed 40.691s CPU time.
Nov 28 09:22:48 np0005538515.localdomain systemd-logind[763]: Session 49 logged out. Waiting for processes to exit.
Nov 28 09:22:48 np0005538515.localdomain systemd-logind[763]: Removed session 49.
Nov 28 09:22:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34379 DF PROTO=TCP SPT=45790 DPT=9100 SEQ=3977660164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC92F7B0000000001030307) 
Nov 28 09:22:54 np0005538515.localdomain sshd[153125]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:22:55 np0005538515.localdomain sshd[153125]: Accepted publickey for zuul from 192.168.122.30 port 36066 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:22:55 np0005538515.localdomain systemd-logind[763]: New session 51 of user zuul.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: Started Session 51 of User zuul.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 0...
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Activating special unit Exit the Session...
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped target Main User Target.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped target Basic System.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped target Paths.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped target Sockets.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped target Timers.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Closed D-Bus User Message Bus Socket.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Removed slice User Application Slice.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Reached target Shutdown.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Finished Exit the Session.
Nov 28 09:22:55 np0005538515.localdomain systemd[152755]: Reached target Exit the Session.
Nov 28 09:22:55 np0005538515.localdomain sshd[153125]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: user@0.service: Deactivated successfully.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 0.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 09:22:55 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 0.
Nov 28 09:22:56 np0005538515.localdomain python3.9[153219]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:22:57 np0005538515.localdomain sudo[153313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmijtewmlikcbzzfnsdyrepxagdrmkgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321777.15838-64-148925594172793/AnsiballZ_file.py
Nov 28 09:22:57 np0005538515.localdomain sudo[153313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54857 DF PROTO=TCP SPT=59168 DPT=9105 SEQ=4196639421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC94AC30000000001030307) 
Nov 28 09:22:57 np0005538515.localdomain python3.9[153315]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:57 np0005538515.localdomain sudo[153313]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:58 np0005538515.localdomain sudo[153405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsulnwlslefbydmrwmbkyptuwocfyfgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321777.9360974-64-5546965283978/AnsiballZ_file.py
Nov 28 09:22:58 np0005538515.localdomain sudo[153405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:58 np0005538515.localdomain python3.9[153407]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:58 np0005538515.localdomain sudo[153405]: pam_unix(sudo:session): session closed for user root
Nov 28 09:22:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54858 DF PROTO=TCP SPT=59168 DPT=9105 SEQ=4196639421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC94EBB0000000001030307) 
Nov 28 09:22:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34380 DF PROTO=TCP SPT=45790 DPT=9100 SEQ=3977660164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC94EFB0000000001030307) 
Nov 28 09:22:59 np0005538515.localdomain sudo[153497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwdemtszklazfaoqmrzzopbgyncmmhfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321778.520495-64-276479728113597/AnsiballZ_file.py
Nov 28 09:22:59 np0005538515.localdomain sudo[153497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:22:59 np0005538515.localdomain python3.9[153499]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:22:59 np0005538515.localdomain sudo[153497]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:00 np0005538515.localdomain sudo[153589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aryrgqxgvevlgtpavquulvmtonuueblq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321779.7997444-64-15807543786102/AnsiballZ_file.py
Nov 28 09:23:00 np0005538515.localdomain sudo[153589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:00 np0005538515.localdomain python3.9[153591]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:00 np0005538515.localdomain sudo[153589]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:00 np0005538515.localdomain sudo[153681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqazcqqrafrwuwfmiemihotclvnbgxnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321780.5501478-64-142155173387079/AnsiballZ_file.py
Nov 28 09:23:00 np0005538515.localdomain sudo[153681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:00 np0005538515.localdomain python3.9[153683]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:01 np0005538515.localdomain sudo[153681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36282 DF PROTO=TCP SPT=56778 DPT=9105 SEQ=2406824212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC95AFA0000000001030307) 
Nov 28 09:23:01 np0005538515.localdomain python3.9[153773]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:23:02 np0005538515.localdomain sshd[153820]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:23:02 np0005538515.localdomain sudo[153864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehciglfqojxdwwvympoqdcmyudtoiacz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321781.9676409-196-239097218049504/AnsiballZ_seboolean.py
Nov 28 09:23:02 np0005538515.localdomain sudo[153864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:02 np0005538515.localdomain sshd[153820]: error: kex_exchange_identification: Connection closed by remote host
Nov 28 09:23:02 np0005538515.localdomain sshd[153820]: Connection closed by 218.8.225.25 port 48154
Nov 28 09:23:02 np0005538515.localdomain python3.9[153866]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 09:23:02 np0005538515.localdomain sudo[153864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:03 np0005538515.localdomain python3.9[153957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:04 np0005538515.localdomain python3.9[154030]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321782.9724538-221-64914217819856/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54860 DF PROTO=TCP SPT=59168 DPT=9105 SEQ=4196639421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9667A0000000001030307) 
Nov 28 09:23:04 np0005538515.localdomain python3.9[154120]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:05 np0005538515.localdomain python3.9[154193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321784.3687816-265-149231866377358/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:05 np0005538515.localdomain sudo[154283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yofrlocpsqgwkhnuivjbxkwryaduexop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321785.6163518-316-207844295220449/AnsiballZ_setup.py
Nov 28 09:23:05 np0005538515.localdomain sudo[154283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:06 np0005538515.localdomain python3.9[154285]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:23:06 np0005538515.localdomain sudo[154283]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:06 np0005538515.localdomain sudo[154337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyirzzsexfotzlywjoiuxhendxqyckqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321785.6163518-316-207844295220449/AnsiballZ_dnf.py
Nov 28 09:23:06 np0005538515.localdomain sudo[154337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:07 np0005538515.localdomain python3.9[154339]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:23:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18546 DF PROTO=TCP SPT=57534 DPT=9102 SEQ=2894442272 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC977FA0000000001030307) 
Nov 28 09:23:10 np0005538515.localdomain sudo[154337]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:10 np0005538515.localdomain sudo[154431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daxqcvulainjvlpywhpwbpkcmjfrcswz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321790.306313-353-243829790369056/AnsiballZ_systemd.py
Nov 28 09:23:10 np0005538515.localdomain sudo[154431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:11 np0005538515.localdomain python3.9[154433]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:23:11 np0005538515.localdomain sudo[154431]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:11 np0005538515.localdomain python3.9[154526]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:12 np0005538515.localdomain python3.9[154597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321791.4755554-376-26151904617266/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54861 DF PROTO=TCP SPT=59168 DPT=9105 SEQ=4196639421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC986FA0000000001030307) 
Nov 28 09:23:13 np0005538515.localdomain python3.9[154687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:13 np0005538515.localdomain python3.9[154758]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321792.7528632-376-173464192714396/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15772 DF PROTO=TCP SPT=34894 DPT=9101 SEQ=2655486233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC98BC10000000001030307) 
Nov 28 09:23:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:23:14 np0005538515.localdomain python3.9[154848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:14 np0005538515.localdomain systemd[1]: tmp-crun.Hq7D4S.mount: Deactivated successfully.
Nov 28 09:23:14 np0005538515.localdomain podman[154849]: 2025-11-28 09:23:14.986515961 +0000 UTC m=+0.091489883 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:23:14 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:23:14Z|00023|memory|INFO|13040 kB peak resident set size after 30.1 seconds
Nov 28 09:23:14 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:23:14Z|00024|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:813 ofctrl_desired_flow_usage-KB:9 ofctrl_installed_flow_usage-KB:7 ofctrl_sb_flow_ref_usage-KB:3
Nov 28 09:23:15 np0005538515.localdomain podman[154849]: 2025-11-28 09:23:15.027425128 +0000 UTC m=+0.132399030 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:23:15 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:23:15 np0005538515.localdomain python3.9[154944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321794.4773312-508-223363253615347/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:16 np0005538515.localdomain python3.9[155034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32106 DF PROTO=TCP SPT=41286 DPT=9100 SEQ=3622792034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC994FA0000000001030307) 
Nov 28 09:23:16 np0005538515.localdomain python3.9[155105]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321795.6301637-508-148679564908690/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:17 np0005538515.localdomain python3.9[155195]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:23:17 np0005538515.localdomain sudo[155287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbgtawucvnzwtvsqkaqyurtxwyxnkrii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321797.4242942-622-227495595864365/AnsiballZ_file.py
Nov 28 09:23:17 np0005538515.localdomain sudo[155287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:17 np0005538515.localdomain python3.9[155289]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:17 np0005538515.localdomain sudo[155287]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:18 np0005538515.localdomain sudo[155379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouqognfrwmpdirbnijentcsyvopomlmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321798.5861742-646-233511795362296/AnsiballZ_stat.py
Nov 28 09:23:18 np0005538515.localdomain sudo[155379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:19 np0005538515.localdomain python3.9[155381]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:19 np0005538515.localdomain sudo[155379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:19 np0005538515.localdomain sudo[155427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfdijxohdvwtuwtbzqejgsbzyuoqjphp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321798.5861742-646-233511795362296/AnsiballZ_file.py
Nov 28 09:23:19 np0005538515.localdomain sudo[155427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:19 np0005538515.localdomain python3.9[155429]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:19 np0005538515.localdomain sudo[155427]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:19 np0005538515.localdomain sudo[155519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyzfcoywdghwxvpofvongpevtcwfzvje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321799.6929536-646-26239876547160/AnsiballZ_stat.py
Nov 28 09:23:19 np0005538515.localdomain sudo[155519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:20 np0005538515.localdomain python3.9[155521]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:20 np0005538515.localdomain sudo[155519]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:20 np0005538515.localdomain sudo[155567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjirocskkopxourwamncihrnuafkhqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321799.6929536-646-26239876547160/AnsiballZ_file.py
Nov 28 09:23:20 np0005538515.localdomain sudo[155567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32107 DF PROTO=TCP SPT=41286 DPT=9100 SEQ=3622792034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9A4BA0000000001030307) 
Nov 28 09:23:20 np0005538515.localdomain python3.9[155569]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:20 np0005538515.localdomain sudo[155567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:21 np0005538515.localdomain sudo[155659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbbelfpqitqbqwmjmsqaxetvovvvzchi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321801.3926058-715-144550037925109/AnsiballZ_file.py
Nov 28 09:23:21 np0005538515.localdomain sudo[155659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:21 np0005538515.localdomain python3.9[155661]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:21 np0005538515.localdomain sudo[155659]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:22 np0005538515.localdomain sudo[155751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqzwshyzfqhduxbeqjlpzyztgjsxobrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321802.0734222-739-210859284989074/AnsiballZ_stat.py
Nov 28 09:23:22 np0005538515.localdomain sudo[155751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:22 np0005538515.localdomain python3.9[155753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:22 np0005538515.localdomain sudo[155751]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:22 np0005538515.localdomain sudo[155799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trbngaueomxvtyicwswflkgcwgpuuvna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321802.0734222-739-210859284989074/AnsiballZ_file.py
Nov 28 09:23:22 np0005538515.localdomain sudo[155799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:22 np0005538515.localdomain python3.9[155801]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:22 np0005538515.localdomain sudo[155799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:23 np0005538515.localdomain sudo[155891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grvaznwkryqdstpgskbnvqujccjtzcjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321803.242232-775-14690288587409/AnsiballZ_stat.py
Nov 28 09:23:23 np0005538515.localdomain sudo[155891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:23 np0005538515.localdomain python3.9[155893]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:23 np0005538515.localdomain sudo[155891]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:23 np0005538515.localdomain sudo[155939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enagwcddxytfftwrechbmbqttgtekgdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321803.242232-775-14690288587409/AnsiballZ_file.py
Nov 28 09:23:23 np0005538515.localdomain sudo[155939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:24 np0005538515.localdomain python3.9[155941]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:24 np0005538515.localdomain sudo[155939]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:24 np0005538515.localdomain sudo[156031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogqhwcmdirkmuowfofqllpmfmtwjcegr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321804.3533337-811-251299796111781/AnsiballZ_systemd.py
Nov 28 09:23:24 np0005538515.localdomain sudo[156031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:24 np0005538515.localdomain python3.9[156033]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:23:24 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:23:25 np0005538515.localdomain systemd-rc-local-generator[156061]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:25 np0005538515.localdomain systemd-sysv-generator[156064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:25 np0005538515.localdomain sudo[156031]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:25 np0005538515.localdomain sudo[156161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghzkmpdqwniqtzbhchrcusfytgkpftlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321805.5188775-835-78210885396398/AnsiballZ_stat.py
Nov 28 09:23:25 np0005538515.localdomain sudo[156161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:26 np0005538515.localdomain python3.9[156163]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:26 np0005538515.localdomain sudo[156161]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:26 np0005538515.localdomain sudo[156209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmmyvjmsllsifcpmkotxjqipfwuzdtkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321805.5188775-835-78210885396398/AnsiballZ_file.py
Nov 28 09:23:26 np0005538515.localdomain sudo[156209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:26 np0005538515.localdomain python3.9[156211]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:26 np0005538515.localdomain sudo[156209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:26 np0005538515.localdomain sudo[156301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myeajwlqlsgzkyxndbfzdevrkpxwsrta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321806.637772-871-160966418989675/AnsiballZ_stat.py
Nov 28 09:23:26 np0005538515.localdomain sudo[156301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:27 np0005538515.localdomain python3.9[156303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:27 np0005538515.localdomain sudo[156301]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:27 np0005538515.localdomain sudo[156349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqoetyoxihsyflzrhxbqtlnvlmkaamvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321806.637772-871-160966418989675/AnsiballZ_file.py
Nov 28 09:23:27 np0005538515.localdomain sudo[156349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:27 np0005538515.localdomain python3.9[156351]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:27 np0005538515.localdomain sudo[156349]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1569 DF PROTO=TCP SPT=38762 DPT=9105 SEQ=3189010300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9BFF30000000001030307) 
Nov 28 09:23:28 np0005538515.localdomain sudo[156441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaaarnaxhguuxzdamaqruwmbqxmmtstl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321807.7827582-908-270256643439593/AnsiballZ_systemd.py
Nov 28 09:23:28 np0005538515.localdomain sudo[156441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:28 np0005538515.localdomain python3.9[156443]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:23:28 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:23:28 np0005538515.localdomain systemd-sysv-generator[156473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:28 np0005538515.localdomain systemd-rc-local-generator[156468]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1570 DF PROTO=TCP SPT=38762 DPT=9105 SEQ=3189010300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9C3FA0000000001030307) 
Nov 28 09:23:28 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:23:28 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:23:28 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:23:28 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:23:28 np0005538515.localdomain sudo[156441]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32108 DF PROTO=TCP SPT=41286 DPT=9100 SEQ=3622792034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9C4FA0000000001030307) 
Nov 28 09:23:29 np0005538515.localdomain sudo[156515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:23:29 np0005538515.localdomain sudo[156515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:29 np0005538515.localdomain sudo[156515]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:29 np0005538515.localdomain sudo[156549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:23:29 np0005538515.localdomain sudo[156549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:29 np0005538515.localdomain sudo[156607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqsukerqakmjkqewzgolzchoiuwioloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321809.1466799-937-257143423622960/AnsiballZ_file.py
Nov 28 09:23:29 np0005538515.localdomain sudo[156607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:29 np0005538515.localdomain python3.9[156609]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:29 np0005538515.localdomain sudo[156607]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538515.localdomain sudo[156778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svnrfkztpdwdmqeillzkqoogpxxndtnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321809.846758-961-56186497430519/AnsiballZ_stat.py
Nov 28 09:23:30 np0005538515.localdomain sudo[156778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:30 np0005538515.localdomain podman[156757]: 2025-11-28 09:23:30.150592124 +0000 UTC m=+0.091022669 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.openshift.expose-services=, release=553, CEPH_POINT_RELEASE=, vcs-type=git, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:23:30 np0005538515.localdomain podman[156757]: 2025-11-28 09:23:30.261552328 +0000 UTC m=+0.201982873 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Nov 28 09:23:30 np0005538515.localdomain python3.9[156783]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:30 np0005538515.localdomain sudo[156778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538515.localdomain sudo[156549]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538515.localdomain sudo[156892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:23:30 np0005538515.localdomain sudo[156892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:30 np0005538515.localdomain sudo[156892]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:30 np0005538515.localdomain sudo[156922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pywbnotapjnllcygdiqomlhifyaiqwjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321809.846758-961-56186497430519/AnsiballZ_copy.py
Nov 28 09:23:30 np0005538515.localdomain sudo[156922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:30 np0005538515.localdomain sudo[156925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:23:30 np0005538515.localdomain sudo[156925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:30 np0005538515.localdomain python3.9[156929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321809.846758-961-56186497430519/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:30 np0005538515.localdomain sudo[156922]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:31 np0005538515.localdomain sudo[156925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:31 np0005538515.localdomain sudo[157020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:23:31 np0005538515.localdomain sudo[157020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:23:31 np0005538515.localdomain sudo[157020]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24346 DF PROTO=TCP SPT=37990 DPT=9882 SEQ=4210979322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9D13B0000000001030307) 
Nov 28 09:23:32 np0005538515.localdomain sudo[157078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oafmbijmgxkojfswaporndnqhlihmngn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321811.3361967-1013-68515298210155/AnsiballZ_file.py
Nov 28 09:23:32 np0005538515.localdomain sudo[157078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:32 np0005538515.localdomain python3.9[157080]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:23:32 np0005538515.localdomain sudo[157078]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:33 np0005538515.localdomain sudo[157170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfeminzeginutrejuyiyyxntfmidcljw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321813.0384083-1036-165777245459823/AnsiballZ_stat.py
Nov 28 09:23:33 np0005538515.localdomain sudo[157170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:33 np0005538515.localdomain python3.9[157172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:23:33 np0005538515.localdomain sudo[157170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:33 np0005538515.localdomain sudo[157245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktjklkjreqveezjpgdcwvfmzhbzsybuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321813.0384083-1036-165777245459823/AnsiballZ_copy.py
Nov 28 09:23:33 np0005538515.localdomain sudo[157245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:34 np0005538515.localdomain python3.9[157247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321813.0384083-1036-165777245459823/.source.json _original_basename=.lrkd_gpb follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:34 np0005538515.localdomain sudo[157245]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1572 DF PROTO=TCP SPT=38762 DPT=9105 SEQ=3189010300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9DBBA0000000001030307) 
Nov 28 09:23:34 np0005538515.localdomain sudo[157337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flljzbitrjmeokvjgzcmlwwzjteqhhfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321814.5074167-1081-43492955143588/AnsiballZ_file.py
Nov 28 09:23:34 np0005538515.localdomain sudo[157337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:34 np0005538515.localdomain python3.9[157339]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:34 np0005538515.localdomain sudo[157337]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:35 np0005538515.localdomain sudo[157429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmzycejqqrvbhsmkdaiuhcyndiotdqjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321815.1699133-1105-168911026210256/AnsiballZ_stat.py
Nov 28 09:23:35 np0005538515.localdomain sudo[157429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:35 np0005538515.localdomain sudo[157429]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:35 np0005538515.localdomain sudo[157502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ururkpzecwhnvjobksfyilustajeavwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321815.1699133-1105-168911026210256/AnsiballZ_copy.py
Nov 28 09:23:35 np0005538515.localdomain sudo[157502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:36 np0005538515.localdomain sudo[157502]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:36 np0005538515.localdomain sudo[157594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcsxkvsrvhrsfemljqtoijsuezhqlzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321816.4588256-1157-93440690388311/AnsiballZ_container_config_data.py
Nov 28 09:23:36 np0005538515.localdomain sudo[157594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:37 np0005538515.localdomain python3.9[157596]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 28 09:23:37 np0005538515.localdomain sudo[157594]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:37 np0005538515.localdomain sudo[157686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgwrkzhmooxfmewyqhdjaxyzxbvulmoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321817.2432833-1183-229824383993208/AnsiballZ_container_config_hash.py
Nov 28 09:23:37 np0005538515.localdomain sudo[157686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:37 np0005538515.localdomain python3.9[157688]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:23:37 np0005538515.localdomain sudo[157686]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:38 np0005538515.localdomain sudo[157778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioeqtsmerctixzkuvxorpcjqddtmkuaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321818.159277-1210-62639276791666/AnsiballZ_podman_container_info.py
Nov 28 09:23:38 np0005538515.localdomain sudo[157778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:38 np0005538515.localdomain python3.9[157780]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:23:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48921 DF PROTO=TCP SPT=44214 DPT=9102 SEQ=1344649797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9ED3A0000000001030307) 
Nov 28 09:23:39 np0005538515.localdomain sudo[157778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:42 np0005538515.localdomain sudo[157895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzvopcedxpduhocefhpgysmsfitditdb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764321822.094501-1249-165523549143910/AnsiballZ_edpm_container_manage.py
Nov 28 09:23:42 np0005538515.localdomain sudo[157895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:42 np0005538515.localdomain python3[157897]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:23:43 np0005538515.localdomain python3[157897]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071",
                                                                    "Digest": "sha256:2b8255d3a22035616e569dbe22862a2560e15cdaefedae0059a354d558788e1e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2b8255d3a22035616e569dbe22862a2560e15cdaefedae0059a354d558788e1e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:34:14.989876147Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784145152,
                                                                    "VirtualSize": 784145152,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048/diff:/var/lib/containers/storage/overlay/47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:bc63f71478d9d90db803b468b28e5d9e0268adbace958b608ab10bd0819798bd",
                                                                              "sha256:3277562ff4450bdcd859dd0b0be874b10dd6f3502be711d42aab9ff44a85cf28",
                                                                              "sha256:982219792b3d83fa04ae12d0161dd3b982e7e3ed68293e6c876d50161b73746b"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:20:42.438406248Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:21:24.54454259Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:45.350498288Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:45.889263301Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:46.291004499Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:25.184071037Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:32:10.991588202Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:33:28.900936438Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:33:33.145210084Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:33:40.18160951Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:14.986660399Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:14.986745051Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:18.63064752Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:23:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1573 DF PROTO=TCP SPT=38762 DPT=9105 SEQ=3189010300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AC9FCFA0000000001030307) 
Nov 28 09:23:43 np0005538515.localdomain podman[157946]: 2025-11-28 09:23:43.241997241 +0000 UTC m=+0.083076403 container remove e1c70e5c2d14d8fb586d5c91ffbea685245645bc934766806b986b70d0aebd47 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08c21dad54d1ba598c6e2fae6b853aba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 28 09:23:43 np0005538515.localdomain python3[157897]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Nov 28 09:23:43 np0005538515.localdomain podman[157960]: 
Nov 28 09:23:43 np0005538515.localdomain podman[157960]: 2025-11-28 09:23:43.349268433 +0000 UTC m=+0.087902164 container create b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:23:43 np0005538515.localdomain podman[157960]: 2025-11-28 09:23:43.307328519 +0000 UTC m=+0.045962270 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:23:43 np0005538515.localdomain python3[157897]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 09:23:43 np0005538515.localdomain sudo[157895]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:44 np0005538515.localdomain sudo[158084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhbcxmlygvueknqsapmtobrgbhfggckn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321823.7315497-1273-215750305214370/AnsiballZ_stat.py
Nov 28 09:23:44 np0005538515.localdomain sudo[158084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:44 np0005538515.localdomain python3.9[158086]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:23:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32614 DF PROTO=TCP SPT=36358 DPT=9101 SEQ=904541868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA00F00000000001030307) 
Nov 28 09:23:44 np0005538515.localdomain sudo[158084]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:45 np0005538515.localdomain sudo[158178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiaehoxhwewqeckkxezjfxrilsydncjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321824.9006474-1300-154786451576715/AnsiballZ_file.py
Nov 28 09:23:45 np0005538515.localdomain sudo[158178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:23:45 np0005538515.localdomain podman[158181]: 2025-11-28 09:23:45.291254256 +0000 UTC m=+0.082951948 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 09:23:45 np0005538515.localdomain podman[158181]: 2025-11-28 09:23:45.330582463 +0000 UTC m=+0.122280155 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 09:23:45 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:23:45 np0005538515.localdomain python3.9[158180]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:45 np0005538515.localdomain sudo[158178]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:45 np0005538515.localdomain sudo[158248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiqmqfixvrshmhfsikscazzgxdhmifws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321824.9006474-1300-154786451576715/AnsiballZ_stat.py
Nov 28 09:23:45 np0005538515.localdomain sudo[158248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:45 np0005538515.localdomain python3.9[158250]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:23:45 np0005538515.localdomain sudo[158248]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16557 DF PROTO=TCP SPT=55620 DPT=9100 SEQ=1599959550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA0A3A0000000001030307) 
Nov 28 09:23:46 np0005538515.localdomain sudo[158339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iingelqlvijompjznsjnevvemvgexcdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321825.9085803-1300-208601423066950/AnsiballZ_copy.py
Nov 28 09:23:46 np0005538515.localdomain sudo[158339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:47 np0005538515.localdomain python3.9[158341]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764321825.9085803-1300-208601423066950/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:23:47 np0005538515.localdomain sudo[158339]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:47 np0005538515.localdomain sudo[158385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxlurfrufoxfwdswqelkuxxykluknivq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321825.9085803-1300-208601423066950/AnsiballZ_systemd.py
Nov 28 09:23:47 np0005538515.localdomain sudo[158385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:47 np0005538515.localdomain python3.9[158387]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:23:47 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:23:47 np0005538515.localdomain systemd-sysv-generator[158417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:47 np0005538515.localdomain systemd-rc-local-generator[158411]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:47 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:48 np0005538515.localdomain sudo[158385]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:48 np0005538515.localdomain sudo[158467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ospsnvvanptjulfgsknxisddsubdoczi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321825.9085803-1300-208601423066950/AnsiballZ_systemd.py
Nov 28 09:23:48 np0005538515.localdomain sudo[158467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:23:48 np0005538515.localdomain python3.9[158469]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:23:48 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:23:48 np0005538515.localdomain systemd-rc-local-generator[158494]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:48 np0005538515.localdomain systemd-sysv-generator[158500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:48 np0005538515.localdomain systemd[1]: Starting ovn_metadata_agent container...
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:23:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cf443e7ac417e059a5e66d88527948f9e0d9e4436d26552658ad1f69652f989/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:23:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cf443e7ac417e059a5e66d88527948f9e0d9e4436d26552658ad1f69652f989/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:23:49 np0005538515.localdomain podman[158511]: 2025-11-28 09:23:49.153834657 +0000 UTC m=+0.158990665 container init b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + sudo -E kolla_set_configs
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:23:49 np0005538515.localdomain podman[158511]: 2025-11-28 09:23:49.190706391 +0000 UTC m=+0.195862409 container start b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 09:23:49 np0005538515.localdomain edpm-start-podman-container[158511]: ovn_metadata_agent
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Validating config file
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Copying service configuration files
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Writing out command to execute
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: ++ cat /run_command
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + CMD=neutron-ovn-metadata-agent
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + ARGS=
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + sudo kolla_copy_cacerts
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + [[ ! -n '' ]]
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + . kolla_extend_start
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: Running command: 'neutron-ovn-metadata-agent'
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + umask 0022
Nov 28 09:23:49 np0005538515.localdomain ovn_metadata_agent[158525]: + exec neutron-ovn-metadata-agent
Nov 28 09:23:49 np0005538515.localdomain podman[158533]: 2025-11-28 09:23:49.383418814 +0000 UTC m=+0.184642443 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:23:49 np0005538515.localdomain edpm-start-podman-container[158510]: Creating additional drop-in dependency for "ovn_metadata_agent" (b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c)
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:23:49 np0005538515.localdomain podman[158533]: 2025-11-28 09:23:49.414473244 +0000 UTC m=+0.215696923 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Nov 28 09:23:49 np0005538515.localdomain systemd-sysv-generator[158602]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:23:49 np0005538515.localdomain systemd-rc-local-generator[158599]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:23:49 np0005538515.localdomain systemd[1]: Started ovn_metadata_agent container.
Nov 28 09:23:49 np0005538515.localdomain sudo[158467]: pam_unix(sudo:session): session closed for user root
Nov 28 09:23:50 np0005538515.localdomain sshd[153125]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:23:50 np0005538515.localdomain systemd[1]: session-51.scope: Deactivated successfully.
Nov 28 09:23:50 np0005538515.localdomain systemd[1]: session-51.scope: Consumed 31.676s CPU time.
Nov 28 09:23:50 np0005538515.localdomain systemd-logind[763]: Session 51 logged out. Waiting for processes to exit.
Nov 28 09:23:50 np0005538515.localdomain systemd-logind[763]: Removed session 51.
Nov 28 09:23:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16558 DF PROTO=TCP SPT=55620 DPT=9100 SEQ=1599959550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA19FA0000000001030307) 
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.762 158530 INFO neutron.common.config [-] Logging enabled!
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.762 158530 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.762 158530 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.763 158530 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.764 158530 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.765 158530 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.766 158530 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.767 158530 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.768 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.769 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.770 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.771 158530 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.772 158530 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.773 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.774 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.775 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.776 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.777 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.778 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.779 158530 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.780 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.781 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.782 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.783 158530 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.784 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.785 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.786 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.787 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.788 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.789 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.790 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.791 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.792 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.793 158530 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.793 158530 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.800 158530 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.801 158530 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.801 158530 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.801 158530 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.801 158530 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.826 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 62c03cad-89c1-4fd7-973b-8f2a608c71f1 (UUID: 62c03cad-89c1-4fd7-973b-8f2a608c71f1) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.849 158530 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.850 158530 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.850 158530 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.850 158530 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.853 158530 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.858 158530 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.872 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '62c03cad-89c1-4fd7-973b-8f2a608c71f1'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], external_ids={'neutron:ovn-metadata-id': '2d28d085-9d5e-537f-ab04-258862acbcc7', 'neutron:ovn-metadata-sb-cfg': '1'}, name=62c03cad-89c1-4fd7-973b-8f2a608c71f1, nb_cfg_timestamp=1764321773775, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.874 158530 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fd80e4a4b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.875 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.875 158530 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.875 158530 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.876 158530 INFO oslo_service.service [-] Starting 1 workers
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.879 158530 DEBUG oslo_service.service [-] Started child 158625 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.883 158530 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpvqp2nwoj/privsep.sock']
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.883 158625 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-240285'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.905 158625 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.906 158625 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.906 158625 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.910 158625 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.911 158625 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 09:23:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:50.924 158625 INFO eventlet.wsgi.server [-] (158625) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.482 158530 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.483 158530 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvqp2nwoj/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.377 158630 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.382 158630 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.386 158630 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.386 158630 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158630
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.485 158630 DEBUG oslo.privsep.daemon [-] privsep: reply[20642f77-97c6-4659-998d-442027c0c63c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.928 158630 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.928 158630 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:23:51 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:51.928 158630 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.358 158630 DEBUG oslo.privsep.daemon [-] privsep: reply[335438eb-9859-4825-8a15-8f73c62cde92]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.362 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, column=external_ids, values=({'neutron:ovn-metadata-id': '2d28d085-9d5e-537f-ab04-258862acbcc7'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.363 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.363 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.379 158530 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.379 158530 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.379 158530 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.379 158530 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.380 158530 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.380 158530 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.380 158530 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.380 158530 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.380 158530 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.380 158530 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.381 158530 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.381 158530 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.381 158530 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.381 158530 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.381 158530 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.381 158530 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.382 158530 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.383 158530 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.383 158530 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.383 158530 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.383 158530 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.383 158530 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.383 158530 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.384 158530 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.384 158530 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.384 158530 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.384 158530 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.384 158530 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.384 158530 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.385 158530 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.385 158530 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.385 158530 DEBUG oslo_service.service [-] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.385 158530 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.385 158530 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.385 158530 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.386 158530 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.387 158530 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.388 158530 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.389 158530 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.389 158530 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.389 158530 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.389 158530 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.389 158530 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.389 158530 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.390 158530 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.391 158530 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.392 158530 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.393 158530 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.393 158530 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.393 158530 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.393 158530 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.393 158530 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.394 158530 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.394 158530 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.394 158530 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.394 158530 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.394 158530 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.395 158530 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.396 158530 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.396 158530 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.396 158530 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.396 158530 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.397 158530 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.398 158530 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.398 158530 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.398 158530 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.398 158530 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.398 158530 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.398 158530 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.399 158530 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.400 158530 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.400 158530 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.400 158530 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.400 158530 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.400 158530 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.400 158530 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.401 158530 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.402 158530 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.403 158530 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.404 158530 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.404 158530 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.404 158530 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.404 158530 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.404 158530 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.404 158530 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.405 158530 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.405 158530 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.405 158530 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.405 158530 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.405 158530 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.405 158530 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.406 158530 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.406 158530 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.406 158530 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.406 158530 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.406 158530 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.406 158530 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.407 158530 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.407 158530 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.407 158530 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.407 158530 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.407 158530 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.407 158530 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.408 158530 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.409 158530 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.409 158530 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.409 158530 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.409 158530 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.409 158530 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.409 158530 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.410 158530 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.411 158530 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.412 158530 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.413 158530 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.414 158530 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.414 158530 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.414 158530 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.414 158530 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.414 158530 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.414 158530 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.415 158530 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.416 158530 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.416 158530 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.416 158530 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.416 158530 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.416 158530 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.416 158530 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.417 158530 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.418 158530 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.419 158530 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.420 158530 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.420 158530 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.420 158530 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.420 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.420 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.420 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.421 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.421 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.421 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.421 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.421 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.421 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.422 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.423 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.424 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.425 158530 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.425 158530 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.425 158530 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.425 158530 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.425 158530 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:23:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:23:52.425 158530 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:23:56 np0005538515.localdomain sshd[158635]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:23:56 np0005538515.localdomain sshd[158635]: Accepted publickey for zuul from 192.168.122.30 port 34326 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:23:57 np0005538515.localdomain systemd-logind[763]: New session 52 of user zuul.
Nov 28 09:23:57 np0005538515.localdomain systemd[1]: Started Session 52 of User zuul.
Nov 28 09:23:57 np0005538515.localdomain sshd[158635]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:23:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37491 DF PROTO=TCP SPT=36594 DPT=9105 SEQ=2108030151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA35230000000001030307) 
Nov 28 09:23:57 np0005538515.localdomain python3.9[158728]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:23:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37492 DF PROTO=TCP SPT=36594 DPT=9105 SEQ=2108030151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA393A0000000001030307) 
Nov 28 09:23:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41753 DF PROTO=TCP SPT=57798 DPT=9882 SEQ=3561430897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA3A4C0000000001030307) 
Nov 28 09:23:59 np0005538515.localdomain sudo[158822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boeubhkqvxlnwcigzgtozigyqsqrzddu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321838.6407506-64-82304572193505/AnsiballZ_command.py
Nov 28 09:23:59 np0005538515.localdomain sudo[158822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:00 np0005538515.localdomain python3.9[158824]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:00 np0005538515.localdomain sudo[158822]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:01 np0005538515.localdomain sudo[158927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozuoovmcyqnkhuhkkkavigjumyfdzqcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321841.1016848-89-155804599215524/AnsiballZ_command.py
Nov 28 09:24:01 np0005538515.localdomain sudo[158927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:01 np0005538515.localdomain python3.9[158929]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54863 DF PROTO=TCP SPT=59168 DPT=9105 SEQ=4196639421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA44FA0000000001030307) 
Nov 28 09:24:01 np0005538515.localdomain systemd[1]: libpod-f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b.scope: Deactivated successfully.
Nov 28 09:24:01 np0005538515.localdomain podman[158930]: 2025-11-28 09:24:01.657112113 +0000 UTC m=+0.075663514 container died f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64)
Nov 28 09:24:01 np0005538515.localdomain systemd[1]: tmp-crun.HRXK5v.mount: Deactivated successfully.
Nov 28 09:24:01 np0005538515.localdomain podman[158930]: 2025-11-28 09:24:01.696523543 +0000 UTC m=+0.115074844 container cleanup f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 09:24:01 np0005538515.localdomain podman[158945]: 2025-11-28 09:24:01.723806897 +0000 UTC m=+0.056810093 container remove f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, tcib_managed=true, url=https://www.redhat.com, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 09:24:01 np0005538515.localdomain sudo[158927]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:01 np0005538515.localdomain systemd[1]: libpod-conmon-f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b.scope: Deactivated successfully.
Nov 28 09:24:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-6d65302dd3a585cea223ca3e05b9a858698ed3b54cf3bbf51971fe5feba8f16c-merged.mount: Deactivated successfully.
Nov 28 09:24:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f207e5b37e3f4ec55a88edcf4dbcbe5cbbc20fb4f3557998c461a11b61b3019b-userdata-shm.mount: Deactivated successfully.
Nov 28 09:24:02 np0005538515.localdomain sudo[159049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lglfjaaamlhnrikrkttibxuxiexgsdmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321842.1385827-119-8161489949168/AnsiballZ_systemd_service.py
Nov 28 09:24:02 np0005538515.localdomain sudo[159049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:02 np0005538515.localdomain python3.9[159051]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:24:02 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:24:03 np0005538515.localdomain systemd-sysv-generator[159082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:24:03 np0005538515.localdomain systemd-rc-local-generator[159076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:24:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:03 np0005538515.localdomain sudo[159049]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:04 np0005538515.localdomain python3.9[159177]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:24:04 np0005538515.localdomain network[159194]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:24:04 np0005538515.localdomain network[159195]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:24:04 np0005538515.localdomain network[159196]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:24:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37494 DF PROTO=TCP SPT=36594 DPT=9105 SEQ=2108030151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA50FA0000000001030307) 
Nov 28 09:24:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:08 np0005538515.localdomain sudo[159396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfifyebwjvmfqqzyztuwtiuklpuzxnop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321847.9955382-176-113171947271673/AnsiballZ_systemd_service.py
Nov 28 09:24:08 np0005538515.localdomain sudo[159396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:08 np0005538515.localdomain python3.9[159398]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:08 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:24:08 np0005538515.localdomain systemd-sysv-generator[159431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:24:08 np0005538515.localdomain systemd-rc-local-generator[159425]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:24:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:08 np0005538515.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Nov 28 09:24:08 np0005538515.localdomain sudo[159396]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40218 DF PROTO=TCP SPT=35690 DPT=9102 SEQ=1147369062 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA627A0000000001030307) 
Nov 28 09:24:10 np0005538515.localdomain sudo[159528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmvomhprrtdcdwbawjsvtfgovocjgjrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321849.0486033-176-245295378555404/AnsiballZ_systemd_service.py
Nov 28 09:24:10 np0005538515.localdomain sudo[159528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:10 np0005538515.localdomain python3.9[159530]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:10 np0005538515.localdomain sudo[159528]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:11 np0005538515.localdomain sudo[159621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwjtqvuvuscjdebgygskgzcshyzsqemk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321850.8628845-176-152018214972211/AnsiballZ_systemd_service.py
Nov 28 09:24:11 np0005538515.localdomain sudo[159621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:11 np0005538515.localdomain python3.9[159623]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:11 np0005538515.localdomain sudo[159621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:12 np0005538515.localdomain sudo[159714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqlogpnewoupelqeiobzpuzpgbaciodw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321851.5731416-176-48102242129183/AnsiballZ_systemd_service.py
Nov 28 09:24:12 np0005538515.localdomain sudo[159714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:12 np0005538515.localdomain python3.9[159716]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:12 np0005538515.localdomain sudo[159714]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37495 DF PROTO=TCP SPT=36594 DPT=9105 SEQ=2108030151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA70FA0000000001030307) 
Nov 28 09:24:13 np0005538515.localdomain sudo[159807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukoyalrhimbipwtrogmdttmcvrjraqje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321852.9367862-176-7909053783272/AnsiballZ_systemd_service.py
Nov 28 09:24:13 np0005538515.localdomain sudo[159807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:13 np0005538515.localdomain python3.9[159809]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:13 np0005538515.localdomain sudo[159807]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:13 np0005538515.localdomain sudo[159900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbiuajwdvryitapauwzivyrfluagoerb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321853.6565015-176-72825232984998/AnsiballZ_systemd_service.py
Nov 28 09:24:13 np0005538515.localdomain sudo[159900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42010 DF PROTO=TCP SPT=53020 DPT=9101 SEQ=2759722639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA76210000000001030307) 
Nov 28 09:24:14 np0005538515.localdomain python3.9[159902]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:14 np0005538515.localdomain sudo[159900]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:14 np0005538515.localdomain sudo[159993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlmrzuhlmqsfdqclcmzuedxiuhthmyaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321854.396901-176-181695721803817/AnsiballZ_systemd_service.py
Nov 28 09:24:14 np0005538515.localdomain sudo[159993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:14 np0005538515.localdomain python3.9[159995]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:24:14 np0005538515.localdomain sudo[159993]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:24:15 np0005538515.localdomain systemd[1]: tmp-crun.ARinvB.mount: Deactivated successfully.
Nov 28 09:24:15 np0005538515.localdomain podman[160011]: 2025-11-28 09:24:15.983405951 +0000 UTC m=+0.091234095 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:24:16 np0005538515.localdomain podman[160011]: 2025-11-28 09:24:16.024548869 +0000 UTC m=+0.132377013 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:24:16 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:24:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27152 DF PROTO=TCP SPT=48020 DPT=9100 SEQ=1198390150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA7F7A0000000001030307) 
Nov 28 09:24:16 np0005538515.localdomain sudo[160110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcbnbbkmbhjtzkulyenvjdqdosezwirs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321856.5015934-332-45142224016048/AnsiballZ_file.py
Nov 28 09:24:16 np0005538515.localdomain sudo[160110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:17 np0005538515.localdomain python3.9[160112]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:17 np0005538515.localdomain sudo[160110]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:17 np0005538515.localdomain sudo[160202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpfdlrfchswhybkgnfhsofyxzbuyjybm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321857.3051016-332-13488592451773/AnsiballZ_file.py
Nov 28 09:24:17 np0005538515.localdomain sudo[160202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:17 np0005538515.localdomain python3.9[160204]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:17 np0005538515.localdomain sudo[160202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:18 np0005538515.localdomain sudo[160294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjnqktnmkdssaandrtbqeovukixjmwgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321857.9124482-332-11578154468214/AnsiballZ_file.py
Nov 28 09:24:18 np0005538515.localdomain sudo[160294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:18 np0005538515.localdomain python3.9[160296]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:18 np0005538515.localdomain sudo[160294]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:18 np0005538515.localdomain sudo[160386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cothtpldkpjvgemigqoesgbrwqnqvqry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321858.5313087-332-205327523453485/AnsiballZ_file.py
Nov 28 09:24:18 np0005538515.localdomain sudo[160386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:19 np0005538515.localdomain python3.9[160388]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:19 np0005538515.localdomain sudo[160386]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:19 np0005538515.localdomain sudo[160478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sktnckekkfyzkhqhkevrdcatlxbveouk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321859.1276248-332-237794903343098/AnsiballZ_file.py
Nov 28 09:24:19 np0005538515.localdomain sudo[160478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:19 np0005538515.localdomain python3.9[160480]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:19 np0005538515.localdomain sudo[160478]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:24:19 np0005538515.localdomain podman[160553]: 2025-11-28 09:24:19.945038218 +0000 UTC m=+0.054452444 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:24:19 np0005538515.localdomain podman[160553]: 2025-11-28 09:24:19.947525981 +0000 UTC m=+0.056940207 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:24:19 np0005538515.localdomain sudo[160583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghheijkfjbnkmsyqfhtjidqxytivcydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321859.6991165-332-29973891461209/AnsiballZ_file.py
Nov 28 09:24:19 np0005538515.localdomain sudo[160583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:19 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:24:20 np0005538515.localdomain python3.9[160590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:20 np0005538515.localdomain sudo[160583]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:20 np0005538515.localdomain sudo[160680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkiyiciznflmjdbfkcdxdknebjaevjfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321860.2992678-332-131701296314708/AnsiballZ_file.py
Nov 28 09:24:20 np0005538515.localdomain sudo[160680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27153 DF PROTO=TCP SPT=48020 DPT=9100 SEQ=1198390150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACA8F3B0000000001030307) 
Nov 28 09:24:20 np0005538515.localdomain python3.9[160682]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:20 np0005538515.localdomain sudo[160680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:21 np0005538515.localdomain sudo[160772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdvyrkzzogocyhwytqxheevtpucbrfkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321861.0556865-481-218732874246094/AnsiballZ_file.py
Nov 28 09:24:21 np0005538515.localdomain sudo[160772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:21 np0005538515.localdomain python3.9[160774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:21 np0005538515.localdomain sudo[160772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:21 np0005538515.localdomain sudo[160864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrsbjuufeulmqahsdanuwerwpyyrwxmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321861.7310991-481-3493922304938/AnsiballZ_file.py
Nov 28 09:24:21 np0005538515.localdomain sudo[160864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:22 np0005538515.localdomain python3.9[160866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:22 np0005538515.localdomain sudo[160864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:22 np0005538515.localdomain sudo[160956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkxeggpyxpoucrlsszsnakveekujbavs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321862.3195515-481-216195720196350/AnsiballZ_file.py
Nov 28 09:24:22 np0005538515.localdomain sudo[160956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:22 np0005538515.localdomain python3.9[160958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:22 np0005538515.localdomain sudo[160956]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:23 np0005538515.localdomain sudo[161048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvwntuirhnlitwscvsricrkxpflrvbhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321862.8984022-481-60658799202838/AnsiballZ_file.py
Nov 28 09:24:23 np0005538515.localdomain sudo[161048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:23 np0005538515.localdomain python3.9[161050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:23 np0005538515.localdomain sudo[161048]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:23 np0005538515.localdomain sudo[161140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cccgmpouqejnwehdyknyeduxuglqlncy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321863.5086958-481-84544128485933/AnsiballZ_file.py
Nov 28 09:24:23 np0005538515.localdomain sudo[161140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:23 np0005538515.localdomain python3.9[161142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:23 np0005538515.localdomain sudo[161140]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:24 np0005538515.localdomain sudo[161232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixbvhbmeunbpdyoxuojbfirudbhxvzdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321864.162269-481-72247055582505/AnsiballZ_file.py
Nov 28 09:24:24 np0005538515.localdomain sudo[161232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:24 np0005538515.localdomain python3.9[161234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:24 np0005538515.localdomain sudo[161232]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:25 np0005538515.localdomain sudo[161324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhhaiqhdmfsffqqoqskmdgxhaflcetbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321864.7743726-481-245457379108694/AnsiballZ_file.py
Nov 28 09:24:25 np0005538515.localdomain sudo[161324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:25 np0005538515.localdomain python3.9[161326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:24:25 np0005538515.localdomain sudo[161324]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:25 np0005538515.localdomain sudo[161416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frweafcbdevcfmbqnyumqmgvbmdhmwio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321865.5771964-634-242532766499924/AnsiballZ_command.py
Nov 28 09:24:25 np0005538515.localdomain sudo[161416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:26 np0005538515.localdomain python3.9[161418]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:26 np0005538515.localdomain sudo[161416]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:26 np0005538515.localdomain python3.9[161510]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:24:27 np0005538515.localdomain sudo[161600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gufwrkaniqozkgrxjlwibwhlajcwubsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321867.2161963-688-89820008890050/AnsiballZ_systemd_service.py
Nov 28 09:24:27 np0005538515.localdomain sudo[161600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48537 DF PROTO=TCP SPT=44100 DPT=9105 SEQ=2256137115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAAA530000000001030307) 
Nov 28 09:24:27 np0005538515.localdomain python3.9[161602]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:24:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:24:27 np0005538515.localdomain systemd-rc-local-generator[161630]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:24:27 np0005538515.localdomain systemd-sysv-generator[161633]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:24:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:24:28 np0005538515.localdomain sudo[161600]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:28 np0005538515.localdomain sudo[161728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrpgxzlddsjuxoaugfthcmymxkyfrwsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321868.3519483-713-80606286626408/AnsiballZ_command.py
Nov 28 09:24:28 np0005538515.localdomain sudo[161728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48538 DF PROTO=TCP SPT=44100 DPT=9105 SEQ=2256137115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAAE7A0000000001030307) 
Nov 28 09:24:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27154 DF PROTO=TCP SPT=48020 DPT=9100 SEQ=1198390150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAAEFA0000000001030307) 
Nov 28 09:24:28 np0005538515.localdomain python3.9[161730]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:28 np0005538515.localdomain sudo[161728]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:29 np0005538515.localdomain sudo[161821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqfcnkymedkyrmmducqaspodcxeeqpva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321868.972042-713-135564342377775/AnsiballZ_command.py
Nov 28 09:24:29 np0005538515.localdomain sudo[161821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:29 np0005538515.localdomain python3.9[161823]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:29 np0005538515.localdomain sudo[161821]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:29 np0005538515.localdomain sudo[161914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kowhclabiqkxfifwewwsbwruarfjkvww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321869.6411347-713-220122406708547/AnsiballZ_command.py
Nov 28 09:24:29 np0005538515.localdomain sudo[161914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:30 np0005538515.localdomain python3.9[161916]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:31 np0005538515.localdomain sudo[161914]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:31 np0005538515.localdomain sudo[162007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbllhtnttaccqhcnzvdawfhcdodgyqej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321871.236538-713-257608162958566/AnsiballZ_command.py
Nov 28 09:24:31 np0005538515.localdomain sudo[162007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:31 np0005538515.localdomain python3.9[162009]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:31 np0005538515.localdomain sudo[162007]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1575 DF PROTO=TCP SPT=38762 DPT=9105 SEQ=3189010300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACABAFB0000000001030307) 
Nov 28 09:24:32 np0005538515.localdomain sudo[162070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:24:32 np0005538515.localdomain sudo[162070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:24:32 np0005538515.localdomain sudo[162070]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:32 np0005538515.localdomain sudo[162119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoqtmstuoimdiukqulltmqviyhabieef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321871.9039903-713-272997623253727/AnsiballZ_command.py
Nov 28 09:24:32 np0005538515.localdomain sudo[162119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:32 np0005538515.localdomain sudo[162113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:24:32 np0005538515.localdomain sudo[162113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:24:32 np0005538515.localdomain python3.9[162131]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:32 np0005538515.localdomain sudo[162119]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:32 np0005538515.localdomain sudo[162242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htstksgjjvbokbfndisfcyjtizvpsgdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321872.483407-713-193061680389615/AnsiballZ_command.py
Nov 28 09:24:32 np0005538515.localdomain sudo[162242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:32 np0005538515.localdomain sudo[162113]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:32 np0005538515.localdomain python3.9[162253]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:32 np0005538515.localdomain sudo[162242]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:33 np0005538515.localdomain sudo[162347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwksmbahqxwpqqiljzlgqjzyywkwxnfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321873.0803149-713-177520629566503/AnsiballZ_command.py
Nov 28 09:24:33 np0005538515.localdomain sudo[162347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:33 np0005538515.localdomain python3.9[162349]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:24:33 np0005538515.localdomain sudo[162350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:24:33 np0005538515.localdomain sudo[162350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:24:33 np0005538515.localdomain sudo[162350]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:33 np0005538515.localdomain sudo[162347]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48540 DF PROTO=TCP SPT=44100 DPT=9105 SEQ=2256137115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAC63B0000000001030307) 
Nov 28 09:24:35 np0005538515.localdomain sudo[162455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqnztzhnksvzgnkvwnsbwkrlkwtcgrkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321875.5348754-875-170257404973832/AnsiballZ_getent.py
Nov 28 09:24:35 np0005538515.localdomain sudo[162455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:36 np0005538515.localdomain python3.9[162457]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 28 09:24:36 np0005538515.localdomain sudo[162455]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:36 np0005538515.localdomain sudo[162548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-somoaqwmoauceqlfjcwrpqnygtssfujo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321876.3570004-899-53350944299074/AnsiballZ_group.py
Nov 28 09:24:36 np0005538515.localdomain sudo[162548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:36 np0005538515.localdomain python3.9[162550]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:24:36 np0005538515.localdomain groupadd[162551]: group added to /etc/group: name=libvirt, GID=42473
Nov 28 09:24:36 np0005538515.localdomain groupadd[162551]: group added to /etc/gshadow: name=libvirt
Nov 28 09:24:36 np0005538515.localdomain groupadd[162551]: new group: name=libvirt, GID=42473
Nov 28 09:24:37 np0005538515.localdomain sudo[162548]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:38 np0005538515.localdomain sudo[162646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqqsizmbwyghbpeqjywduvanqlumncyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321877.7206216-922-171794417745430/AnsiballZ_user.py
Nov 28 09:24:38 np0005538515.localdomain sudo[162646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:38 np0005538515.localdomain python3.9[162648]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538515.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 09:24:38 np0005538515.localdomain useradd[162650]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 28 09:24:38 np0005538515.localdomain sudo[162646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11996 DF PROTO=TCP SPT=34474 DPT=9102 SEQ=2891308655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAD77A0000000001030307) 
Nov 28 09:24:39 np0005538515.localdomain sudo[162746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwmvlojcsqjwjkortyttgzpjkuvuszbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321879.0093474-956-18737713837902/AnsiballZ_setup.py
Nov 28 09:24:39 np0005538515.localdomain sudo[162746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:39 np0005538515.localdomain python3.9[162748]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:24:39 np0005538515.localdomain sudo[162746]: pam_unix(sudo:session): session closed for user root
Nov 28 09:24:40 np0005538515.localdomain sudo[162800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cujjloeadfgfgiqgwfglmorrqcjlfzxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764321879.0093474-956-18737713837902/AnsiballZ_dnf.py
Nov 28 09:24:40 np0005538515.localdomain sudo[162800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:24:40 np0005538515.localdomain python3.9[162802]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:24:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48541 DF PROTO=TCP SPT=44100 DPT=9105 SEQ=2256137115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAE6FB0000000001030307) 
Nov 28 09:24:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35893 DF PROTO=TCP SPT=55554 DPT=9882 SEQ=3425997190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAEAFB0000000001030307) 
Nov 28 09:24:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46591 DF PROTO=TCP SPT=57010 DPT=9100 SEQ=729025881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACAF47B0000000001030307) 
Nov 28 09:24:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:24:46 np0005538515.localdomain podman[162872]: 2025-11-28 09:24:46.990205255 +0000 UTC m=+0.094799406 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:24:47 np0005538515.localdomain podman[162872]: 2025-11-28 09:24:47.027147406 +0000 UTC m=+0.131741567 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:24:47 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:24:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46592 DF PROTO=TCP SPT=57010 DPT=9100 SEQ=729025881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB043A0000000001030307) 
Nov 28 09:24:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:24:50.803 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:24:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:24:50.803 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:24:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:24:50.804 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:24:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:24:50 np0005538515.localdomain podman[162900]: 2025-11-28 09:24:50.967272188 +0000 UTC m=+0.077497645 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:24:51 np0005538515.localdomain podman[162900]: 2025-11-28 09:24:51.001445125 +0000 UTC m=+0.111670582 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:24:51 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:24:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37116 DF PROTO=TCP SPT=51108 DPT=9105 SEQ=3890080462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB1F830000000001030307) 
Nov 28 09:24:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37117 DF PROTO=TCP SPT=51108 DPT=9105 SEQ=3890080462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB237B0000000001030307) 
Nov 28 09:24:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33846 DF PROTO=TCP SPT=52758 DPT=9882 SEQ=1136204254 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB24AC0000000001030307) 
Nov 28 09:25:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37497 DF PROTO=TCP SPT=36594 DPT=9105 SEQ=2108030151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB2EFA0000000001030307) 
Nov 28 09:25:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37119 DF PROTO=TCP SPT=51108 DPT=9105 SEQ=3890080462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB3B3A0000000001030307) 
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  Converting 2746 SID table entries...
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:06 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3984 DF PROTO=TCP SPT=38716 DPT=9102 SEQ=2634108566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB4CBA0000000001030307) 
Nov 28 09:25:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37120 DF PROTO=TCP SPT=51108 DPT=9105 SEQ=3890080462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB5AFB0000000001030307) 
Nov 28 09:25:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16058 DF PROTO=TCP SPT=59640 DPT=9101 SEQ=4264250608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB60800000000001030307) 
Nov 28 09:25:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63914 DF PROTO=TCP SPT=41870 DPT=9100 SEQ=952836503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB69BA0000000001030307) 
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:16 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:17 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Nov 28 09:25:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:25:17 np0005538515.localdomain systemd[1]: tmp-crun.b4xROk.mount: Deactivated successfully.
Nov 28 09:25:18 np0005538515.localdomain podman[163942]: 2025-11-28 09:25:18.006764774 +0000 UTC m=+0.104459851 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:25:18 np0005538515.localdomain podman[163942]: 2025-11-28 09:25:18.066872596 +0000 UTC m=+0.164567673 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:25:18 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:25:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63915 DF PROTO=TCP SPT=41870 DPT=9100 SEQ=952836503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB797A0000000001030307) 
Nov 28 09:25:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:25:21 np0005538515.localdomain systemd[1]: tmp-crun.1phFrk.mount: Deactivated successfully.
Nov 28 09:25:21 np0005538515.localdomain podman[163967]: 2025-11-28 09:25:21.98976888 +0000 UTC m=+0.097930161 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:25:22 np0005538515.localdomain podman[163967]: 2025-11-28 09:25:22.023328468 +0000 UTC m=+0.131489749 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:25:22 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:25 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31717 DF PROTO=TCP SPT=47824 DPT=9105 SEQ=610611585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB94B30000000001030307) 
Nov 28 09:25:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31718 DF PROTO=TCP SPT=47824 DPT=9105 SEQ=610611585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB98BA0000000001030307) 
Nov 28 09:25:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63916 DF PROTO=TCP SPT=41870 DPT=9100 SEQ=952836503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACB98FA0000000001030307) 
Nov 28 09:25:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48543 DF PROTO=TCP SPT=44100 DPT=9105 SEQ=2256137115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBA4FB0000000001030307) 
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:33 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:33 np0005538515.localdomain sudo[164001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:25:33 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=22 res=1
Nov 28 09:25:33 np0005538515.localdomain sudo[164001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:25:33 np0005538515.localdomain sudo[164001]: pam_unix(sudo:session): session closed for user root
Nov 28 09:25:33 np0005538515.localdomain sudo[164019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:25:33 np0005538515.localdomain sudo[164019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:25:34 np0005538515.localdomain sudo[164019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:25:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31720 DF PROTO=TCP SPT=47824 DPT=9105 SEQ=610611585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBB07A0000000001030307) 
Nov 28 09:25:35 np0005538515.localdomain sudo[164075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:25:35 np0005538515.localdomain sudo[164075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:25:35 np0005538515.localdomain sudo[164075]: pam_unix(sudo:session): session closed for user root
Nov 28 09:25:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36254 DF PROTO=TCP SPT=50342 DPT=9102 SEQ=2546681035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBC1FB0000000001030307) 
Nov 28 09:25:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31721 DF PROTO=TCP SPT=47824 DPT=9105 SEQ=610611585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBD0FA0000000001030307) 
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:43 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33013 DF PROTO=TCP SPT=42454 DPT=9101 SEQ=2394468827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBD5B00000000001030307) 
Nov 28 09:25:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20738 DF PROTO=TCP SPT=46568 DPT=9100 SEQ=3501637710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBDEFA0000000001030307) 
Nov 28 09:25:48 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Nov 28 09:25:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:25:49 np0005538515.localdomain podman[164101]: 2025-11-28 09:25:48.99814718 +0000 UTC m=+0.093233621 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:25:49 np0005538515.localdomain podman[164101]: 2025-11-28 09:25:49.033775936 +0000 UTC m=+0.128862377 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:25:49 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:25:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20739 DF PROTO=TCP SPT=46568 DPT=9100 SEQ=3501637710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACBEEBA0000000001030307) 
Nov 28 09:25:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:25:50.804 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:25:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:25:50.805 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:25:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:25:50.805 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  Converting 2749 SID table entries...
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:25:51 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:25:51 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:25:51 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Nov 28 09:25:52 np0005538515.localdomain systemd-rc-local-generator[164160]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:25:52 np0005538515.localdomain systemd-sysv-generator[164165]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:25:52 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:25:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:25:52 np0005538515.localdomain podman[164172]: 2025-11-28 09:25:52.309248024 +0000 UTC m=+0.074253366 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:25:52 np0005538515.localdomain podman[164172]: 2025-11-28 09:25:52.322576154 +0000 UTC m=+0.087581536 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:25:52 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:25:52 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:25:52 np0005538515.localdomain systemd-rc-local-generator[164216]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:25:52 np0005538515.localdomain systemd-sysv-generator[164219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:25:52 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:25:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34853 DF PROTO=TCP SPT=37268 DPT=9105 SEQ=2440985153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC09E20000000001030307) 
Nov 28 09:25:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34854 DF PROTO=TCP SPT=37268 DPT=9105 SEQ=2440985153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC0DFA0000000001030307) 
Nov 28 09:25:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20740 DF PROTO=TCP SPT=46568 DPT=9100 SEQ=3501637710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC0EFA0000000001030307) 
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  Converting 2750 SID table entries...
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability open_perms=1
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability always_check_network=0
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 09:26:01 np0005538515.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 09:26:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34216 DF PROTO=TCP SPT=37558 DPT=9882 SEQ=3594735828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC1AFA0000000001030307) 
Nov 28 09:26:02 np0005538515.localdomain groupadd[164242]: group added to /etc/group: name=clevis, GID=985
Nov 28 09:26:02 np0005538515.localdomain groupadd[164242]: group added to /etc/gshadow: name=clevis
Nov 28 09:26:02 np0005538515.localdomain groupadd[164242]: new group: name=clevis, GID=985
Nov 28 09:26:02 np0005538515.localdomain useradd[164249]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 28 09:26:02 np0005538515.localdomain usermod[164259]: add 'clevis' to group 'tss'
Nov 28 09:26:02 np0005538515.localdomain usermod[164259]: add 'clevis' to shadow group 'tss'
Nov 28 09:26:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34856 DF PROTO=TCP SPT=37268 DPT=9105 SEQ=2440985153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC25BA0000000001030307) 
Nov 28 09:26:06 np0005538515.localdomain groupadd[164281]: group added to /etc/group: name=dnsmasq, GID=984
Nov 28 09:26:06 np0005538515.localdomain groupadd[164281]: group added to /etc/gshadow: name=dnsmasq
Nov 28 09:26:06 np0005538515.localdomain groupadd[164281]: new group: name=dnsmasq, GID=984
Nov 28 09:26:06 np0005538515.localdomain useradd[164288]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 28 09:26:06 np0005538515.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 09:26:06 np0005538515.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Reloading rules
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Collecting garbage unconditionally...
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Finished loading, compiling and executing 5 rules
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Reloading rules
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Collecting garbage unconditionally...
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 09:26:07 np0005538515.localdomain polkitd[1033]: Finished loading, compiling and executing 5 rules
Nov 28 09:26:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45125 DF PROTO=TCP SPT=45008 DPT=9102 SEQ=1571074448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC373A0000000001030307) 
Nov 28 09:26:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34857 DF PROTO=TCP SPT=37268 DPT=9105 SEQ=2440985153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC46FA0000000001030307) 
Nov 28 09:26:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14673 DF PROTO=TCP SPT=38030 DPT=9101 SEQ=505000131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC4AE10000000001030307) 
Nov 28 09:26:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23070 DF PROTO=TCP SPT=46586 DPT=9100 SEQ=4169613418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC543A0000000001030307) 
Nov 28 09:26:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:26:20 np0005538515.localdomain podman[164962]: 2025-11-28 09:26:20.023384607 +0000 UTC m=+0.118716434 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:26:20 np0005538515.localdomain podman[164962]: 2025-11-28 09:26:20.05693748 +0000 UTC m=+0.152269267 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 28 09:26:20 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:26:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23071 DF PROTO=TCP SPT=46586 DPT=9100 SEQ=4169613418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC63FA0000000001030307) 
Nov 28 09:26:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:26:22 np0005538515.localdomain podman[167173]: 2025-11-28 09:26:22.963183425 +0000 UTC m=+0.066557480 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 28 09:26:22 np0005538515.localdomain podman[167173]: 2025-11-28 09:26:22.996364625 +0000 UTC m=+0.099738700 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:26:23 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:26:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26048 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=4127241913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC7F130000000001030307) 
Nov 28 09:26:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26049 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=4127241913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC833A0000000001030307) 
Nov 28 09:26:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24605 DF PROTO=TCP SPT=44230 DPT=9882 SEQ=948264820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC843C0000000001030307) 
Nov 28 09:26:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31723 DF PROTO=TCP SPT=47824 DPT=9105 SEQ=610611585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC8EFA0000000001030307) 
Nov 28 09:26:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26051 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=4127241913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACC9AFA0000000001030307) 
Nov 28 09:26:35 np0005538515.localdomain sudo[176770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:26:35 np0005538515.localdomain sudo[176770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:26:35 np0005538515.localdomain sudo[176770]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:35 np0005538515.localdomain sudo[176868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:26:35 np0005538515.localdomain sudo[176868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:26:36 np0005538515.localdomain sudo[176868]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:37 np0005538515.localdomain sudo[177982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:26:37 np0005538515.localdomain sudo[177982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:26:37 np0005538515.localdomain sudo[177982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=542 DF PROTO=TCP SPT=54452 DPT=9102 SEQ=1090106761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCAC3A0000000001030307) 
Nov 28 09:26:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26052 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=4127241913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCBAFA0000000001030307) 
Nov 28 09:26:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60019 DF PROTO=TCP SPT=47406 DPT=9101 SEQ=2970992170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCC0100000000001030307) 
Nov 28 09:26:44 np0005538515.localdomain groupadd[181386]: group added to /etc/group: name=ceph, GID=167
Nov 28 09:26:44 np0005538515.localdomain groupadd[181386]: group added to /etc/gshadow: name=ceph
Nov 28 09:26:44 np0005538515.localdomain groupadd[181386]: new group: name=ceph, GID=167
Nov 28 09:26:44 np0005538515.localdomain useradd[181392]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 28 09:26:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11028 DF PROTO=TCP SPT=48582 DPT=9100 SEQ=2362422037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCC93B0000000001030307) 
Nov 28 09:26:47 np0005538515.localdomain sshd[117864]: Received signal 15; terminating.
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: Stopping OpenSSH server daemon...
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: sshd.service: Deactivated successfully.
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: Stopped OpenSSH server daemon.
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: Stopped target sshd-keygen.target.
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: Stopping sshd-keygen.target...
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 09:26:47 np0005538515.localdomain systemd[1]: Reached target sshd-keygen.target.
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: Starting OpenSSH server daemon...
Nov 28 09:26:48 np0005538515.localdomain sshd[182098]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:26:48 np0005538515.localdomain sshd[182098]: Server listening on 0.0.0.0 port 22.
Nov 28 09:26:48 np0005538515.localdomain sshd[182098]: Server listening on :: port 22.
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: Started OpenSSH server daemon.
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:49 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:26:50 np0005538515.localdomain systemd-sysv-generator[182368]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:50 np0005538515.localdomain systemd-rc-local-generator[182365]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 09:26:50 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:26:50 np0005538515.localdomain podman[182387]: 2025-11-28 09:26:50.425665365 +0000 UTC m=+0.063001006 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:26:50 np0005538515.localdomain podman[182387]: 2025-11-28 09:26:50.526983619 +0000 UTC m=+0.164319260 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 09:26:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11029 DF PROTO=TCP SPT=48582 DPT=9100 SEQ=2362422037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCD8FA0000000001030307) 
Nov 28 09:26:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:26:50.805 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:26:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:26:50.806 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:26:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:26:50.806 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:26:51 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:26:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:26:53 np0005538515.localdomain sudo[162800]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:53 np0005538515.localdomain systemd[1]: tmp-crun.wGygyL.mount: Deactivated successfully.
Nov 28 09:26:53 np0005538515.localdomain podman[186237]: 2025-11-28 09:26:53.243276067 +0000 UTC m=+0.101282635 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 28 09:26:53 np0005538515.localdomain podman[186237]: 2025-11-28 09:26:53.281609397 +0000 UTC m=+0.139615965 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 28 09:26:53 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:26:53 np0005538515.localdomain sudo[186963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikxduylgdhzrmvuxjhcnbvbljsjedqmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322013.447621-992-147482228475733/AnsiballZ_systemd.py
Nov 28 09:26:53 np0005538515.localdomain sudo[186963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:54 np0005538515.localdomain python3.9[187012]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:26:55 np0005538515.localdomain systemd-rc-local-generator[188123]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:55 np0005538515.localdomain systemd-sysv-generator[188126]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:55 np0005538515.localdomain sudo[186963]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:56 np0005538515.localdomain sudo[188395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqeedsppitlwforqgfzexomqydjbltle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322015.8745706-992-194577338213478/AnsiballZ_systemd.py
Nov 28 09:26:56 np0005538515.localdomain sudo[188395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:56 np0005538515.localdomain python3.9[188411]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:26:56 np0005538515.localdomain systemd-rc-local-generator[188602]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:56 np0005538515.localdomain systemd-sysv-generator[188608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:56 np0005538515.localdomain sudo[188395]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:57 np0005538515.localdomain sudo[189019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpomicsoedegnsbcausbzuiopjooldnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322017.0459065-992-245513874487715/AnsiballZ_systemd.py
Nov 28 09:26:57 np0005538515.localdomain sudo[189019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47820 DF PROTO=TCP SPT=33570 DPT=9105 SEQ=1927594134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCF4430000000001030307) 
Nov 28 09:26:57 np0005538515.localdomain python3.9[189035]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:26:57 np0005538515.localdomain systemd-rc-local-generator[189298]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:57 np0005538515.localdomain systemd-sysv-generator[189301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:58 np0005538515.localdomain sudo[189019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:58 np0005538515.localdomain sudo[189696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxvfurfdoilmttwkwvktpycglzyterdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322018.17766-992-40227803462292/AnsiballZ_systemd.py
Nov 28 09:26:58 np0005538515.localdomain sudo[189696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:26:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47821 DF PROTO=TCP SPT=33570 DPT=9105 SEQ=1927594134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCF83A0000000001030307) 
Nov 28 09:26:58 np0005538515.localdomain python3.9[189713]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:26:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11030 DF PROTO=TCP SPT=48582 DPT=9100 SEQ=2362422037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACCF8FB0000000001030307) 
Nov 28 09:26:58 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:26:58 np0005538515.localdomain systemd-rc-local-generator[189911]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:26:58 np0005538515.localdomain systemd-sysv-generator[189914]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:26:58 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:58 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:26:59 np0005538515.localdomain sudo[189696]: pam_unix(sudo:session): session closed for user root
Nov 28 09:26:59 np0005538515.localdomain sudo[190410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wokbfmuuknzwwlyrtolgufutfybkgxss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322019.459746-1078-152196914437111/AnsiballZ_systemd.py
Nov 28 09:26:59 np0005538515.localdomain sudo[190410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:00 np0005538515.localdomain python3.9[190430]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:27:01 np0005538515.localdomain systemd-rc-local-generator[191032]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:01 np0005538515.localdomain systemd-sysv-generator[191039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34859 DF PROTO=TCP SPT=37268 DPT=9105 SEQ=2440985153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD04FA0000000001030307) 
Nov 28 09:27:01 np0005538515.localdomain sudo[190410]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:02 np0005538515.localdomain sudo[191503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diegxvmumxiaptrxyypkscimdzaaemjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322022.0742018-1078-70081729859020/AnsiballZ_systemd.py
Nov 28 09:27:02 np0005538515.localdomain sudo[191503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:02 np0005538515.localdomain python3.9[191524]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:02 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:27:02 np0005538515.localdomain systemd-rc-local-generator[191646]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:02 np0005538515.localdomain systemd-sysv-generator[191650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:02 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:02 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:03 np0005538515.localdomain sudo[191503]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Consumed 15.148s CPU time.
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: run-r3a3807d29206469a80a9ed357c87018e.service: Deactivated successfully.
Nov 28 09:27:03 np0005538515.localdomain systemd[1]: run-r4e2e26ab81f5435480c307cabef7f63d.service: Deactivated successfully.
Nov 28 09:27:04 np0005538515.localdomain sudo[191880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpgarwxmsarzokjwbtyngdmhutxexmkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322023.7411585-1078-25939853702673/AnsiballZ_systemd.py
Nov 28 09:27:04 np0005538515.localdomain sudo[191880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:04 np0005538515.localdomain python3.9[191882]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47823 DF PROTO=TCP SPT=33570 DPT=9105 SEQ=1927594134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD0FFA0000000001030307) 
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:27:05 np0005538515.localdomain systemd-sysv-generator[191915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:05 np0005538515.localdomain systemd-rc-local-generator[191909]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:05 np0005538515.localdomain sudo[191880]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:06 np0005538515.localdomain sudo[192028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbsxhbirebguvxlgqhizwulcmqqfttst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322025.9240534-1078-232524000433092/AnsiballZ_systemd.py
Nov 28 09:27:06 np0005538515.localdomain sudo[192028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:06 np0005538515.localdomain python3.9[192030]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:06 np0005538515.localdomain sudo[192028]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:06 np0005538515.localdomain sudo[192141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkqkpibtqsvzkgyokjvtztqjjdtekhtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322026.7161386-1078-60893036961630/AnsiballZ_systemd.py
Nov 28 09:27:06 np0005538515.localdomain sudo[192141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:07 np0005538515.localdomain python3.9[192143]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:27:08 np0005538515.localdomain systemd-rc-local-generator[192170]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:08 np0005538515.localdomain systemd-sysv-generator[192176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:08 np0005538515.localdomain sudo[192141]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40460 DF PROTO=TCP SPT=41452 DPT=9102 SEQ=4005614768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD217A0000000001030307) 
Nov 28 09:27:09 np0005538515.localdomain sudo[192289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxrzfdwudxkhhadgsohcbffymcsxapfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322028.9213066-1187-185955271833361/AnsiballZ_systemd.py
Nov 28 09:27:09 np0005538515.localdomain sudo[192289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:09 np0005538515.localdomain python3.9[192291]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:27:09 np0005538515.localdomain systemd-sysv-generator[192323]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:27:09 np0005538515.localdomain systemd-rc-local-generator[192316]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:27:09 np0005538515.localdomain sudo[192289]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:10 np0005538515.localdomain sudo[192439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkxvrxfunldujvuoqrupmcoojngsjgfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322030.1386697-1210-26907097593391/AnsiballZ_systemd.py
Nov 28 09:27:10 np0005538515.localdomain sudo[192439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:10 np0005538515.localdomain python3.9[192441]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:11 np0005538515.localdomain sudo[192439]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:12 np0005538515.localdomain sudo[192552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxnszqqhzlijffwandfvpvircaulrimm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322031.9548635-1210-142005627270748/AnsiballZ_systemd.py
Nov 28 09:27:12 np0005538515.localdomain sudo[192552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:12 np0005538515.localdomain python3.9[192554]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47824 DF PROTO=TCP SPT=33570 DPT=9105 SEQ=1927594134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD30FA0000000001030307) 
Nov 28 09:27:13 np0005538515.localdomain sudo[192552]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30968 DF PROTO=TCP SPT=54654 DPT=9882 SEQ=1642028125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD34FA0000000001030307) 
Nov 28 09:27:14 np0005538515.localdomain sudo[192665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cddaddreuifwhznobspmvpblfrdmrhhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322034.0060387-1210-9576983542761/AnsiballZ_systemd.py
Nov 28 09:27:14 np0005538515.localdomain sudo[192665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:14 np0005538515.localdomain python3.9[192667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:14 np0005538515.localdomain sudo[192665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:15 np0005538515.localdomain sudo[192778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apjkejqmxmtjnebpzxyqbsyjmhizgsle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322034.8388493-1210-199151668190948/AnsiballZ_systemd.py
Nov 28 09:27:15 np0005538515.localdomain sudo[192778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:15 np0005538515.localdomain python3.9[192780]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:16 np0005538515.localdomain sudo[192778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12830 DF PROTO=TCP SPT=41304 DPT=9100 SEQ=34626681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD3E7A0000000001030307) 
Nov 28 09:27:16 np0005538515.localdomain sudo[192891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzvmrdmtrkctnsnsnfegdduklsgczojo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322036.633013-1210-279013750487099/AnsiballZ_systemd.py
Nov 28 09:27:16 np0005538515.localdomain sudo[192891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:17 np0005538515.localdomain python3.9[192893]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:18 np0005538515.localdomain sudo[192891]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:18 np0005538515.localdomain sudo[193004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahrkfclquirbgeabacaebpocvgjurswu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322038.460233-1210-60789784030561/AnsiballZ_systemd.py
Nov 28 09:27:18 np0005538515.localdomain sudo[193004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:19 np0005538515.localdomain python3.9[193006]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:20 np0005538515.localdomain sudo[193004]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12831 DF PROTO=TCP SPT=41304 DPT=9100 SEQ=34626681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD4E3A0000000001030307) 
Nov 28 09:27:20 np0005538515.localdomain sudo[193117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yaiqavmjbodepcdadzxgkfowonmtduim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322040.3481884-1210-115711224482724/AnsiballZ_systemd.py
Nov 28 09:27:20 np0005538515.localdomain sudo[193117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:20 np0005538515.localdomain python3.9[193119]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:21 np0005538515.localdomain sudo[193117]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:21 np0005538515.localdomain sudo[193230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aceleivzayugkbozlkkrrfnelpqxcyph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322041.176624-1210-151137337653390/AnsiballZ_systemd.py
Nov 28 09:27:21 np0005538515.localdomain sudo[193230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:21 np0005538515.localdomain python3.9[193232]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:27:21 np0005538515.localdomain sudo[193230]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:21 np0005538515.localdomain podman[193234]: 2025-11-28 09:27:21.906591429 +0000 UTC m=+0.091661227 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:27:21 np0005538515.localdomain podman[193234]: 2025-11-28 09:27:21.946429235 +0000 UTC m=+0.131499023 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:27:21 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:27:22 np0005538515.localdomain sudo[193366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gasoarfxasmnbnwogceyugltnggrzsdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322041.98585-1210-173666619590993/AnsiballZ_systemd.py
Nov 28 09:27:22 np0005538515.localdomain sudo[193366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:22 np0005538515.localdomain python3.9[193368]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:22 np0005538515.localdomain sudo[193366]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:23 np0005538515.localdomain sudo[193479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljnbfpuvtusgxytoxjzrtaptsdoqffmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322042.755224-1210-59799241619396/AnsiballZ_systemd.py
Nov 28 09:27:23 np0005538515.localdomain sudo[193479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:23 np0005538515.localdomain python3.9[193481]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:27:23 np0005538515.localdomain sudo[193479]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:23 np0005538515.localdomain podman[193483]: 2025-11-28 09:27:23.438174168 +0000 UTC m=+0.079395742 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 09:27:23 np0005538515.localdomain podman[193483]: 2025-11-28 09:27:23.471471948 +0000 UTC m=+0.112693532 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 09:27:23 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:27:23 np0005538515.localdomain sudo[193608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xihmwrpkqbosxohtcpyxcuowabmgnpyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322043.5467744-1210-239787850803384/AnsiballZ_systemd.py
Nov 28 09:27:23 np0005538515.localdomain sudo[193608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:24 np0005538515.localdomain python3.9[193610]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:24 np0005538515.localdomain sudo[193608]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:24 np0005538515.localdomain sudo[193721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjbxnaovyxdhqqwnjcrdwgivygcsmkxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322044.3158586-1210-3522300278848/AnsiballZ_systemd.py
Nov 28 09:27:24 np0005538515.localdomain sudo[193721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:24 np0005538515.localdomain python3.9[193723]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:24 np0005538515.localdomain sudo[193721]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:25 np0005538515.localdomain sudo[193834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkympkckhdqrdbchojwixsbqjacwtnqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322045.0915363-1210-76661777941298/AnsiballZ_systemd.py
Nov 28 09:27:25 np0005538515.localdomain sudo[193834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:25 np0005538515.localdomain python3.9[193836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:25 np0005538515.localdomain sudo[193834]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:26 np0005538515.localdomain sudo[193947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osvuvrnqcsccmjizoxhiyxvaatjojahe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322045.8750184-1210-203660996926361/AnsiballZ_systemd.py
Nov 28 09:27:26 np0005538515.localdomain sudo[193947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:26 np0005538515.localdomain python3.9[193949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 09:27:26 np0005538515.localdomain sudo[193947]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:27 np0005538515.localdomain sudo[194060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohiqsabiyjiqcznwxjmqggnastnrfmjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322046.9932048-1517-109151611318359/AnsiballZ_file.py
Nov 28 09:27:27 np0005538515.localdomain sudo[194060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:27 np0005538515.localdomain python3.9[194062]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:27 np0005538515.localdomain sudo[194060]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5139 DF PROTO=TCP SPT=51608 DPT=9105 SEQ=3037923953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD69730000000001030307) 
Nov 28 09:27:28 np0005538515.localdomain sudo[194170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwqhtkiicahfbrsralbkhyypoyspddhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322047.6156042-1517-253968211024045/AnsiballZ_file.py
Nov 28 09:27:28 np0005538515.localdomain sudo[194170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5140 DF PROTO=TCP SPT=51608 DPT=9105 SEQ=3037923953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD6D7A0000000001030307) 
Nov 28 09:27:28 np0005538515.localdomain python3.9[194172]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:28 np0005538515.localdomain sudo[194170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38384 DF PROTO=TCP SPT=51510 DPT=9882 SEQ=4238462862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD6E9C0000000001030307) 
Nov 28 09:27:29 np0005538515.localdomain sudo[194280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnzfevduoaaztqyjrtaxjyqbtohvdibe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322048.808351-1517-131548607937391/AnsiballZ_file.py
Nov 28 09:27:29 np0005538515.localdomain sudo[194280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:29 np0005538515.localdomain python3.9[194282]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:29 np0005538515.localdomain sudo[194280]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:29 np0005538515.localdomain sudo[194390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhqxxxmvbewyyxeyvtsabzmidxodsetn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322049.4648964-1517-59997578453597/AnsiballZ_file.py
Nov 28 09:27:29 np0005538515.localdomain sudo[194390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:29 np0005538515.localdomain python3.9[194392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:29 np0005538515.localdomain sudo[194390]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:31 np0005538515.localdomain sudo[194500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qilzlmuoamgsmbrftzhohftbdltikggb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322050.8317661-1517-107641651609319/AnsiballZ_file.py
Nov 28 09:27:31 np0005538515.localdomain sudo[194500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:31 np0005538515.localdomain python3.9[194502]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:31 np0005538515.localdomain sudo[194500]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26054 DF PROTO=TCP SPT=43022 DPT=9105 SEQ=4127241913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD78FB0000000001030307) 
Nov 28 09:27:31 np0005538515.localdomain sudo[194610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwsiovefwlriafuvcegfjgaqsntketfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322051.5133767-1517-149542431245381/AnsiballZ_file.py
Nov 28 09:27:31 np0005538515.localdomain sudo[194610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:32 np0005538515.localdomain python3.9[194612]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:27:32 np0005538515.localdomain sudo[194610]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:32 np0005538515.localdomain sudo[194720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-winleqvboyemyubiiqdtoxyroywuzvhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322052.2122705-1646-51385911595155/AnsiballZ_stat.py
Nov 28 09:27:32 np0005538515.localdomain sudo[194720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:32 np0005538515.localdomain python3.9[194722]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:32 np0005538515.localdomain sudo[194720]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:33 np0005538515.localdomain sudo[194810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycffcienrpetwcgrzsuxbxtuzziaybuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322052.2122705-1646-51385911595155/AnsiballZ_copy.py
Nov 28 09:27:33 np0005538515.localdomain sudo[194810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:33 np0005538515.localdomain python3.9[194812]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322052.2122705-1646-51385911595155/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:33 np0005538515.localdomain sudo[194810]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:33 np0005538515.localdomain sudo[194920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jispqqlkxagvhwdqbekcpjujeyvgdlhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322053.71599-1646-67231418976200/AnsiballZ_stat.py
Nov 28 09:27:33 np0005538515.localdomain sudo[194920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:34 np0005538515.localdomain python3.9[194922]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:34 np0005538515.localdomain sudo[194920]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:34 np0005538515.localdomain sudo[195010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkuopignafskjltcbydsepauvflmdpnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322053.71599-1646-67231418976200/AnsiballZ_copy.py
Nov 28 09:27:34 np0005538515.localdomain sudo[195010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5142 DF PROTO=TCP SPT=51608 DPT=9105 SEQ=3037923953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD853A0000000001030307) 
Nov 28 09:27:34 np0005538515.localdomain python3.9[195012]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322053.71599-1646-67231418976200/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:34 np0005538515.localdomain sudo[195010]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:35 np0005538515.localdomain sudo[195120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vocndpgmkhqhcnbjtaydmykvuteakkyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322054.9062426-1646-85290725561634/AnsiballZ_stat.py
Nov 28 09:27:35 np0005538515.localdomain sudo[195120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:35 np0005538515.localdomain python3.9[195122]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:35 np0005538515.localdomain sudo[195120]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:35 np0005538515.localdomain sudo[195210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viumeqyrfuizjlobsolllvvgtkqjtajx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322054.9062426-1646-85290725561634/AnsiballZ_copy.py
Nov 28 09:27:35 np0005538515.localdomain sudo[195210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:35 np0005538515.localdomain python3.9[195212]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322054.9062426-1646-85290725561634/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:35 np0005538515.localdomain sudo[195210]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:36 np0005538515.localdomain sudo[195320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjtldinrlxyodawhirqhlibbrxvtkxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322056.137119-1646-46329029484448/AnsiballZ_stat.py
Nov 28 09:27:36 np0005538515.localdomain sudo[195320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:36 np0005538515.localdomain python3.9[195322]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:36 np0005538515.localdomain sudo[195320]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:37 np0005538515.localdomain sudo[195410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrxuzesloqwzmbvjrauwgtcqlxxsclqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322056.137119-1646-46329029484448/AnsiballZ_copy.py
Nov 28 09:27:37 np0005538515.localdomain sudo[195410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:37 np0005538515.localdomain python3.9[195412]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322056.137119-1646-46329029484448/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:37 np0005538515.localdomain sudo[195410]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:37 np0005538515.localdomain sudo[195430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:27:37 np0005538515.localdomain sudo[195430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:27:37 np0005538515.localdomain sudo[195430]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:37 np0005538515.localdomain sudo[195470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:27:37 np0005538515.localdomain sudo[195470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:27:37 np0005538515.localdomain sudo[195556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vghcozqxvamywdjwvgypkzbzphecwdii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322057.395518-1646-153893200498097/AnsiballZ_stat.py
Nov 28 09:27:37 np0005538515.localdomain sudo[195556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:37 np0005538515.localdomain python3.9[195558]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:37 np0005538515.localdomain sudo[195556]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538515.localdomain sudo[195470]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538515.localdomain sudo[195678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-taonlvsidoreyeafxkumfcadyhwxyusb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322057.395518-1646-153893200498097/AnsiballZ_copy.py
Nov 28 09:27:38 np0005538515.localdomain sudo[195678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:38 np0005538515.localdomain python3.9[195680]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322057.395518-1646-153893200498097/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:38 np0005538515.localdomain sudo[195678]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:38 np0005538515.localdomain sudo[195793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cotgdljiaplrvosbveyzskpugyzhxche ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322058.6462526-1646-173100488185855/AnsiballZ_stat.py
Nov 28 09:27:38 np0005538515.localdomain sudo[195793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:38 np0005538515.localdomain sudo[195785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:27:38 np0005538515.localdomain sudo[195785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:27:38 np0005538515.localdomain sudo[195785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:39 np0005538515.localdomain python3.9[195807]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:39 np0005538515.localdomain sudo[195793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16943 DF PROTO=TCP SPT=43692 DPT=9102 SEQ=694514364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACD96BB0000000001030307) 
Nov 28 09:27:39 np0005538515.localdomain sudo[195896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oznwgulmwtfpaqtmmpnwvmbgwkrwmhqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322058.6462526-1646-173100488185855/AnsiballZ_copy.py
Nov 28 09:27:39 np0005538515.localdomain sudo[195896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:39 np0005538515.localdomain python3.9[195898]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322058.6462526-1646-173100488185855/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:39 np0005538515.localdomain sudo[195896]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:40 np0005538515.localdomain sudo[196006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aftbabhusdfnmkiurtzwlzjyyzwujcgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322059.7951365-1646-89436496203327/AnsiballZ_stat.py
Nov 28 09:27:40 np0005538515.localdomain sudo[196006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:40 np0005538515.localdomain python3.9[196008]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:40 np0005538515.localdomain sudo[196006]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:41 np0005538515.localdomain sudo[196094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjgcwygiykiayfzcphjyqjpznfvjpkqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322059.7951365-1646-89436496203327/AnsiballZ_copy.py
Nov 28 09:27:41 np0005538515.localdomain sudo[196094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:41 np0005538515.localdomain python3.9[196096]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322059.7951365-1646-89436496203327/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:41 np0005538515.localdomain sudo[196094]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:42 np0005538515.localdomain sudo[196204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmqwhbpyraprjzvnnyhpdrvjfpoiptob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322061.68807-1646-133046340239568/AnsiballZ_stat.py
Nov 28 09:27:42 np0005538515.localdomain sudo[196204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:42 np0005538515.localdomain python3.9[196206]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:42 np0005538515.localdomain sudo[196204]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:42 np0005538515.localdomain sudo[196294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wssqdgayqpwpspksuynlrlxvbokfitlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322061.68807-1646-133046340239568/AnsiballZ_copy.py
Nov 28 09:27:42 np0005538515.localdomain sudo[196294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5143 DF PROTO=TCP SPT=51608 DPT=9105 SEQ=3037923953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDA4FA0000000001030307) 
Nov 28 09:27:42 np0005538515.localdomain python3.9[196296]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322061.68807-1646-133046340239568/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:42 np0005538515.localdomain sudo[196294]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:43 np0005538515.localdomain sudo[196404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaqfcdjqjbygbhxxpnxptklbkigjqyda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322063.3975646-1988-31532936033431/AnsiballZ_file.py
Nov 28 09:27:43 np0005538515.localdomain sudo[196404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:43 np0005538515.localdomain python3.9[196406]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:43 np0005538515.localdomain sudo[196404]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16744 DF PROTO=TCP SPT=46192 DPT=9101 SEQ=1228987203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDAA710000000001030307) 
Nov 28 09:27:44 np0005538515.localdomain sudo[196514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anwrfsfvrzxwcsxijbbsdqzasgchzfgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322064.1606586-2011-63864565640932/AnsiballZ_file.py
Nov 28 09:27:44 np0005538515.localdomain sudo[196514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:44 np0005538515.localdomain python3.9[196516]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:44 np0005538515.localdomain sudo[196514]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:45 np0005538515.localdomain sudo[196624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueyrgepiqfdmqwojeslqupwcvbmduwje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322064.7452903-2011-89206913928398/AnsiballZ_file.py
Nov 28 09:27:45 np0005538515.localdomain sudo[196624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:45 np0005538515.localdomain python3.9[196626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:45 np0005538515.localdomain sudo[196624]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:45 np0005538515.localdomain sudo[196734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btbypcpvezpblvdtitungmqxkvyxjxfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322065.4037337-2011-57508582751532/AnsiballZ_file.py
Nov 28 09:27:45 np0005538515.localdomain sudo[196734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:45 np0005538515.localdomain python3.9[196736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:45 np0005538515.localdomain sudo[196734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:46 np0005538515.localdomain sudo[196844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftjaxhbcgicuksehfwasronqzwloaukn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322066.0671566-2011-233932995783323/AnsiballZ_file.py
Nov 28 09:27:46 np0005538515.localdomain sudo[196844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:46 np0005538515.localdomain python3.9[196846]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:46 np0005538515.localdomain sudo[196844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46539 DF PROTO=TCP SPT=43862 DPT=9100 SEQ=382430769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDB3BB0000000001030307) 
Nov 28 09:27:46 np0005538515.localdomain sudo[196954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzxjxmqbagjkixlnwyvlgcscljhrtjcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322066.6961777-2011-134128998839449/AnsiballZ_file.py
Nov 28 09:27:46 np0005538515.localdomain sudo[196954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:47 np0005538515.localdomain python3.9[196956]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:47 np0005538515.localdomain sudo[196954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:47 np0005538515.localdomain sudo[197064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxfxxztygtxchjjnpfwlevjepoizcken ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322067.411624-2011-223312093089369/AnsiballZ_file.py
Nov 28 09:27:47 np0005538515.localdomain sudo[197064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:47 np0005538515.localdomain python3.9[197066]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:47 np0005538515.localdomain sudo[197064]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:48 np0005538515.localdomain sudo[197174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktpshpjyhebdcvxvvejllwvsuyubddde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322068.041616-2011-103462197881896/AnsiballZ_file.py
Nov 28 09:27:48 np0005538515.localdomain sudo[197174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:48 np0005538515.localdomain python3.9[197176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:48 np0005538515.localdomain sudo[197174]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:48 np0005538515.localdomain sudo[197284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caqrhggbokdcmwtvmnsdwpajccfjelqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322068.7100592-2011-9521182947671/AnsiballZ_file.py
Nov 28 09:27:48 np0005538515.localdomain sudo[197284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:49 np0005538515.localdomain python3.9[197286]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:49 np0005538515.localdomain sudo[197284]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:49 np0005538515.localdomain sudo[197394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkkpanqfxkyezbibsrodkfkpwfbnqymm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322069.3416636-2011-205619861141953/AnsiballZ_file.py
Nov 28 09:27:49 np0005538515.localdomain sudo[197394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:49 np0005538515.localdomain python3.9[197396]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:49 np0005538515.localdomain sudo[197394]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:50 np0005538515.localdomain sudo[197504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eenehcavtefvsvrnewlqwktexngehjms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322069.9755352-2011-276455814480980/AnsiballZ_file.py
Nov 28 09:27:50 np0005538515.localdomain sudo[197504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:50 np0005538515.localdomain python3.9[197506]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:50 np0005538515.localdomain sudo[197504]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46540 DF PROTO=TCP SPT=43862 DPT=9100 SEQ=382430769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDC37A0000000001030307) 
Nov 28 09:27:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:27:50.807 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:27:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:27:50.808 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:27:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:27:50.808 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:27:50 np0005538515.localdomain sudo[197614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tudjbmzilnwpmkxyvvfpkkuqqdlppclo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322070.6358078-2011-115524534356792/AnsiballZ_file.py
Nov 28 09:27:50 np0005538515.localdomain sudo[197614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:51 np0005538515.localdomain python3.9[197616]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:51 np0005538515.localdomain sudo[197614]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:51 np0005538515.localdomain sudo[197724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtzrkehikzlkrosjasrxnlxyaugjwawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322071.316436-2011-141050963675134/AnsiballZ_file.py
Nov 28 09:27:51 np0005538515.localdomain sudo[197724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:51 np0005538515.localdomain python3.9[197726]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:51 np0005538515.localdomain sudo[197724]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:52 np0005538515.localdomain sudo[197834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzexowjseajhguhwcgsipmliktlxhkzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322071.9806204-2011-113608398309688/AnsiballZ_file.py
Nov 28 09:27:52 np0005538515.localdomain sudo[197834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:27:52 np0005538515.localdomain systemd[1]: tmp-crun.3P4TCw.mount: Deactivated successfully.
Nov 28 09:27:52 np0005538515.localdomain podman[197837]: 2025-11-28 09:27:52.388469815 +0000 UTC m=+0.097247988 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 09:27:52 np0005538515.localdomain podman[197837]: 2025-11-28 09:27:52.421908799 +0000 UTC m=+0.130686972 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Nov 28 09:27:52 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:27:52 np0005538515.localdomain python3.9[197836]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:52 np0005538515.localdomain sudo[197834]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:52 np0005538515.localdomain sudo[197969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njmnzyulxtzyltrrllyuhkiniquemsle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322072.6360574-2011-183745901676569/AnsiballZ_file.py
Nov 28 09:27:52 np0005538515.localdomain sudo[197969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:53 np0005538515.localdomain python3.9[197971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:53 np0005538515.localdomain sudo[197969]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:53 np0005538515.localdomain sudo[198079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdqcqazhqsusruyqxsvwpjpwicbfvymg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322073.3527386-2308-269256967262127/AnsiballZ_stat.py
Nov 28 09:27:53 np0005538515.localdomain sudo[198079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:27:53 np0005538515.localdomain podman[198082]: 2025-11-28 09:27:53.792251853 +0000 UTC m=+0.082212354 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:27:53 np0005538515.localdomain podman[198082]: 2025-11-28 09:27:53.822354834 +0000 UTC m=+0.112315285 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:27:53 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:27:53 np0005538515.localdomain python3.9[198081]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:53 np0005538515.localdomain sudo[198079]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:54 np0005538515.localdomain sudo[198185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyefivjvvxwepexprdogdjzazzczqguw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322073.3527386-2308-269256967262127/AnsiballZ_copy.py
Nov 28 09:27:54 np0005538515.localdomain sudo[198185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:55 np0005538515.localdomain python3.9[198187]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322073.3527386-2308-269256967262127/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:55 np0005538515.localdomain sudo[198185]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:55 np0005538515.localdomain sudo[198295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oatddqzbrnguanncorqaewyreqlrvbaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322075.3172312-2308-29852060132597/AnsiballZ_stat.py
Nov 28 09:27:55 np0005538515.localdomain sudo[198295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:55 np0005538515.localdomain python3.9[198297]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:55 np0005538515.localdomain sudo[198295]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:57 np0005538515.localdomain sudo[198383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfazdriabrnevgfkvvdnfjvoxpmgwkoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322075.3172312-2308-29852060132597/AnsiballZ_copy.py
Nov 28 09:27:57 np0005538515.localdomain sudo[198383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:57 np0005538515.localdomain python3.9[198385]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322075.3172312-2308-29852060132597/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:57 np0005538515.localdomain sudo[198383]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14905 DF PROTO=TCP SPT=33392 DPT=9105 SEQ=4242062705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDDEA20000000001030307) 
Nov 28 09:27:57 np0005538515.localdomain sudo[198493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hysonmstsbgkhffcwlngfappzjxnmjhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322077.4218621-2308-228135076465444/AnsiballZ_stat.py
Nov 28 09:27:57 np0005538515.localdomain sudo[198493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:57 np0005538515.localdomain python3.9[198495]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:57 np0005538515.localdomain sudo[198493]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:58 np0005538515.localdomain sudo[198581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdumsifrkeubwgsvklzxcuhcdfvohvux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322077.4218621-2308-228135076465444/AnsiballZ_copy.py
Nov 28 09:27:58 np0005538515.localdomain sudo[198581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:58 np0005538515.localdomain python3.9[198583]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322077.4218621-2308-228135076465444/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:58 np0005538515.localdomain sudo[198581]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14906 DF PROTO=TCP SPT=33392 DPT=9105 SEQ=4242062705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDE2BA0000000001030307) 
Nov 28 09:27:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46541 DF PROTO=TCP SPT=43862 DPT=9100 SEQ=382430769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDE2FA0000000001030307) 
Nov 28 09:27:58 np0005538515.localdomain sudo[198691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efrtzgpvwcqhpfythihpahqnkvvnrgpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322078.5871315-2308-241193875116595/AnsiballZ_stat.py
Nov 28 09:27:58 np0005538515.localdomain sudo[198691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:59 np0005538515.localdomain python3.9[198693]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:27:59 np0005538515.localdomain sudo[198691]: pam_unix(sudo:session): session closed for user root
Nov 28 09:27:59 np0005538515.localdomain sudo[198779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elqpdexdxehkjgfcerdjkxpskmcqwltc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322078.5871315-2308-241193875116595/AnsiballZ_copy.py
Nov 28 09:27:59 np0005538515.localdomain sudo[198779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:27:59 np0005538515.localdomain python3.9[198781]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322078.5871315-2308-241193875116595/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:27:59 np0005538515.localdomain sudo[198779]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:00 np0005538515.localdomain sudo[198889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbvdbtnximnxjommvdeskzwserhkpgav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322079.7439036-2308-180250792400484/AnsiballZ_stat.py
Nov 28 09:28:00 np0005538515.localdomain sudo[198889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:00 np0005538515.localdomain python3.9[198891]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:00 np0005538515.localdomain sudo[198889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:00 np0005538515.localdomain sudo[198977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txbioazjkushzcqcwajxoszbxotajzbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322079.7439036-2308-180250792400484/AnsiballZ_copy.py
Nov 28 09:28:00 np0005538515.localdomain sudo[198977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:00 np0005538515.localdomain python3.9[198979]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322079.7439036-2308-180250792400484/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:00 np0005538515.localdomain sudo[198977]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:01 np0005538515.localdomain sudo[199087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxbdqcywhsczwdgzuncglbnbzlrrcelo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322080.9841394-2308-35524085384709/AnsiballZ_stat.py
Nov 28 09:28:01 np0005538515.localdomain sudo[199087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:01 np0005538515.localdomain python3.9[199089]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:01 np0005538515.localdomain sudo[199087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:01 np0005538515.localdomain sudo[199175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxgptmonqkesvwlneisztdgeqhvhccjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322080.9841394-2308-35524085384709/AnsiballZ_copy.py
Nov 28 09:28:01 np0005538515.localdomain sudo[199175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47826 DF PROTO=TCP SPT=33570 DPT=9105 SEQ=1927594134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDEEFA0000000001030307) 
Nov 28 09:28:01 np0005538515.localdomain python3.9[199177]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322080.9841394-2308-35524085384709/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:01 np0005538515.localdomain sudo[199175]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:02 np0005538515.localdomain sudo[199285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-schovrvhowkokbkymaozjtuegpuullhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322082.1087193-2308-2644194947495/AnsiballZ_stat.py
Nov 28 09:28:02 np0005538515.localdomain sudo[199285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:02 np0005538515.localdomain python3.9[199287]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:02 np0005538515.localdomain sudo[199285]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:02 np0005538515.localdomain sudo[199373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofdyvysvlauhybmhszwhsfcujxfigmfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322082.1087193-2308-2644194947495/AnsiballZ_copy.py
Nov 28 09:28:02 np0005538515.localdomain sudo[199373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:03 np0005538515.localdomain python3.9[199375]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322082.1087193-2308-2644194947495/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:03 np0005538515.localdomain sudo[199373]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:03 np0005538515.localdomain sudo[199483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vylsbrjeabbzwajbrqfleaptiuhsddtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322083.4275162-2308-11266728512259/AnsiballZ_stat.py
Nov 28 09:28:03 np0005538515.localdomain sudo[199483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:03 np0005538515.localdomain python3.9[199485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:03 np0005538515.localdomain sudo[199483]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:04 np0005538515.localdomain sudo[199571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eppahoabkjbmowgiyyjlliookwbfcjfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322083.4275162-2308-11266728512259/AnsiballZ_copy.py
Nov 28 09:28:04 np0005538515.localdomain sudo[199571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:04 np0005538515.localdomain python3.9[199573]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322083.4275162-2308-11266728512259/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:04 np0005538515.localdomain sudo[199571]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14908 DF PROTO=TCP SPT=33392 DPT=9105 SEQ=4242062705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACDFA7B0000000001030307) 
Nov 28 09:28:04 np0005538515.localdomain sudo[199681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuhrhcxhvegwydsfuzhsbwmlbivdgrri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322084.6467984-2308-115488532257364/AnsiballZ_stat.py
Nov 28 09:28:04 np0005538515.localdomain sudo[199681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:05 np0005538515.localdomain python3.9[199683]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:05 np0005538515.localdomain sudo[199681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:05 np0005538515.localdomain sudo[199769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmfgszdztfytcwdbauuyzvwdamdwkhkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322084.6467984-2308-115488532257364/AnsiballZ_copy.py
Nov 28 09:28:05 np0005538515.localdomain sudo[199769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:05 np0005538515.localdomain python3.9[199771]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322084.6467984-2308-115488532257364/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:05 np0005538515.localdomain sudo[199769]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:06 np0005538515.localdomain sudo[199879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sptktoivtjupxvpzysqvlulilodrgoyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322085.8230546-2308-108243518876229/AnsiballZ_stat.py
Nov 28 09:28:06 np0005538515.localdomain sudo[199879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:06 np0005538515.localdomain python3.9[199881]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:06 np0005538515.localdomain sudo[199879]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:06 np0005538515.localdomain sudo[199967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzufjzcvcwwarsaieyqxntabipjyjivh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322085.8230546-2308-108243518876229/AnsiballZ_copy.py
Nov 28 09:28:06 np0005538515.localdomain sudo[199967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:06 np0005538515.localdomain python3.9[199969]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322085.8230546-2308-108243518876229/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:06 np0005538515.localdomain sudo[199967]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:07 np0005538515.localdomain sudo[200077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wihmnfaqlkkfhgnvqrqdkzlkoehzadnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322087.0163543-2308-188583093520749/AnsiballZ_stat.py
Nov 28 09:28:07 np0005538515.localdomain sudo[200077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:07 np0005538515.localdomain python3.9[200079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:07 np0005538515.localdomain sudo[200077]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:07 np0005538515.localdomain sudo[200165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqzuhhjuaivlcxqfqqxilluarmleqxdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322087.0163543-2308-188583093520749/AnsiballZ_copy.py
Nov 28 09:28:07 np0005538515.localdomain sudo[200165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:08 np0005538515.localdomain python3.9[200167]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322087.0163543-2308-188583093520749/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:08 np0005538515.localdomain sudo[200165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34620 DF PROTO=TCP SPT=58496 DPT=9102 SEQ=3312259674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE0BFB0000000001030307) 
Nov 28 09:28:09 np0005538515.localdomain sudo[200275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llxovmnjqazgyemlfnykfcytsvrgacbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322089.3391469-2308-221361733904935/AnsiballZ_stat.py
Nov 28 09:28:09 np0005538515.localdomain sudo[200275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:09 np0005538515.localdomain python3.9[200277]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:09 np0005538515.localdomain sudo[200275]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:10 np0005538515.localdomain sudo[200363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gorrgxappoeifldcjknaynogulojzxwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322089.3391469-2308-221361733904935/AnsiballZ_copy.py
Nov 28 09:28:10 np0005538515.localdomain sudo[200363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:10 np0005538515.localdomain python3.9[200365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322089.3391469-2308-221361733904935/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:10 np0005538515.localdomain sudo[200363]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:10 np0005538515.localdomain sudo[200473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhybuvrycfnukosbupiufyojqxqpguac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322090.441418-2308-53745199370202/AnsiballZ_stat.py
Nov 28 09:28:10 np0005538515.localdomain sudo[200473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:10 np0005538515.localdomain python3.9[200475]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:10 np0005538515.localdomain sudo[200473]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:11 np0005538515.localdomain sudo[200561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htqfffktwhmyjkbrljsimujofiuubitk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322090.441418-2308-53745199370202/AnsiballZ_copy.py
Nov 28 09:28:11 np0005538515.localdomain sudo[200561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:11 np0005538515.localdomain python3.9[200563]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322090.441418-2308-53745199370202/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:11 np0005538515.localdomain sudo[200561]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:12 np0005538515.localdomain sudo[200671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edmwbtspnbnttfjvlvoyrxlqgavlkqvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322091.9935346-2308-3041262932487/AnsiballZ_stat.py
Nov 28 09:28:12 np0005538515.localdomain sudo[200671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:12 np0005538515.localdomain python3.9[200673]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:12 np0005538515.localdomain sudo[200671]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:12 np0005538515.localdomain sudo[200759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucnaudoqascdkasqqqsfodhlwdcvwbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322091.9935346-2308-3041262932487/AnsiballZ_copy.py
Nov 28 09:28:12 np0005538515.localdomain sudo[200759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14909 DF PROTO=TCP SPT=33392 DPT=9105 SEQ=4242062705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE1AFB0000000001030307) 
Nov 28 09:28:13 np0005538515.localdomain python3.9[200761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322091.9935346-2308-3041262932487/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:13 np0005538515.localdomain sudo[200759]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32968 DF PROTO=TCP SPT=56624 DPT=9882 SEQ=2895839007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE1F220000000001030307) 
Nov 28 09:28:14 np0005538515.localdomain python3.9[200869]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:15 np0005538515.localdomain sudo[200980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plywaezewrlrnbikdzwkojudoaiocmoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322094.7927463-2927-161486501956123/AnsiballZ_seboolean.py
Nov 28 09:28:15 np0005538515.localdomain sudo[200980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:15 np0005538515.localdomain python3.9[200982]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 09:28:15 np0005538515.localdomain sudo[200980]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:16 np0005538515.localdomain sudo[201090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cniyfxfranissrlgqdnhjbzffgtuwcqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322096.0444453-2957-268600592896163/AnsiballZ_systemd.py
Nov 28 09:28:16 np0005538515.localdomain sudo[201090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14131 DF PROTO=TCP SPT=53422 DPT=9100 SEQ=767705073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE28FB0000000001030307) 
Nov 28 09:28:16 np0005538515.localdomain python3.9[201092]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:28:16 np0005538515.localdomain systemd-rc-local-generator[201113]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:16 np0005538515.localdomain systemd-sysv-generator[201117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: Starting libvirt logging daemon socket...
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 28 09:28:16 np0005538515.localdomain systemd[1]: Starting libvirt logging daemon...
Nov 28 09:28:17 np0005538515.localdomain systemd[1]: Started libvirt logging daemon.
Nov 28 09:28:17 np0005538515.localdomain sudo[201090]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:17 np0005538515.localdomain sudo[201241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mihzcktcjaaqhdqgovglewknshglxsvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322097.274155-2957-78704076098567/AnsiballZ_systemd.py
Nov 28 09:28:17 np0005538515.localdomain sudo[201241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:17 np0005538515.localdomain python3.9[201243]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:17 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:28:18 np0005538515.localdomain systemd-rc-local-generator[201267]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:18 np0005538515.localdomain systemd-sysv-generator[201272]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Started libvirt nodedev daemon.
Nov 28 09:28:18 np0005538515.localdomain sudo[201241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 28 09:28:18 np0005538515.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 28 09:28:18 np0005538515.localdomain sudo[201421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruyoxdvgfmblrmusjxhzkyvjivowfkor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322098.4231951-2957-78425943701547/AnsiballZ_systemd.py
Nov 28 09:28:18 np0005538515.localdomain sudo[201421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:19 np0005538515.localdomain python3.9[201426]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:28:19 np0005538515.localdomain systemd-rc-local-generator[201453]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:19 np0005538515.localdomain systemd-sysv-generator[201458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 28 09:28:19 np0005538515.localdomain systemd[1]: Started libvirt proxy daemon.
Nov 28 09:28:19 np0005538515.localdomain sudo[201421]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:19 np0005538515.localdomain setroubleshoot[201281]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 29696a87-b25b-4ed1-b530-193abd3ed6b3
Nov 28 09:28:19 np0005538515.localdomain setroubleshoot[201281]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Nov 28 09:28:19 np0005538515.localdomain sudo[201598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abyjuvybttbiddoqbusapydpugxluoze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322099.4891229-2957-264442891937335/AnsiballZ_systemd.py
Nov 28 09:28:19 np0005538515.localdomain sudo[201598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:20 np0005538515.localdomain python3.9[201600]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:28:20 np0005538515.localdomain systemd-rc-local-generator[201623]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:20 np0005538515.localdomain systemd-sysv-generator[201628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 28 09:28:20 np0005538515.localdomain systemd[1]: Started libvirt QEMU daemon.
Nov 28 09:28:20 np0005538515.localdomain sudo[201598]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14132 DF PROTO=TCP SPT=53422 DPT=9100 SEQ=767705073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE38BA0000000001030307) 
Nov 28 09:28:20 np0005538515.localdomain sudo[201772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jghlwwolystssqfdjefmierdtuxibhji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322100.6314967-2957-192823350859615/AnsiballZ_systemd.py
Nov 28 09:28:20 np0005538515.localdomain sudo[201772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:21 np0005538515.localdomain python3.9[201774]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:28:21 np0005538515.localdomain systemd-rc-local-generator[201800]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:21 np0005538515.localdomain systemd-sysv-generator[201805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Starting libvirt secret daemon socket...
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 28 09:28:21 np0005538515.localdomain systemd[1]: Started libvirt secret daemon.
Nov 28 09:28:21 np0005538515.localdomain sudo[201772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:28:22 np0005538515.localdomain systemd[1]: tmp-crun.7LATM0.mount: Deactivated successfully.
Nov 28 09:28:22 np0005538515.localdomain podman[201853]: 2025-11-28 09:28:22.983473079 +0000 UTC m=+0.090949525 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 09:28:23 np0005538515.localdomain podman[201853]: 2025-11-28 09:28:23.051471592 +0000 UTC m=+0.158948068 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:28:23 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:28:23 np0005538515.localdomain sudo[201968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coxdehdijxpgmlinwwqhzbhbxfxmimpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322102.9536078-3068-230163008069186/AnsiballZ_file.py
Nov 28 09:28:23 np0005538515.localdomain sudo[201968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:23 np0005538515.localdomain python3.9[201970]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:23 np0005538515.localdomain sudo[201968]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:28:23 np0005538515.localdomain podman[202042]: 2025-11-28 09:28:23.98674786 +0000 UTC m=+0.080306666 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:28:23 np0005538515.localdomain podman[202042]: 2025-11-28 09:28:23.996385157 +0000 UTC m=+0.089943953 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:28:24 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:28:24 np0005538515.localdomain sudo[202096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddzumwbkiqdfhwjhpnnnukjxnmlqqliy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322103.7654924-3092-256804613421033/AnsiballZ_find.py
Nov 28 09:28:24 np0005538515.localdomain sudo[202096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:24 np0005538515.localdomain python3.9[202099]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:28:24 np0005538515.localdomain sudo[202096]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:25 np0005538515.localdomain sudo[202207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvuwvyvvfrawnaufcdxqcxcoubyyushq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322105.3703957-3115-66939968018153/AnsiballZ_command.py
Nov 28 09:28:25 np0005538515.localdomain sudo[202207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:25 np0005538515.localdomain python3.9[202209]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:25 np0005538515.localdomain sudo[202207]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:26 np0005538515.localdomain python3.9[202321]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:28:27 np0005538515.localdomain python3.9[202429]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57641 DF PROTO=TCP SPT=41550 DPT=9105 SEQ=2528829746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE53D20000000001030307) 
Nov 28 09:28:28 np0005538515.localdomain python3.9[202515]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322107.061128-3172-105405855256949/.source.xml follow=False _original_basename=secret.xml.j2 checksum=817431989b0a3ade349fa0105099056ad78b021d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:28 np0005538515.localdomain sudo[202623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltilcqwqytlmbzyfqsvlfdhnqjnvkxis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322108.2605193-3217-124611155520764/AnsiballZ_command.py
Nov 28 09:28:28 np0005538515.localdomain sudo[202623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57642 DF PROTO=TCP SPT=41550 DPT=9105 SEQ=2528829746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE57FA0000000001030307) 
Nov 28 09:28:28 np0005538515.localdomain python3.9[202625]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 2c5417c9-00eb-57d5-a565-ddecbc7995c1
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:28 np0005538515.localdomain polkitd[1033]: Registered Authentication Agent for unix-process:202627:993290 (system bus name :1.2837 [pkttyagent --process 202627 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 28 09:28:28 np0005538515.localdomain polkitd[1033]: Unregistered Authentication Agent for unix-process:202627:993290 (system bus name :1.2837, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 28 09:28:28 np0005538515.localdomain polkitd[1033]: Registered Authentication Agent for unix-process:202626:993290 (system bus name :1.2838 [pkttyagent --process 202626 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 28 09:28:28 np0005538515.localdomain polkitd[1033]: Unregistered Authentication Agent for unix-process:202626:993290 (system bus name :1.2838, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 28 09:28:28 np0005538515.localdomain sudo[202623]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14133 DF PROTO=TCP SPT=53422 DPT=9100 SEQ=767705073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE58FA0000000001030307) 
Nov 28 09:28:29 np0005538515.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 28 09:28:29 np0005538515.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 09:28:30 np0005538515.localdomain python3.9[202745]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:31 np0005538515.localdomain sudo[202853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-radscfkggvgdkyuuwfgsfefswioxuzfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322110.8361154-3266-50195918939682/AnsiballZ_command.py
Nov 28 09:28:31 np0005538515.localdomain sudo[202853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:31 np0005538515.localdomain sudo[202853]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:31 np0005538515.localdomain sudo[202964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qknfqtjsrufuktuhfaxmgoacwdsxeotz ; FSID=2c5417c9-00eb-57d5-a565-ddecbc7995c1 KEY=AQD7UylpAAAAABAAFA51EB/tlSHSRoK3+SF42Q== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322111.5243545-3289-12267203501370/AnsiballZ_command.py
Nov 28 09:28:31 np0005538515.localdomain sudo[202964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44635 DF PROTO=TCP SPT=36840 DPT=9882 SEQ=2416056394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE64FA0000000001030307) 
Nov 28 09:28:32 np0005538515.localdomain polkitd[1033]: Registered Authentication Agent for unix-process:202967:993621 (system bus name :1.2841 [pkttyagent --process 202967 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Nov 28 09:28:32 np0005538515.localdomain polkitd[1033]: Unregistered Authentication Agent for unix-process:202967:993621 (system bus name :1.2841, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Nov 28 09:28:32 np0005538515.localdomain sudo[202964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:32 np0005538515.localdomain sudo[203080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrxkxvnozxwteyagwdbuuvxmwrsqqmym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322112.3190036-3314-148130502642519/AnsiballZ_copy.py
Nov 28 09:28:32 np0005538515.localdomain sudo[203080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:32 np0005538515.localdomain python3.9[203082]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:32 np0005538515.localdomain sudo[203080]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:33 np0005538515.localdomain sudo[203190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vawscftvdmjbbopfvqfjgaydincfovnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322113.0430126-3338-216008786423962/AnsiballZ_stat.py
Nov 28 09:28:33 np0005538515.localdomain sudo[203190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:33 np0005538515.localdomain python3.9[203192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:33 np0005538515.localdomain sudo[203190]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:33 np0005538515.localdomain sudo[203278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsqbqfitlspnvoocdqosqgkrbfovflau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322113.0430126-3338-216008786423962/AnsiballZ_copy.py
Nov 28 09:28:33 np0005538515.localdomain sudo[203278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:28:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55ab8a42b350#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 09:28:34 np0005538515.localdomain python3.9[203280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322113.0430126-3338-216008786423962/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:34 np0005538515.localdomain sudo[203278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57644 DF PROTO=TCP SPT=41550 DPT=9105 SEQ=2528829746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE6FBA0000000001030307) 
Nov 28 09:28:34 np0005538515.localdomain sudo[203388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kozidgulgezefwewmvawoyzfwihdqjrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322114.511787-3386-70357994450496/AnsiballZ_file.py
Nov 28 09:28:34 np0005538515.localdomain sudo[203388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:35 np0005538515.localdomain python3.9[203390]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:35 np0005538515.localdomain sudo[203388]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:36 np0005538515.localdomain sudo[203498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjkzxaoigvmdcgsrycfxfrdithdxplnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322115.2083225-3409-209249153543020/AnsiballZ_stat.py
Nov 28 09:28:36 np0005538515.localdomain sudo[203498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:36 np0005538515.localdomain python3.9[203500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:36 np0005538515.localdomain sudo[203498]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:36 np0005538515.localdomain sudo[203555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyqunjzgfhjrpweqvphkueoqghgnutla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322115.2083225-3409-209249153543020/AnsiballZ_file.py
Nov 28 09:28:36 np0005538515.localdomain sudo[203555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:36 np0005538515.localdomain python3.9[203557]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:36 np0005538515.localdomain sudo[203555]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:37 np0005538515.localdomain sudo[203665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-errjencwjepknvmxflvwjdpdbmhfpozf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322117.018653-3446-123367945884471/AnsiballZ_stat.py
Nov 28 09:28:37 np0005538515.localdomain sudo[203665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:37 np0005538515.localdomain python3.9[203667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:37 np0005538515.localdomain sudo[203665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:38 np0005538515.localdomain sudo[203722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nixdopuenwtzgsjqavctlastgcezawnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322117.018653-3446-123367945884471/AnsiballZ_file.py
Nov 28 09:28:38 np0005538515.localdomain sudo[203722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:38 np0005538515.localdomain python3.9[203724]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.n6s81c7c recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:38 np0005538515.localdomain sudo[203722]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:28:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.025       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.03              0.00         1    0.028       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.009       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0a2d0#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.04              0.00         1    0.040       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x562bf8a0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 7.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Nov 28 09:28:38 np0005538515.localdomain sudo[203832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycrpgvcwnhmtbrdwiehdauhyspkurtpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322118.440706-3482-110273385257249/AnsiballZ_stat.py
Nov 28 09:28:38 np0005538515.localdomain sudo[203832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:38 np0005538515.localdomain python3.9[203834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:39 np0005538515.localdomain sudo[203832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:39 np0005538515.localdomain sudo[203835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:28:39 np0005538515.localdomain sudo[203835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:28:39 np0005538515.localdomain sudo[203835]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:39 np0005538515.localdomain sudo[203858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:28:39 np0005538515.localdomain sudo[203858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:28:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25187 DF PROTO=TCP SPT=48762 DPT=9102 SEQ=3832156165 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE80FA0000000001030307) 
Nov 28 09:28:39 np0005538515.localdomain sudo[203925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnfdaddulkkofcznycpeownxyvsrqbum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322118.440706-3482-110273385257249/AnsiballZ_file.py
Nov 28 09:28:39 np0005538515.localdomain sudo[203925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:39 np0005538515.localdomain python3.9[203927]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:39 np0005538515.localdomain sudo[203925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:39 np0005538515.localdomain sudo[203858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:40 np0005538515.localdomain sudo[204066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfkvdumvnaioemfevmnywdiyrljwurnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322119.7653668-3521-156912955375428/AnsiballZ_command.py
Nov 28 09:28:40 np0005538515.localdomain sudo[204066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:40 np0005538515.localdomain python3.9[204068]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:40 np0005538515.localdomain sudo[204066]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:40 np0005538515.localdomain sudo[204087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:28:40 np0005538515.localdomain sudo[204087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:28:40 np0005538515.localdomain sudo[204087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:40 np0005538515.localdomain sudo[204195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcvvekploebcmzbgwtgmximirlfckduk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322120.492355-3544-265407128170075/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:28:40 np0005538515.localdomain sudo[204195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:41 np0005538515.localdomain python3[204197]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:28:41 np0005538515.localdomain sudo[204195]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:41 np0005538515.localdomain sudo[204305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kubkgsqjpwlkunmernzsjffhsjcvhjko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322121.3929532-3569-254315702786698/AnsiballZ_stat.py
Nov 28 09:28:41 np0005538515.localdomain sudo[204305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:41 np0005538515.localdomain python3.9[204307]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:41 np0005538515.localdomain sudo[204305]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:42 np0005538515.localdomain sudo[204362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teehxxbnpxzbztmpafqpnqtggpklsypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322121.3929532-3569-254315702786698/AnsiballZ_file.py
Nov 28 09:28:42 np0005538515.localdomain sudo[204362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:42 np0005538515.localdomain python3.9[204364]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:42 np0005538515.localdomain sudo[204362]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:42 np0005538515.localdomain sudo[204472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmyjiwcvbqpdpkxdygcblctljnvoduml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322122.6144187-3605-201293040997764/AnsiballZ_stat.py
Nov 28 09:28:42 np0005538515.localdomain sudo[204472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:43 np0005538515.localdomain python3.9[204474]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:43 np0005538515.localdomain sudo[204472]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57645 DF PROTO=TCP SPT=41550 DPT=9105 SEQ=2528829746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE90FB0000000001030307) 
Nov 28 09:28:43 np0005538515.localdomain sudo[204529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gipsnkeqmanuknnantxfdtjwilzdhyyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322122.6144187-3605-201293040997764/AnsiballZ_file.py
Nov 28 09:28:43 np0005538515.localdomain sudo[204529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:43 np0005538515.localdomain python3.9[204531]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:43 np0005538515.localdomain sudo[204529]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:44 np0005538515.localdomain sudo[204639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zypyzirhpaenthgeumdqxabhubdkihtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322123.8428175-3640-223487943172695/AnsiballZ_stat.py
Nov 28 09:28:44 np0005538515.localdomain sudo[204639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37771 DF PROTO=TCP SPT=43138 DPT=9101 SEQ=1486897896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE94D00000000001030307) 
Nov 28 09:28:44 np0005538515.localdomain python3.9[204641]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:44 np0005538515.localdomain sudo[204639]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:44 np0005538515.localdomain sudo[204696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrvzjfdifvyswqowkdarneyycwjtoftd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322123.8428175-3640-223487943172695/AnsiballZ_file.py
Nov 28 09:28:44 np0005538515.localdomain sudo[204696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:44 np0005538515.localdomain python3.9[204698]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:44 np0005538515.localdomain sudo[204696]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:45 np0005538515.localdomain sudo[204806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dckbsxdvshjsulgugryiyywmjmvpwroc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322125.0921655-3677-100039292275139/AnsiballZ_stat.py
Nov 28 09:28:45 np0005538515.localdomain sudo[204806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:45 np0005538515.localdomain python3.9[204808]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:45 np0005538515.localdomain sudo[204806]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:45 np0005538515.localdomain sudo[204863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaigwqvfuarkumzmjpkurhdnovvkujyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322125.0921655-3677-100039292275139/AnsiballZ_file.py
Nov 28 09:28:45 np0005538515.localdomain sudo[204863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:46 np0005538515.localdomain python3.9[204865]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:46 np0005538515.localdomain sudo[204863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51985 DF PROTO=TCP SPT=49654 DPT=9100 SEQ=3981679216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACE9DFA0000000001030307) 
Nov 28 09:28:46 np0005538515.localdomain sudo[204973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgtfjjxurldmkkragkricmeuuhvzvtpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322126.419052-3713-219472289816993/AnsiballZ_stat.py
Nov 28 09:28:46 np0005538515.localdomain sudo[204973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:46 np0005538515.localdomain python3.9[204975]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:47 np0005538515.localdomain sudo[204973]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:47 np0005538515.localdomain sudo[205063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bprutccjdvdvyidxrrpuuatxkkhjctli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322126.419052-3713-219472289816993/AnsiballZ_copy.py
Nov 28 09:28:47 np0005538515.localdomain sudo[205063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:47 np0005538515.localdomain python3.9[205065]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322126.419052-3713-219472289816993/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:47 np0005538515.localdomain sudo[205063]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:48 np0005538515.localdomain sudo[205173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qntrffenopjitlovwmoieazjljzydtti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322127.8250115-3758-131789665223261/AnsiballZ_file.py
Nov 28 09:28:48 np0005538515.localdomain sudo[205173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:48 np0005538515.localdomain python3.9[205175]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:48 np0005538515.localdomain sudo[205173]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:49 np0005538515.localdomain sudo[205283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wicoafbkztxsfhjbokxvrnnepcvhuyza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322129.0069926-3782-64232732671871/AnsiballZ_command.py
Nov 28 09:28:49 np0005538515.localdomain sudo[205283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:49 np0005538515.localdomain python3.9[205285]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:49 np0005538515.localdomain sudo[205283]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:50 np0005538515.localdomain sudo[205396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuokntsgawualjnqskoqkmntigwwrvlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322129.738271-3806-26770330544093/AnsiballZ_blockinfile.py
Nov 28 09:28:50 np0005538515.localdomain sudo[205396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:50 np0005538515.localdomain python3.9[205398]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:50 np0005538515.localdomain sudo[205396]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51986 DF PROTO=TCP SPT=49654 DPT=9100 SEQ=3981679216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACEADBA0000000001030307) 
Nov 28 09:28:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:28:50.808 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:28:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:28:50.809 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:28:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:28:50.809 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:28:51 np0005538515.localdomain sudo[205506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykfybgydzkkjsrdkgdmvtavcholiarnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322131.3259299-3832-145501427937028/AnsiballZ_command.py
Nov 28 09:28:51 np0005538515.localdomain sudo[205506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:51 np0005538515.localdomain python3.9[205508]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:51 np0005538515.localdomain sudo[205506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:52 np0005538515.localdomain sudo[205618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrnberwjrtkhehnzkncnclmirdpuecmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322131.9950557-3856-16841895472200/AnsiballZ_stat.py
Nov 28 09:28:52 np0005538515.localdomain sudo[205618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:52 np0005538515.localdomain python3.9[205620]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:28:52 np0005538515.localdomain sudo[205618]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:52 np0005538515.localdomain sudo[205730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eejalcidxyitnjwgzojxersmgwljixrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322132.7083712-3880-64202170878279/AnsiballZ_command.py
Nov 28 09:28:52 np0005538515.localdomain sudo[205730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:53 np0005538515.localdomain python3.9[205732]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:28:53 np0005538515.localdomain sudo[205730]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:53 np0005538515.localdomain sudo[205843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axxrkafcqvrsrtvmwynibhanynxnngxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322133.403885-3905-83498215272319/AnsiballZ_file.py
Nov 28 09:28:53 np0005538515.localdomain sudo[205843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:28:53 np0005538515.localdomain systemd[1]: tmp-crun.ydcLLB.mount: Deactivated successfully.
Nov 28 09:28:53 np0005538515.localdomain podman[205846]: 2025-11-28 09:28:53.773810236 +0000 UTC m=+0.090439980 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 28 09:28:53 np0005538515.localdomain podman[205846]: 2025-11-28 09:28:53.84675863 +0000 UTC m=+0.163388364 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:28:53 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:28:53 np0005538515.localdomain python3.9[205845]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:53 np0005538515.localdomain sudo[205843]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:54 np0005538515.localdomain sudo[205977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqhfiycdedrqgpqoyybyshxzonwsenra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322134.1557379-3929-40441686076193/AnsiballZ_stat.py
Nov 28 09:28:54 np0005538515.localdomain sudo[205977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:28:54 np0005538515.localdomain systemd[1]: tmp-crun.US96oS.mount: Deactivated successfully.
Nov 28 09:28:54 np0005538515.localdomain podman[205980]: 2025-11-28 09:28:54.607911721 +0000 UTC m=+0.133236552 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:28:54 np0005538515.localdomain podman[205980]: 2025-11-28 09:28:54.641443907 +0000 UTC m=+0.166768738 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:28:54 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:28:54 np0005538515.localdomain python3.9[205979]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:54 np0005538515.localdomain sudo[205977]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:55 np0005538515.localdomain sudo[206081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rejllisffrzdnkhxcggyccjffjyynjmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322134.1557379-3929-40441686076193/AnsiballZ_copy.py
Nov 28 09:28:55 np0005538515.localdomain sudo[206081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:55 np0005538515.localdomain python3.9[206083]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322134.1557379-3929-40441686076193/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:55 np0005538515.localdomain sudo[206081]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:56 np0005538515.localdomain sudo[206191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjahnphhafwmiafsckrdrpjkpmquhkfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322135.9008873-3975-200302596605283/AnsiballZ_stat.py
Nov 28 09:28:56 np0005538515.localdomain sudo[206191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:56 np0005538515.localdomain python3.9[206193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:56 np0005538515.localdomain sudo[206191]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:56 np0005538515.localdomain sudo[206279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idwcsufojkfduthkjencuhvkrhrikduz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322135.9008873-3975-200302596605283/AnsiballZ_copy.py
Nov 28 09:28:56 np0005538515.localdomain sudo[206279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:56 np0005538515.localdomain python3.9[206281]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322135.9008873-3975-200302596605283/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:56 np0005538515.localdomain sudo[206279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:57 np0005538515.localdomain sudo[206389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxskkltcnrhdadgdyuewofdxltivhsmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322137.1153536-4020-136733623472910/AnsiballZ_stat.py
Nov 28 09:28:57 np0005538515.localdomain sudo[206389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41330 DF PROTO=TCP SPT=44712 DPT=9105 SEQ=381928380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACEC9020000000001030307) 
Nov 28 09:28:57 np0005538515.localdomain python3.9[206391]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:28:57 np0005538515.localdomain sudo[206389]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:57 np0005538515.localdomain sudo[206477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzmvcrdstznmnpbzsjrsnreuayuyktec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322137.1153536-4020-136733623472910/AnsiballZ_copy.py
Nov 28 09:28:57 np0005538515.localdomain sudo[206477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:58 np0005538515.localdomain python3.9[206479]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322137.1153536-4020-136733623472910/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:28:58 np0005538515.localdomain sudo[206477]: pam_unix(sudo:session): session closed for user root
Nov 28 09:28:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41331 DF PROTO=TCP SPT=44712 DPT=9105 SEQ=381928380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACECCFA0000000001030307) 
Nov 28 09:28:58 np0005538515.localdomain sudo[206587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olkqzogyukgiwbegagthgndjyaalquqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322138.3403194-4064-44498641655617/AnsiballZ_systemd.py
Nov 28 09:28:58 np0005538515.localdomain sudo[206587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:28:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41579 DF PROTO=TCP SPT=45910 DPT=9882 SEQ=1699515411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACECE2C0000000001030307) 
Nov 28 09:28:58 np0005538515.localdomain python3.9[206589]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:28:58 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:28:59 np0005538515.localdomain systemd-sysv-generator[206619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:28:59 np0005538515.localdomain systemd-rc-local-generator[206614]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:28:59 np0005538515.localdomain systemd[1]: Reached target edpm_libvirt.target.
Nov 28 09:28:59 np0005538515.localdomain sudo[206587]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:00 np0005538515.localdomain sudo[206737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzkfhotisnakwdzxfqqujmktzzadwohm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322140.1765997-4088-252551242283670/AnsiballZ_systemd.py
Nov 28 09:29:00 np0005538515.localdomain sudo[206737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:00 np0005538515.localdomain python3.9[206739]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:29:00 np0005538515.localdomain systemd-rc-local-generator[206762]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:29:00 np0005538515.localdomain systemd-sysv-generator[206765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:00 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:29:01 np0005538515.localdomain systemd-rc-local-generator[206799]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:29:01 np0005538515.localdomain systemd-sysv-generator[206802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:01 np0005538515.localdomain sudo[206737]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14911 DF PROTO=TCP SPT=33392 DPT=9105 SEQ=4242062705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACED8FB0000000001030307) 
Nov 28 09:29:02 np0005538515.localdomain sshd[158635]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:29:02 np0005538515.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Nov 28 09:29:02 np0005538515.localdomain systemd[1]: session-52.scope: Consumed 3min 38.127s CPU time.
Nov 28 09:29:02 np0005538515.localdomain systemd-logind[763]: Session 52 logged out. Waiting for processes to exit.
Nov 28 09:29:02 np0005538515.localdomain systemd-logind[763]: Removed session 52.
Nov 28 09:29:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41333 DF PROTO=TCP SPT=44712 DPT=9105 SEQ=381928380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACEE4BA0000000001030307) 
Nov 28 09:29:08 np0005538515.localdomain sshd[206830]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:29:08 np0005538515.localdomain sshd[206830]: Accepted publickey for zuul from 192.168.122.30 port 35912 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:29:08 np0005538515.localdomain systemd-logind[763]: New session 53 of user zuul.
Nov 28 09:29:08 np0005538515.localdomain systemd[1]: Started Session 53 of User zuul.
Nov 28 09:29:08 np0005538515.localdomain sshd[206830]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:29:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28587 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3051273201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACEF63A0000000001030307) 
Nov 28 09:29:09 np0005538515.localdomain python3.9[206941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:29:11 np0005538515.localdomain python3.9[207053]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:29:11 np0005538515.localdomain network[207070]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:29:11 np0005538515.localdomain network[207071]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:29:11 np0005538515.localdomain network[207072]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:29:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41334 DF PROTO=TCP SPT=44712 DPT=9105 SEQ=381928380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF04FB0000000001030307) 
Nov 28 09:29:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28446 DF PROTO=TCP SPT=38730 DPT=9101 SEQ=4030811793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF0A010000000001030307) 
Nov 28 09:29:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48449 DF PROTO=TCP SPT=34136 DPT=9100 SEQ=1636643879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF133B0000000001030307) 
Nov 28 09:29:17 np0005538515.localdomain sudo[207302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfdmludjlqsyertennolkvrhnerfhloc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322156.8068426-103-275033023581042/AnsiballZ_setup.py
Nov 28 09:29:17 np0005538515.localdomain sudo[207302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:17 np0005538515.localdomain python3.9[207304]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:29:17 np0005538515.localdomain sudo[207302]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:18 np0005538515.localdomain sudo[207365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csnurhvdkubqntabdhfbupgrguczfvrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322156.8068426-103-275033023581042/AnsiballZ_dnf.py
Nov 28 09:29:18 np0005538515.localdomain sudo[207365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:18 np0005538515.localdomain python3.9[207367]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:29:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48450 DF PROTO=TCP SPT=34136 DPT=9100 SEQ=1636643879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF22FA0000000001030307) 
Nov 28 09:29:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:29:23 np0005538515.localdomain systemd[1]: tmp-crun.j1dYou.mount: Deactivated successfully.
Nov 28 09:29:24 np0005538515.localdomain podman[207370]: 2025-11-28 09:29:23.999646882 +0000 UTC m=+0.099275481 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:29:24 np0005538515.localdomain podman[207370]: 2025-11-28 09:29:24.079817917 +0000 UTC m=+0.179446487 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_controller)
Nov 28 09:29:24 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:29:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:29:24 np0005538515.localdomain podman[207394]: 2025-11-28 09:29:24.972743574 +0000 UTC m=+0.078761273 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:29:25 np0005538515.localdomain podman[207394]: 2025-11-28 09:29:25.006457716 +0000 UTC m=+0.112475485 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:29:25 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
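podman serializes the container's `config_data` label into these event lines as a Python-literal dict. If you need to inspect it from a journal dump, `ast.literal_eval` parses an extracted value safely (sketch, using a truncated excerpt of the ovn_controller config above rather than the full line):

```python
import ast

# Truncated excerpt of the config_data value embedded in the podman
# health_status event line (a few keys only, for illustration).
config_data = ("{'depends_on': ['openvswitch.service'], 'net': 'host', "
               "'privileged': True, 'restart': 'always', 'user': 'root'}")

# literal_eval parses literals only; it never executes code like eval().
cfg = ast.literal_eval(config_data)
print(cfg["net"], cfg["restart"])
```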
Nov 28 09:29:25 np0005538515.localdomain sudo[207365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:26 np0005538515.localdomain sudo[207521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbcsipduokioqhjptzjicyljaxiigcfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322165.680639-141-184047685493659/AnsiballZ_stat.py
Nov 28 09:29:26 np0005538515.localdomain sudo[207521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:26 np0005538515.localdomain python3.9[207523]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:26 np0005538515.localdomain sudo[207521]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:27 np0005538515.localdomain sudo[207633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkcwvjmggddgzvhzgxbjkmwrytydevbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322167.0896745-163-204984154426176/AnsiballZ_copy.py
Nov 28 09:29:27 np0005538515.localdomain sudo[207633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8637 DF PROTO=TCP SPT=50356 DPT=9105 SEQ=3978927549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF3E330000000001030307) 
Nov 28 09:29:27 np0005538515.localdomain python3.9[207635]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:27 np0005538515.localdomain sudo[207633]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:28 np0005538515.localdomain sudo[207743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llxezlzhzytnbgvzvfrkxgmwreixssxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322167.9027202-188-18723126983046/AnsiballZ_command.py
Nov 28 09:29:28 np0005538515.localdomain sudo[207743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:28 np0005538515.localdomain python3.9[207745]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:28 np0005538515.localdomain sudo[207743]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8638 DF PROTO=TCP SPT=50356 DPT=9105 SEQ=3978927549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF423B0000000001030307) 
Nov 28 09:29:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48451 DF PROTO=TCP SPT=34136 DPT=9100 SEQ=1636643879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF42FA0000000001030307) 
Nov 28 09:29:28 np0005538515.localdomain sudo[207854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luewcsiftfthuqchmuqhnlwwuduxpqcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322168.7191713-212-119967098992755/AnsiballZ_command.py
Nov 28 09:29:28 np0005538515.localdomain sudo[207854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:29 np0005538515.localdomain python3.9[207856]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:29 np0005538515.localdomain sudo[207854]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:29 np0005538515.localdomain sudo[207965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-antwusloknfotqogxarwthigfofettqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322169.4424007-235-108800758466112/AnsiballZ_command.py
Nov 28 09:29:29 np0005538515.localdomain sudo[207965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:29 np0005538515.localdomain python3.9[207967]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:29 np0005538515.localdomain sudo[207965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:30 np0005538515.localdomain sudo[208076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpqjioekxztrtmxpuptyvquruxqnmnyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322170.2430267-262-55130998317412/AnsiballZ_stat.py
Nov 28 09:29:30 np0005538515.localdomain sudo[208076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:30 np0005538515.localdomain python3.9[208078]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:30 np0005538515.localdomain sudo[208076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:31 np0005538515.localdomain sudo[208188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyirybcrkzvfnlqyezhvmzlqeejzdxfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322171.13116-295-195772667641216/AnsiballZ_lineinfile.py
Nov 28 09:29:31 np0005538515.localdomain sudo[208188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:31 np0005538515.localdomain python3.9[208190]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
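The `lineinfile` invocation above either replaces an existing `node.session.auth.chap_algs` line in `/etc/iscsi/iscsid.conf` or inserts one after the commented default. A minimal sketch of that replace-or-insert logic (simplified: the real module with `firstmatch=False` inserts after the *last* `insertafter` match; the first match is used here for brevity):

```python
import re

def lineinfile(text, regexp, line, insertafter):
    """Replace the first line matching `regexp`, else insert `line`
    after the first line matching `insertafter`, else append."""
    lines = text.splitlines()
    for i, l in enumerate(lines):
        if re.search(regexp, l):
            lines[i] = line            # existing setting: replace in place
            return "\n".join(lines)
    for i, l in enumerate(lines):
        if re.search(insertafter, l):
            lines.insert(i + 1, line)  # anchor found: insert after it
            return "\n".join(lines)
    lines.append(line)                 # no match anywhere: append at EOF
    return "\n".join(lines)

conf = "#node.session.auth.chap.algs = SHA3-256,SHA256\nnode.startup = manual"
out = lineinfile(conf,
                 r"^node.session.auth.chap_algs",
                 "node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5",
                 r"^#node.session.auth.chap.algs")
print(out)
```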
Nov 28 09:29:31 np0005538515.localdomain sudo[208188]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57647 DF PROTO=TCP SPT=41550 DPT=9105 SEQ=2528829746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF4EFA0000000001030307) 
Nov 28 09:29:32 np0005538515.localdomain sudo[208298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wizkxtxxbtxinkzkihevdgwedcukirbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322172.119112-323-200417119349023/AnsiballZ_systemd_service.py
Nov 28 09:29:32 np0005538515.localdomain sudo[208298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:33 np0005538515.localdomain python3.9[208300]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:29:34 np0005538515.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 28 09:29:34 np0005538515.localdomain sudo[208298]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8640 DF PROTO=TCP SPT=50356 DPT=9105 SEQ=3978927549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF59FB0000000001030307) 
Nov 28 09:29:34 np0005538515.localdomain sudo[208412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejzrsevlcmdwlgkrpbvflwqezmktxbjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322174.3855848-347-137783763911425/AnsiballZ_systemd_service.py
Nov 28 09:29:34 np0005538515.localdomain sudo[208412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:35 np0005538515.localdomain python3.9[208414]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:29:35 np0005538515.localdomain systemd-rc-local-generator[208439]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:29:35 np0005538515.localdomain systemd-sysv-generator[208444]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: Starting Open-iSCSI...
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: If using hardware iscsi like qla4xxx this message can be ignored.
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Nov 28 09:29:35 np0005538515.localdomain iscsid[208454]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: Started Open-iSCSI.
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 28 09:29:35 np0005538515.localdomain systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
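iscsid starts but warns that `/etc/iscsi/initiatorname.iscsi` is missing (the file was moved aside with the `/etc/iscsi` directory during adoption). The daemon documents the expected content as `InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]`. A loose validity check for that documented format (a sketch, not the full RFC 3720 iSCSI name grammar):

```python
import re

# Example IQN taken from the iscsid warning above.
entry = "InitiatorName=iqn.2001-04.com.redhat:fc6"

# Loose pattern for InitiatorName=iqn.yyyy-mm.<reversed-domain>[:identifier]
iqn = re.compile(r"^InitiatorName=iqn\.\d{4}-\d{2}\.[\w.-]+(:[\w.-]+)?$")
print(bool(iqn.match(entry)))
```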
Nov 28 09:29:35 np0005538515.localdomain sudo[208412]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:37 np0005538515.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 09:29:37 np0005538515.localdomain sudo[208564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtpezpmrqbonpuuzwghnaapyseebeysk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322177.0797544-380-241812800459165/AnsiballZ_service_facts.py
Nov 28 09:29:37 np0005538515.localdomain sudo[208564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:37 np0005538515.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 09:29:37 np0005538515.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Nov 28 09:29:37 np0005538515.localdomain python3.9[208566]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:29:37 np0005538515.localdomain network[208596]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:29:37 np0005538515.localdomain network[208597]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:29:37 np0005538515.localdomain network[208598]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:29:38 np0005538515.localdomain setroubleshoot[208489]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 1a700e1f-be86-4e1b-b86c-07704bf948ee
Nov 28 09:29:38 np0005538515.localdomain setroubleshoot[208489]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
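setroubleshoot points at `sealert -l <uuid>` for the full denial report, and suggests the usual `ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid` workaround. To pull the alert UUID out of a journal line for follow-up (sketch):

```python
import re

# A setroubleshoot journal message, as logged above.
msg = ("SELinux is preventing /usr/sbin/iscsid from search access on the "
       "directory iscsi. For complete SELinux messages run: "
       "sealert -l 1a700e1f-be86-4e1b-b86c-07704bf948ee")

# The alert id is a standard 36-character UUID after "sealert -l".
m = re.search(r"sealert -l ([0-9a-f-]{36})", msg)
print(m.group(1))
```

The denial follows the `restorecon` and move of `/etc/iscsi` just above, so relabeling is a plausible (unconfirmed) cause; the `sealert` report would show the actual source and target contexts.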
Nov 28 09:29:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8698 DF PROTO=TCP SPT=34248 DPT=9102 SEQ=2145120188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF6B7A0000000001030307) 
Nov 28 09:29:40 np0005538515.localdomain sudo[208621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:29:40 np0005538515.localdomain sudo[208621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:29:40 np0005538515.localdomain sudo[208621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:40 np0005538515.localdomain sudo[208644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:29:40 np0005538515.localdomain sudo[208644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:29:40 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:29:41 np0005538515.localdomain sudo[208644]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:42 np0005538515.localdomain sudo[208764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:29:42 np0005538515.localdomain sudo[208764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:29:42 np0005538515.localdomain sudo[208764]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:42 np0005538515.localdomain sudo[208564]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8641 DF PROTO=TCP SPT=50356 DPT=9105 SEQ=3978927549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF7AFA0000000001030307) 
Nov 28 09:29:43 np0005538515.localdomain sudo[208915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahsqlwllvczfonpygxsifyiffvpuxtzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322183.0608144-410-259963846151778/AnsiballZ_file.py
Nov 28 09:29:43 np0005538515.localdomain sudo[208915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:43 np0005538515.localdomain python3.9[208917]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:29:43 np0005538515.localdomain sudo[208915]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4748 DF PROTO=TCP SPT=55146 DPT=9882 SEQ=358108239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF7EFB0000000001030307) 
Nov 28 09:29:44 np0005538515.localdomain sudo[209025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krnlstzxozqsadrhfhvtvervwwxlbxar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322183.9994378-433-69680326359688/AnsiballZ_modprobe.py
Nov 28 09:29:44 np0005538515.localdomain sudo[209025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:44 np0005538515.localdomain python3.9[209027]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 09:29:44 np0005538515.localdomain sudo[209025]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:45 np0005538515.localdomain sudo[209139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihuldxowmdjbvmlfysczkeugaqnhhqve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322184.8981476-458-171705779550144/AnsiballZ_stat.py
Nov 28 09:29:45 np0005538515.localdomain sudo[209139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:45 np0005538515.localdomain python3.9[209141]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:29:45 np0005538515.localdomain sudo[209139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:45 np0005538515.localdomain sudo[209227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plhyzmakuxlwizhvewukiaiohsjnqupl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322184.8981476-458-171705779550144/AnsiballZ_copy.py
Nov 28 09:29:45 np0005538515.localdomain sudo[209227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:45 np0005538515.localdomain python3.9[209229]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322184.8981476-458-171705779550144/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:45 np0005538515.localdomain sudo[209227]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:46 np0005538515.localdomain sudo[209337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqkqunzkbajvrznusfdnmhupjsnqnalz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322186.2732344-506-50576296882518/AnsiballZ_lineinfile.py
Nov 28 09:29:46 np0005538515.localdomain sudo[209337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27688 DF PROTO=TCP SPT=35896 DPT=9100 SEQ=904848703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF887A0000000001030307) 
Nov 28 09:29:46 np0005538515.localdomain python3.9[209339]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:46 np0005538515.localdomain sudo[209337]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:47 np0005538515.localdomain sudo[209447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjgdxxoywepjccikjedvkqssnvgtjmyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322186.9773998-530-161115382804934/AnsiballZ_systemd.py
Nov 28 09:29:47 np0005538515.localdomain sudo[209447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:47 np0005538515.localdomain python3.9[209449]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:29:47 np0005538515.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 09:29:47 np0005538515.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 09:29:47 np0005538515.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 09:29:47 np0005538515.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 09:29:47 np0005538515.localdomain systemd-modules-load[209453]: Module 'msr' is built in
Nov 28 09:29:47 np0005538515.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 09:29:47 np0005538515.localdomain sudo[209447]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:48 np0005538515.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Nov 28 09:29:48 np0005538515.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Consumed 1.007s CPU time.
Nov 28 09:29:48 np0005538515.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 09:29:48 np0005538515.localdomain sudo[209562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhpjzfjlbnttydwuoweycuvempvkieez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322188.6524515-553-8376726048694/AnsiballZ_file.py
Nov 28 09:29:48 np0005538515.localdomain sudo[209562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:49 np0005538515.localdomain python3.9[209564]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:29:49 np0005538515.localdomain sudo[209562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:49 np0005538515.localdomain sudo[209672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oakgmeqdhcvxfdtlwtkhjezrewwdoexo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322189.4183648-580-278427745320219/AnsiballZ_stat.py
Nov 28 09:29:49 np0005538515.localdomain sudo[209672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:49 np0005538515.localdomain python3.9[209674]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:49 np0005538515.localdomain sudo[209672]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27689 DF PROTO=TCP SPT=35896 DPT=9100 SEQ=904848703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACF983A0000000001030307) 
Nov 28 09:29:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:29:50.809 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:29:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:29:50.810 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:29:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:29:50.810 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:29:51 np0005538515.localdomain sudo[209782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcklbrjgqwkdfhpxvwadafbqsxeplxtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322190.2620926-608-9591933583616/AnsiballZ_stat.py
Nov 28 09:29:51 np0005538515.localdomain sudo[209782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:51 np0005538515.localdomain python3.9[209784]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:51 np0005538515.localdomain sudo[209782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:52 np0005538515.localdomain sudo[209892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygvswfrmjrehbotrnzrymivggzbpscqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322191.9403815-632-199712378350050/AnsiballZ_stat.py
Nov 28 09:29:52 np0005538515.localdomain sudo[209892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:52 np0005538515.localdomain python3.9[209894]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:29:52 np0005538515.localdomain sudo[209892]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:52 np0005538515.localdomain sudo[209980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmcbrfnrnsicdagybcefodgulbguhudt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322191.9403815-632-199712378350050/AnsiballZ_copy.py
Nov 28 09:29:52 np0005538515.localdomain sudo[209980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:52 np0005538515.localdomain python3.9[209982]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322191.9403815-632-199712378350050/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:52 np0005538515.localdomain sudo[209980]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:53 np0005538515.localdomain sudo[210090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqixljcmzlcyvsfivueciypzohjijptv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322193.1610153-677-257519845738367/AnsiballZ_command.py
Nov 28 09:29:53 np0005538515.localdomain sudo[210090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:53 np0005538515.localdomain python3.9[210092]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:29:53 np0005538515.localdomain sudo[210090]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:54 np0005538515.localdomain sudo[210201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwndobehisugvjbpjccysmzqpvyaxgjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322193.9481583-700-170897588241760/AnsiballZ_lineinfile.py
Nov 28 09:29:54 np0005538515.localdomain sudo[210201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:29:54 np0005538515.localdomain systemd[1]: tmp-crun.NMLxPx.mount: Deactivated successfully.
Nov 28 09:29:54 np0005538515.localdomain podman[210204]: 2025-11-28 09:29:54.355865575 +0000 UTC m=+0.097321478 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:29:54 np0005538515.localdomain podman[210204]: 2025-11-28 09:29:54.430266233 +0000 UTC m=+0.171722156 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:29:54 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:29:54 np0005538515.localdomain python3.9[210203]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:54 np0005538515.localdomain sudo[210201]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:55 np0005538515.localdomain sudo[210335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbkfimejlwofnjzxrzbuzqqivzdvnxmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322194.706364-725-85415242122002/AnsiballZ_replace.py
Nov 28 09:29:55 np0005538515.localdomain sudo[210335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:29:55 np0005538515.localdomain podman[210338]: 2025-11-28 09:29:55.253845091 +0000 UTC m=+0.086956807 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 09:29:55 np0005538515.localdomain podman[210338]: 2025-11-28 09:29:55.284568954 +0000 UTC m=+0.117680620 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:29:55 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:29:55 np0005538515.localdomain systemd[1]: tmp-crun.4LXkY3.mount: Deactivated successfully.
Nov 28 09:29:55 np0005538515.localdomain python3.9[210337]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:55 np0005538515.localdomain sudo[210335]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:56 np0005538515.localdomain sudo[210461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lchngixuarequidqyxthezkpazfwzkwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322195.7652721-749-276737859039835/AnsiballZ_replace.py
Nov 28 09:29:56 np0005538515.localdomain sudo[210461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:56 np0005538515.localdomain python3.9[210463]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:56 np0005538515.localdomain sudo[210461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:56 np0005538515.localdomain sudo[210571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjojhsjsspsctpzccfilcrizarribead ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322196.52673-776-245163356808386/AnsiballZ_lineinfile.py
Nov 28 09:29:56 np0005538515.localdomain sudo[210571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:57 np0005538515.localdomain python3.9[210573]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:57 np0005538515.localdomain sudo[210571]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:57 np0005538515.localdomain sudo[210681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dywgtpofktcjjswpdldcvpprdspgnmqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322197.1402748-776-266260089449037/AnsiballZ_lineinfile.py
Nov 28 09:29:57 np0005538515.localdomain sudo[210681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3190 DF PROTO=TCP SPT=48832 DPT=9105 SEQ=3226923448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFB3630000000001030307) 
Nov 28 09:29:57 np0005538515.localdomain python3.9[210683]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:57 np0005538515.localdomain sudo[210681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:58 np0005538515.localdomain sudo[210791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfuygnhflbhgvnnyaludbtrardcfgvch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322197.7829978-776-211173426076484/AnsiballZ_lineinfile.py
Nov 28 09:29:58 np0005538515.localdomain sudo[210791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:58 np0005538515.localdomain python3.9[210793]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:58 np0005538515.localdomain sudo[210791]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:58 np0005538515.localdomain sudo[210901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyysslinhdvejzrkbidikcqhbrknpkxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322198.350684-776-184328110056404/AnsiballZ_lineinfile.py
Nov 28 09:29:58 np0005538515.localdomain sudo[210901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3191 DF PROTO=TCP SPT=48832 DPT=9105 SEQ=3226923448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFB77B0000000001030307) 
Nov 28 09:29:58 np0005538515.localdomain python3.9[210903]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:29:58 np0005538515.localdomain sudo[210901]: pam_unix(sudo:session): session closed for user root
Nov 28 09:29:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53077 DF PROTO=TCP SPT=52310 DPT=9882 SEQ=180747497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFB88C0000000001030307) 
Nov 28 09:29:59 np0005538515.localdomain sudo[211011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sywhfagllnaxjiadotsxdomrbedrujmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322199.125805-863-159097206799979/AnsiballZ_stat.py
Nov 28 09:29:59 np0005538515.localdomain sudo[211011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:29:59 np0005538515.localdomain python3.9[211013]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:29:59 np0005538515.localdomain sudo[211011]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:00 np0005538515.localdomain sudo[211123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uinclwbvewfbicaamtohtiouyjontaik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322199.8655286-886-202043032460937/AnsiballZ_file.py
Nov 28 09:30:00 np0005538515.localdomain sudo[211123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:00 np0005538515.localdomain python3.9[211125]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:00 np0005538515.localdomain sudo[211123]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:00 np0005538515.localdomain sudo[211233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svkzgqyzlsfhkcvrqgpitpvrdyolylzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322200.7070928-914-61426099085064/AnsiballZ_file.py
Nov 28 09:30:00 np0005538515.localdomain sudo[211233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:01 np0005538515.localdomain python3.9[211235]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:01 np0005538515.localdomain sudo[211233]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41336 DF PROTO=TCP SPT=44712 DPT=9105 SEQ=381928380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFC2FB0000000001030307) 
Nov 28 09:30:01 np0005538515.localdomain sudo[211343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqtalrupbezdvddqttkgrywkprtcdlhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322201.450173-937-200207027644817/AnsiballZ_stat.py
Nov 28 09:30:01 np0005538515.localdomain sudo[211343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:01 np0005538515.localdomain python3.9[211345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:01 np0005538515.localdomain sudo[211343]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:03 np0005538515.localdomain sudo[211400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leohauyjlpdhrtwiciwqeyybzpsjagns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322201.450173-937-200207027644817/AnsiballZ_file.py
Nov 28 09:30:03 np0005538515.localdomain sudo[211400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:03 np0005538515.localdomain python3.9[211402]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:03 np0005538515.localdomain sudo[211400]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:03 np0005538515.localdomain sudo[211510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbjlsehquoutzkodxfhecgskqpktjnjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322203.4050164-937-116809009366736/AnsiballZ_stat.py
Nov 28 09:30:03 np0005538515.localdomain sudo[211510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:03 np0005538515.localdomain python3.9[211512]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:03 np0005538515.localdomain sudo[211510]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:04 np0005538515.localdomain sudo[211567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waqbnguvkkjagmplypycwvyqdmgoprvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322203.4050164-937-116809009366736/AnsiballZ_file.py
Nov 28 09:30:04 np0005538515.localdomain sudo[211567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:04 np0005538515.localdomain python3.9[211569]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:04 np0005538515.localdomain sudo[211567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3193 DF PROTO=TCP SPT=48832 DPT=9105 SEQ=3226923448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFCF3A0000000001030307) 
Nov 28 09:30:05 np0005538515.localdomain sudo[211677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qefczrkareccqjwcwvtgntiraefqquhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322205.5978298-1006-187568815373896/AnsiballZ_file.py
Nov 28 09:30:05 np0005538515.localdomain sudo[211677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:06 np0005538515.localdomain python3.9[211679]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:06 np0005538515.localdomain sudo[211677]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:06 np0005538515.localdomain sudo[211787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frvvrillivwhxudergdkeqdpekebteez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322206.2871604-1031-280226315837804/AnsiballZ_stat.py
Nov 28 09:30:06 np0005538515.localdomain sudo[211787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:06 np0005538515.localdomain python3.9[211789]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:06 np0005538515.localdomain sudo[211787]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:07 np0005538515.localdomain sudo[211844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwvscfftmmkkzouopushivweyqpwfzmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322206.2871604-1031-280226315837804/AnsiballZ_file.py
Nov 28 09:30:07 np0005538515.localdomain sudo[211844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:07 np0005538515.localdomain python3.9[211846]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:07 np0005538515.localdomain sudo[211844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:07 np0005538515.localdomain sudo[211954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpqtqxtvqkrgjkbbaxjggpwfrdvocbsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322207.551104-1067-63109674281794/AnsiballZ_stat.py
Nov 28 09:30:07 np0005538515.localdomain sudo[211954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:08 np0005538515.localdomain python3.9[211956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:08 np0005538515.localdomain sudo[211954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:08 np0005538515.localdomain sudo[212011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdktsxibjpaynvwfsjrwbzzwokblijqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322207.551104-1067-63109674281794/AnsiballZ_file.py
Nov 28 09:30:08 np0005538515.localdomain sudo[212011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:08 np0005538515.localdomain python3.9[212013]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:08 np0005538515.localdomain sudo[212011]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:09 np0005538515.localdomain sudo[212121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntjogzoelxpbeukqwguduxzfiotyxnxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322208.7276218-1103-186259223589434/AnsiballZ_systemd.py
Nov 28 09:30:09 np0005538515.localdomain sudo[212121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42834 DF PROTO=TCP SPT=51478 DPT=9102 SEQ=1356090074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFE0BA0000000001030307) 
Nov 28 09:30:09 np0005538515.localdomain python3.9[212123]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:09 np0005538515.localdomain systemd-rc-local-generator[212146]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:09 np0005538515.localdomain systemd-sysv-generator[212149]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:09 np0005538515.localdomain sudo[212121]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:10 np0005538515.localdomain sudo[212268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzfjzsoopzgnhhykybffqzobopmjnmuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322209.9540558-1126-18593697128528/AnsiballZ_stat.py
Nov 28 09:30:10 np0005538515.localdomain sudo[212268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:10 np0005538515.localdomain python3.9[212270]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:10 np0005538515.localdomain sudo[212268]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:10 np0005538515.localdomain sudo[212325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syxzromngyjxjtmhondarxylviacgtnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322209.9540558-1126-18593697128528/AnsiballZ_file.py
Nov 28 09:30:10 np0005538515.localdomain sudo[212325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:10 np0005538515.localdomain python3.9[212327]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:10 np0005538515.localdomain sudo[212325]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:11 np0005538515.localdomain sudo[212435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqqcparoostlobkqrnldcjlzyclfsbom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322211.2498903-1162-206917260969928/AnsiballZ_stat.py
Nov 28 09:30:11 np0005538515.localdomain sudo[212435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:11 np0005538515.localdomain python3.9[212437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:11 np0005538515.localdomain sudo[212435]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:11 np0005538515.localdomain sudo[212492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnlqzbelrzsktrtodptclikvkxfutdfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322211.2498903-1162-206917260969928/AnsiballZ_file.py
Nov 28 09:30:11 np0005538515.localdomain sudo[212492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:12 np0005538515.localdomain python3.9[212494]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:12 np0005538515.localdomain sudo[212492]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:12 np0005538515.localdomain sudo[212602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtuoddzpgdhamqfpizrlllbiabczhjis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322212.371365-1199-225383610213551/AnsiballZ_systemd.py
Nov 28 09:30:12 np0005538515.localdomain sudo[212602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3194 DF PROTO=TCP SPT=48832 DPT=9105 SEQ=3226923448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFEEFA0000000001030307) 
Nov 28 09:30:12 np0005538515.localdomain python3.9[212604]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:30:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:13 np0005538515.localdomain systemd-rc-local-generator[212630]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:13 np0005538515.localdomain systemd-sysv-generator[212634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:30:13 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:30:13 np0005538515.localdomain sudo[212602]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:14 np0005538515.localdomain sudo[212754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxmaacasxmtslskxpozskkzhhuqciwqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322213.8520386-1229-85217000946262/AnsiballZ_file.py
Nov 28 09:30:14 np0005538515.localdomain sudo[212754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40665 DF PROTO=TCP SPT=51526 DPT=9101 SEQ=3212966188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFF4610000000001030307) 
Nov 28 09:30:14 np0005538515.localdomain python3.9[212756]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:14 np0005538515.localdomain sudo[212754]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:14 np0005538515.localdomain sudo[212864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lurspclkhtggausnhoubmknjewpmujog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322214.5598152-1253-161988379727081/AnsiballZ_stat.py
Nov 28 09:30:14 np0005538515.localdomain sudo[212864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:15 np0005538515.localdomain python3.9[212866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:15 np0005538515.localdomain sudo[212864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:15 np0005538515.localdomain sudo[212952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xclnzxhjjixipwsdjugaxjcwqinheypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322214.5598152-1253-161988379727081/AnsiballZ_copy.py
Nov 28 09:30:15 np0005538515.localdomain sudo[212952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:15 np0005538515.localdomain python3.9[212954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322214.5598152-1253-161988379727081/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:15 np0005538515.localdomain sudo[212952]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30374 DF PROTO=TCP SPT=49952 DPT=9100 SEQ=2669280850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ACFFDBA0000000001030307) 
Nov 28 09:30:16 np0005538515.localdomain sudo[213062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwsmylbwkuqqwhemtipijypqgygumpwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322216.4702277-1304-218128385185922/AnsiballZ_file.py
Nov 28 09:30:16 np0005538515.localdomain sudo[213062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:16 np0005538515.localdomain python3.9[213064]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:30:16 np0005538515.localdomain sudo[213062]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:17 np0005538515.localdomain sudo[213172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpqplqcfzuzuguoclqazwsaysffojldm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322217.2370613-1327-255706693340777/AnsiballZ_stat.py
Nov 28 09:30:17 np0005538515.localdomain sudo[213172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:17 np0005538515.localdomain python3.9[213174]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:17 np0005538515.localdomain sudo[213172]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:18 np0005538515.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 28 09:30:18 np0005538515.localdomain sudo[213261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snryegiifbnwohdroqhnuivdcljbudnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322217.2370613-1327-255706693340777/AnsiballZ_copy.py
Nov 28 09:30:18 np0005538515.localdomain sudo[213261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:19 np0005538515.localdomain python3.9[213263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322217.2370613-1327-255706693340777/.source.json _original_basename=.eduy01wy follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:19 np0005538515.localdomain sudo[213261]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:19 np0005538515.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 09:30:19 np0005538515.localdomain sudo[213372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsfuqdjaywqugzcvemzfirkdjetiiypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322219.3075712-1373-191607609016341/AnsiballZ_file.py
Nov 28 09:30:19 np0005538515.localdomain sudo[213372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:19 np0005538515.localdomain python3.9[213374]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:19 np0005538515.localdomain sudo[213372]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:20 np0005538515.localdomain sudo[213482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsqzemchghfrmbupfqlcudmsufeooydo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322220.0315552-1396-204220848068645/AnsiballZ_stat.py
Nov 28 09:30:20 np0005538515.localdomain sudo[213482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:20 np0005538515.localdomain sudo[213482]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30375 DF PROTO=TCP SPT=49952 DPT=9100 SEQ=2669280850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD00D7A0000000001030307) 
Nov 28 09:30:20 np0005538515.localdomain sudo[213570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snzhowuyexvzorqktgykogtoglntqmzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322220.0315552-1396-204220848068645/AnsiballZ_copy.py
Nov 28 09:30:20 np0005538515.localdomain sudo[213570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:21 np0005538515.localdomain sudo[213570]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:22 np0005538515.localdomain sudo[213680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-liddboobqxkufkixkqsppfgoolhwshzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322221.5451398-1448-54391746014796/AnsiballZ_container_config_data.py
Nov 28 09:30:22 np0005538515.localdomain sudo[213680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:22 np0005538515.localdomain python3.9[213682]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 09:30:22 np0005538515.localdomain sudo[213680]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:23 np0005538515.localdomain sudo[213790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocgefxsellvheaguzruqgwjtsuuuvgvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322222.668923-1475-231627121269520/AnsiballZ_container_config_hash.py
Nov 28 09:30:23 np0005538515.localdomain sudo[213790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:23 np0005538515.localdomain python3.9[213792]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:30:23 np0005538515.localdomain sudo[213790]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:24 np0005538515.localdomain sudo[213900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsysqtmompsbxowoydnbrllyyymdpcpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322223.6347742-1501-1128880109277/AnsiballZ_podman_container_info.py
Nov 28 09:30:24 np0005538515.localdomain sudo[213900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:24 np0005538515.localdomain python3.9[213902]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:30:24 np0005538515.localdomain sudo[213900]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:30:24 np0005538515.localdomain podman[213947]: 2025-11-28 09:30:24.977112667 +0000 UTC m=+0.080715785 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:30:25 np0005538515.localdomain podman[213947]: 2025-11-28 09:30:25.076491899 +0000 UTC m=+0.180094977 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:30:25 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:30:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:30:25 np0005538515.localdomain podman[213972]: 2025-11-28 09:30:25.964452174 +0000 UTC m=+0.076322219 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:30:25 np0005538515.localdomain podman[213972]: 2025-11-28 09:30:25.999422367 +0000 UTC m=+0.111292432 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:30:26 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:30:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3062 DF PROTO=TCP SPT=44944 DPT=9105 SEQ=1053215145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD028940000000001030307) 
Nov 28 09:30:28 np0005538515.localdomain sudo[214080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoxgkbeqmthmkbwmxrhuapudtqhxlrwe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322228.0765343-1540-195463796542571/AnsiballZ_edpm_container_manage.py
Nov 28 09:30:28 np0005538515.localdomain sudo[214080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3063 DF PROTO=TCP SPT=44944 DPT=9105 SEQ=1053215145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD02CBA0000000001030307) 
Nov 28 09:30:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30376 DF PROTO=TCP SPT=49952 DPT=9100 SEQ=2669280850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD02CFA0000000001030307) 
Nov 28 09:30:28 np0005538515.localdomain python3[214082]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:30:31 np0005538515.localdomain podman[214095]: 2025-11-28 09:30:28.933215451 +0000 UTC m=+0.048360590 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:30:31 np0005538515.localdomain podman[214145]: 2025-11-28 09:30:31.191036973 +0000 UTC m=+0.056683068 container create cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 28 09:30:31 np0005538515.localdomain podman[214145]: 2025-11-28 09:30:31.165007946 +0000 UTC m=+0.030654091 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:30:31 np0005538515.localdomain python3[214082]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:30:31 np0005538515.localdomain sudo[214080]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8643 DF PROTO=TCP SPT=50356 DPT=9105 SEQ=3978927549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD038FB0000000001030307) 
Nov 28 09:30:31 np0005538515.localdomain sudo[214289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeopgmcinlpsqosncbpmeoysjxkycnag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322231.546027-1565-237500944683994/AnsiballZ_stat.py
Nov 28 09:30:31 np0005538515.localdomain sudo[214289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:32 np0005538515.localdomain python3.9[214291]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:30:32 np0005538515.localdomain sudo[214289]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:32 np0005538515.localdomain systemd[1]: virtqemud.service: Deactivated successfully.
Nov 28 09:30:32 np0005538515.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 09:30:33 np0005538515.localdomain sudo[214403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsucbkwmxpmlkvwcfuxhrklgisjjcgxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322233.47776-1591-166113635828170/AnsiballZ_file.py
Nov 28 09:30:33 np0005538515.localdomain sudo[214403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:34 np0005538515.localdomain python3.9[214405]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:34 np0005538515.localdomain sudo[214403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:34 np0005538515.localdomain sudo[214458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnfsgtsuvfheugmbudxlwizylwgiusvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322233.47776-1591-166113635828170/AnsiballZ_stat.py
Nov 28 09:30:34 np0005538515.localdomain sudo[214458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:34 np0005538515.localdomain python3.9[214460]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:30:34 np0005538515.localdomain sudo[214458]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3065 DF PROTO=TCP SPT=44944 DPT=9105 SEQ=1053215145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0447A0000000001030307) 
Nov 28 09:30:34 np0005538515.localdomain sudo[214567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdknnovlgqsrjzzkyqfumcoqajeepxme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322234.5241246-1591-130175300280513/AnsiballZ_copy.py
Nov 28 09:30:34 np0005538515.localdomain sudo[214567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:35 np0005538515.localdomain python3.9[214569]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322234.5241246-1591-130175300280513/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:35 np0005538515.localdomain sudo[214567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:35 np0005538515.localdomain sudo[214622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yaioxmmzhoqfsjwvudsvkstmzgwybmkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322234.5241246-1591-130175300280513/AnsiballZ_systemd.py
Nov 28 09:30:35 np0005538515.localdomain sudo[214622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:35 np0005538515.localdomain python3.9[214624]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:35 np0005538515.localdomain systemd-rc-local-generator[214645]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:35 np0005538515.localdomain systemd-sysv-generator[214650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:35 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain sudo[214622]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:36 np0005538515.localdomain sudo[214712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztrofoyvzxocypfeepetnzcjdomoiful ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322234.5241246-1591-130175300280513/AnsiballZ_systemd.py
Nov 28 09:30:36 np0005538515.localdomain sudo[214712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:36 np0005538515.localdomain python3.9[214714]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:36 np0005538515.localdomain systemd-rc-local-generator[214742]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:36 np0005538515.localdomain systemd-sysv-generator[214748]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:36 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:37 np0005538515.localdomain systemd[1]: Starting multipathd container...
Nov 28 09:30:37 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:30:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80d42f4221e88fe424ad9b23bcf9b91099549aefec136c75a9ba145b2cc16629/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:37 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80d42f4221e88fe424ad9b23bcf9b91099549aefec136c75a9ba145b2cc16629/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:30:37 np0005538515.localdomain podman[214756]: 2025-11-28 09:30:37.21081971 +0000 UTC m=+0.146851264 container init cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + sudo -E kolla_set_configs
Nov 28 09:30:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:30:37 np0005538515.localdomain sudo[214776]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:30:37 np0005538515.localdomain sudo[214776]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:37 np0005538515.localdomain sudo[214776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:37 np0005538515.localdomain podman[214756]: 2025-11-28 09:30:37.245228757 +0000 UTC m=+0.181260271 container start cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 09:30:37 np0005538515.localdomain podman[214756]: multipathd
Nov 28 09:30:37 np0005538515.localdomain systemd[1]: Started multipathd container.
Nov 28 09:30:37 np0005538515.localdomain sudo[214712]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: INFO:__main__:Validating config file
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: INFO:__main__:Writing out command to execute
Nov 28 09:30:37 np0005538515.localdomain sudo[214776]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: ++ cat /run_command
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + CMD='/usr/sbin/multipathd -d'
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + ARGS=
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + sudo kolla_copy_cacerts
Nov 28 09:30:37 np0005538515.localdomain sudo[214791]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:30:37 np0005538515.localdomain sudo[214791]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:37 np0005538515.localdomain sudo[214791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:37 np0005538515.localdomain sudo[214791]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + [[ ! -n '' ]]
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + . kolla_extend_start
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: Running command: '/usr/sbin/multipathd -d'
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + umask 0022
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: + exec /usr/sbin/multipathd -d
Nov 28 09:30:37 np0005538515.localdomain podman[214778]: 2025-11-28 09:30:37.336205248 +0000 UTC m=+0.085725739 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: 10061.561487 | --------start up--------
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: 10061.561508 | read /etc/multipath.conf
Nov 28 09:30:37 np0005538515.localdomain multipathd[214770]: 10061.565375 | path checkers start up
Nov 28 09:30:37 np0005538515.localdomain podman[214778]: 2025-11-28 09:30:37.355465116 +0000 UTC m=+0.104985637 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 28 09:30:37 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:30:38 np0005538515.localdomain python3.9[214916]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:30:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59047 DF PROTO=TCP SPT=46606 DPT=9102 SEQ=3210608279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD055BA0000000001030307) 
Nov 28 09:30:39 np0005538515.localdomain sudo[215026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-symiykrmftejksrytpnborcfpxbhfiww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322239.5982523-1700-273055666091315/AnsiballZ_command.py
Nov 28 09:30:39 np0005538515.localdomain sudo[215026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:40 np0005538515.localdomain python3.9[215028]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:30:40 np0005538515.localdomain sudo[215026]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:41 np0005538515.localdomain sudo[215149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbererylobrjjhztmubgdhpzmyvvjsnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322240.4057133-1724-48156335190606/AnsiballZ_systemd.py
Nov 28 09:30:41 np0005538515.localdomain sudo[215149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:41 np0005538515.localdomain python3.9[215151]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:30:42 np0005538515.localdomain sudo[215153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:30:42 np0005538515.localdomain sudo[215153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:30:42 np0005538515.localdomain sudo[215153]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:42 np0005538515.localdomain sudo[215171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:30:42 np0005538515.localdomain sudo[215171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:30:42 np0005538515.localdomain systemd[1]: Stopping multipathd container...
Nov 28 09:30:42 np0005538515.localdomain multipathd[214770]: 10067.046254 | exit (signal)
Nov 28 09:30:42 np0005538515.localdomain multipathd[214770]: 10067.047764 | --------shut down-------
Nov 28 09:30:42 np0005538515.localdomain sudo[215171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:42 np0005538515.localdomain systemd[1]: libpod-cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.scope: Deactivated successfully.
Nov 28 09:30:42 np0005538515.localdomain podman[215214]: 2025-11-28 09:30:42.855795215 +0000 UTC m=+0.104351347 container died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:30:42 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.timer: Deactivated successfully.
Nov 28 09:30:42 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:30:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f-userdata-shm.mount: Deactivated successfully.
Nov 28 09:30:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-80d42f4221e88fe424ad9b23bcf9b91099549aefec136c75a9ba145b2cc16629-merged.mount: Deactivated successfully.
Nov 28 09:30:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3066 DF PROTO=TCP SPT=44944 DPT=9105 SEQ=1053215145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD064FB0000000001030307) 
Nov 28 09:30:43 np0005538515.localdomain podman[215214]: 2025-11-28 09:30:43.123804286 +0000 UTC m=+0.372360388 container cleanup cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 09:30:43 np0005538515.localdomain podman[215214]: multipathd
Nov 28 09:30:43 np0005538515.localdomain podman[215247]: 2025-11-28 09:30:43.23197849 +0000 UTC m=+0.076213985 container cleanup cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 28 09:30:43 np0005538515.localdomain podman[215247]: multipathd
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: Stopped multipathd container.
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: Starting multipathd container...
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:30:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80d42f4221e88fe424ad9b23bcf9b91099549aefec136c75a9ba145b2cc16629/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80d42f4221e88fe424ad9b23bcf9b91099549aefec136c75a9ba145b2cc16629/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:30:43 np0005538515.localdomain podman[215260]: 2025-11-28 09:30:43.408436382 +0000 UTC m=+0.145227514 container init cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + sudo -E kolla_set_configs
Nov 28 09:30:43 np0005538515.localdomain sudo[215279]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:30:43 np0005538515.localdomain sudo[215279]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:43 np0005538515.localdomain sudo[215279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:30:43 np0005538515.localdomain podman[215260]: 2025-11-28 09:30:43.452681653 +0000 UTC m=+0.189472845 container start cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:30:43 np0005538515.localdomain podman[215260]: multipathd
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: Started multipathd container.
Nov 28 09:30:43 np0005538515.localdomain sudo[215149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: INFO:__main__:Validating config file
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: INFO:__main__:Writing out command to execute
Nov 28 09:30:43 np0005538515.localdomain sudo[215279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: ++ cat /run_command
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + CMD='/usr/sbin/multipathd -d'
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + ARGS=
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + sudo kolla_copy_cacerts
Nov 28 09:30:43 np0005538515.localdomain podman[215282]: 2025-11-28 09:30:43.543644605 +0000 UTC m=+0.083145340 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:30:43 np0005538515.localdomain sudo[215299]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:30:43 np0005538515.localdomain sudo[215299]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:30:43 np0005538515.localdomain sudo[215299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 09:30:43 np0005538515.localdomain sudo[215299]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + [[ ! -n '' ]]
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + . kolla_extend_start
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: Running command: '/usr/sbin/multipathd -d'
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + umask 0022
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: + exec /usr/sbin/multipathd -d
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: 10067.784746 | --------start up--------
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: 10067.784768 | read /etc/multipath.conf
Nov 28 09:30:43 np0005538515.localdomain multipathd[215273]: 10067.788915 | path checkers start up
Nov 28 09:30:43 np0005538515.localdomain podman[215282]: 2025-11-28 09:30:43.579861878 +0000 UTC m=+0.119362603 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:30:43 np0005538515.localdomain podman[215282]: unhealthy
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:30:43 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Failed with result 'exit-code'.
Nov 28 09:30:44 np0005538515.localdomain sudo[215419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymjhrdddocjfcsyfihzfmqjlmpatqrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322243.7424085-1748-133932829463938/AnsiballZ_file.py
Nov 28 09:30:44 np0005538515.localdomain sudo[215419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13556 DF PROTO=TCP SPT=44324 DPT=9882 SEQ=1017419348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD068FA0000000001030307) 
Nov 28 09:30:44 np0005538515.localdomain python3.9[215421]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:44 np0005538515.localdomain sudo[215419]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:45 np0005538515.localdomain sudo[215529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivjagqqyaqprtltmoxdarebtzmbjfnqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322244.7906303-1784-153146054701862/AnsiballZ_file.py
Nov 28 09:30:45 np0005538515.localdomain sudo[215529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:45 np0005538515.localdomain python3.9[215531]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:30:45 np0005538515.localdomain sudo[215529]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:45 np0005538515.localdomain sudo[215549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:30:45 np0005538515.localdomain sudo[215549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:30:45 np0005538515.localdomain sudo[215549]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:46 np0005538515.localdomain sudo[215657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvhvvqonmobqzsupjnrkpddnlvkuopkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322245.7366896-1808-204445004061867/AnsiballZ_modprobe.py
Nov 28 09:30:46 np0005538515.localdomain sudo[215657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:46 np0005538515.localdomain python3.9[215659]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 09:30:46 np0005538515.localdomain sudo[215657]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14802 DF PROTO=TCP SPT=48694 DPT=9100 SEQ=3818782677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD072BA0000000001030307) 
Nov 28 09:30:46 np0005538515.localdomain sudo[215776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lypogfkkqxbwbmrccqrxxfzrrwyqbbvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322246.4637814-1832-172212941398812/AnsiballZ_stat.py
Nov 28 09:30:46 np0005538515.localdomain sudo[215776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:46 np0005538515.localdomain python3.9[215778]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:30:46 np0005538515.localdomain sudo[215776]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:47 np0005538515.localdomain sudo[215864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbbuijtbgdvkxypmcnzwbbmdbtcwsqqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322246.4637814-1832-172212941398812/AnsiballZ_copy.py
Nov 28 09:30:47 np0005538515.localdomain sudo[215864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:47 np0005538515.localdomain python3.9[215866]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322246.4637814-1832-172212941398812/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:47 np0005538515.localdomain sudo[215864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:47 np0005538515.localdomain sudo[215974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnnnnoyybyhsdapftutttyeergstbnzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322247.7245421-1880-194397398858839/AnsiballZ_lineinfile.py
Nov 28 09:30:47 np0005538515.localdomain sudo[215974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:48 np0005538515.localdomain python3.9[215976]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:48 np0005538515.localdomain sudo[215974]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:48 np0005538515.localdomain sudo[216084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzefmoeudgjzxyfyfdtkvdjywudlanyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322248.4209573-1904-265666485374556/AnsiballZ_systemd.py
Nov 28 09:30:48 np0005538515.localdomain sudo[216084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:49 np0005538515.localdomain python3.9[216086]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:30:49 np0005538515.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 09:30:49 np0005538515.localdomain systemd[1]: Stopped Load Kernel Modules.
Nov 28 09:30:49 np0005538515.localdomain systemd[1]: Stopping Load Kernel Modules...
Nov 28 09:30:49 np0005538515.localdomain systemd[1]: Starting Load Kernel Modules...
Nov 28 09:30:49 np0005538515.localdomain systemd-modules-load[216090]: Module 'msr' is built in
Nov 28 09:30:49 np0005538515.localdomain systemd[1]: Finished Load Kernel Modules.
Nov 28 09:30:49 np0005538515.localdomain sudo[216084]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:50 np0005538515.localdomain sudo[216199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkhacovnqofoordngzswjnspxuuyofxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322249.3988087-1927-221383018918875/AnsiballZ_dnf.py
Nov 28 09:30:50 np0005538515.localdomain sudo[216199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14803 DF PROTO=TCP SPT=48694 DPT=9100 SEQ=3818782677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0827A0000000001030307) 
Nov 28 09:30:50 np0005538515.localdomain python3.9[216201]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:30:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:30:50.810 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:30:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:30:50.812 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:30:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:30:50.812 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:54 np0005538515.localdomain systemd-rc-local-generator[216231]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:54 np0005538515.localdomain systemd-sysv-generator[216235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:54 np0005538515.localdomain systemd-rc-local-generator[216271]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:54 np0005538515.localdomain systemd-sysv-generator[216276]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd-logind[763]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 09:30:55 np0005538515.localdomain systemd-logind[763]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:30:55 np0005538515.localdomain lvm[216322]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 09:30:55 np0005538515.localdomain lvm[216322]: VG ceph_vg1 finished
Nov 28 09:30:55 np0005538515.localdomain lvm[216320]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 09:30:55 np0005538515.localdomain lvm[216320]: VG ceph_vg0 finished
Nov 28 09:30:55 np0005538515.localdomain podman[216321]: 2025-11-28 09:30:55.207388464 +0000 UTC m=+0.080409124 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:30:55 np0005538515.localdomain podman[216321]: 2025-11-28 09:30:55.233384351 +0000 UTC m=+0.106405011 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: Starting man-db-cache-update.service...
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:55 np0005538515.localdomain systemd-sysv-generator[216398]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:55 np0005538515.localdomain systemd-rc-local-generator[216393]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:55 np0005538515.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: Finished man-db-cache-update.service.
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.149s CPU time.
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: run-r70ea4d54fb0a412baff52c542a9c9d28.service: Deactivated successfully.
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: tmp-crun.Azf98P.mount: Deactivated successfully.
Nov 28 09:30:56 np0005538515.localdomain podman[217535]: 2025-11-28 09:30:56.447363258 +0000 UTC m=+0.086458715 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:30:56 np0005538515.localdomain podman[217535]: 2025-11-28 09:30:56.482364674 +0000 UTC m=+0.121460091 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:30:56 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:30:56 np0005538515.localdomain sudo[216199]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:57 np0005538515.localdomain python3.9[217660]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:30:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31051 DF PROTO=TCP SPT=45572 DPT=9105 SEQ=2228635939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD09DC30000000001030307) 
Nov 28 09:30:58 np0005538515.localdomain sudo[217772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwrnejoojrgquzzaemvaejfzzbaeybpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322258.0280302-1980-36439979270917/AnsiballZ_file.py
Nov 28 09:30:58 np0005538515.localdomain sudo[217772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:58 np0005538515.localdomain python3.9[217774]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:30:58 np0005538515.localdomain sudo[217772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:30:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31052 DF PROTO=TCP SPT=45572 DPT=9105 SEQ=2228635939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0A1BB0000000001030307) 
Nov 28 09:30:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26620 DF PROTO=TCP SPT=34866 DPT=9882 SEQ=4163958840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0A2EC0000000001030307) 
Nov 28 09:30:59 np0005538515.localdomain sudo[217882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzhhqtzxxukdpczctrztrvwrvxpurgck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322259.187367-2013-58202298849713/AnsiballZ_systemd_service.py
Nov 28 09:30:59 np0005538515.localdomain sudo[217882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:30:59 np0005538515.localdomain python3.9[217884]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:30:59 np0005538515.localdomain systemd-sysv-generator[217909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:30:59 np0005538515.localdomain systemd-rc-local-generator[217906]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:30:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:00 np0005538515.localdomain sudo[217882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:00 np0005538515.localdomain python3.9[218027]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:31:00 np0005538515.localdomain network[218044]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:31:00 np0005538515.localdomain network[218045]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:31:00 np0005538515.localdomain network[218046]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:31:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26622 DF PROTO=TCP SPT=34866 DPT=9882 SEQ=4163958840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0AEFA0000000001030307) 
Nov 28 09:31:02 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:31:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31054 DF PROTO=TCP SPT=45572 DPT=9105 SEQ=2228635939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0B97A0000000001030307) 
Nov 28 09:31:05 np0005538515.localdomain sudo[218279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsoajubwbcfghqtjjxogvyrrqjauuxzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322265.162734-2070-108066372500515/AnsiballZ_systemd_service.py
Nov 28 09:31:05 np0005538515.localdomain sudo[218279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:05 np0005538515.localdomain python3.9[218281]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:05 np0005538515.localdomain sudo[218279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:06 np0005538515.localdomain sudo[218390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkvexkryosuskuogbrknvxqrushuqyvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322265.9268775-2070-176775312823397/AnsiballZ_systemd_service.py
Nov 28 09:31:06 np0005538515.localdomain sudo[218390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:06 np0005538515.localdomain python3.9[218392]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:06 np0005538515.localdomain sudo[218390]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:06 np0005538515.localdomain sudo[218501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alqxmlyjxdeudbnhhqlpaoqgfcjwspyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322266.6734824-2070-89088724721223/AnsiballZ_systemd_service.py
Nov 28 09:31:06 np0005538515.localdomain sudo[218501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:07 np0005538515.localdomain python3.9[218503]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:07 np0005538515.localdomain sudo[218501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:07 np0005538515.localdomain sudo[218612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bggywygzcowkwrhwonftcqzgoqenfzmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322267.4475946-2070-32974670698875/AnsiballZ_systemd_service.py
Nov 28 09:31:07 np0005538515.localdomain sudo[218612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:08 np0005538515.localdomain python3.9[218614]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:08 np0005538515.localdomain sudo[218612]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50658 DF PROTO=TCP SPT=57754 DPT=9102 SEQ=150800526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0CAFA0000000001030307) 
Nov 28 09:31:09 np0005538515.localdomain sudo[218723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwopmmacinnwhfkbvodsjaqggjufytyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322268.284586-2070-103896140777158/AnsiballZ_systemd_service.py
Nov 28 09:31:09 np0005538515.localdomain sudo[218723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:09 np0005538515.localdomain python3.9[218725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:10 np0005538515.localdomain sudo[218723]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:11 np0005538515.localdomain sudo[218834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvnqoorpccfxjebvzimgukmqcfosolkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322271.2163022-2070-142319189108821/AnsiballZ_systemd_service.py
Nov 28 09:31:11 np0005538515.localdomain sudo[218834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:11 np0005538515.localdomain python3.9[218836]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:11 np0005538515.localdomain sudo[218834]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:12 np0005538515.localdomain sudo[218945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-einvmqaqgjtpzrkdisadreronqdtibra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322272.0206046-2070-1013411777869/AnsiballZ_systemd_service.py
Nov 28 09:31:12 np0005538515.localdomain sudo[218945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:12 np0005538515.localdomain python3.9[218947]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:12 np0005538515.localdomain sudo[218945]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31055 DF PROTO=TCP SPT=45572 DPT=9105 SEQ=2228635939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0D8FA0000000001030307) 
Nov 28 09:31:13 np0005538515.localdomain sudo[219056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fecflcohrphcmovdjmamicremojhxejf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322272.8098042-2070-103406589348426/AnsiballZ_systemd_service.py
Nov 28 09:31:13 np0005538515.localdomain sudo[219056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:13 np0005538515.localdomain python3.9[219058]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:31:13 np0005538515.localdomain sudo[219056]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:31:13 np0005538515.localdomain podman[219093]: 2025-11-28 09:31:13.98494426 +0000 UTC m=+0.089136427 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 09:31:13 np0005538515.localdomain podman[219093]: 2025-11-28 09:31:13.997682736 +0000 UTC m=+0.101874863 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:31:14 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:31:14 np0005538515.localdomain sudo[219187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsqotdzacboxznxmxorggnfkahzptzbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322273.8717735-2247-104207396628820/AnsiballZ_file.py
Nov 28 09:31:14 np0005538515.localdomain sudo[219187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1766 DF PROTO=TCP SPT=46146 DPT=9101 SEQ=1223127398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0DEC00000000001030307) 
Nov 28 09:31:14 np0005538515.localdomain python3.9[219189]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:14 np0005538515.localdomain sudo[219187]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:14 np0005538515.localdomain sudo[219297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luvykiktfbzdxsajjddvdeedlhysdfrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322274.5237725-2247-280149266805552/AnsiballZ_file.py
Nov 28 09:31:14 np0005538515.localdomain sudo[219297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:14 np0005538515.localdomain python3.9[219299]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:15 np0005538515.localdomain sudo[219297]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:15 np0005538515.localdomain sudo[219407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwufndqwfqyodrrpitdudnedlluowfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322275.26768-2247-54844457311859/AnsiballZ_file.py
Nov 28 09:31:15 np0005538515.localdomain sudo[219407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:15 np0005538515.localdomain python3.9[219409]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:15 np0005538515.localdomain sudo[219407]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:16 np0005538515.localdomain sudo[219517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypggicbjdlfqaoeybqdmxcylngmohgdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322275.8881412-2247-138088225888988/AnsiballZ_file.py
Nov 28 09:31:16 np0005538515.localdomain sudo[219517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:16 np0005538515.localdomain python3.9[219519]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:16 np0005538515.localdomain sudo[219517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54607 DF PROTO=TCP SPT=59018 DPT=9100 SEQ=2143755290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0E7FB0000000001030307) 
Nov 28 09:31:16 np0005538515.localdomain sudo[219627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnqtugftaukbmbbwevvcmltujswxyrrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322276.4625728-2247-5203618835282/AnsiballZ_file.py
Nov 28 09:31:16 np0005538515.localdomain sudo[219627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:16 np0005538515.localdomain python3.9[219629]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:16 np0005538515.localdomain sudo[219627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:17 np0005538515.localdomain sudo[219737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eocwcouemlytnocyjiviekjoxvwgkdew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322277.0540938-2247-113443972131613/AnsiballZ_file.py
Nov 28 09:31:17 np0005538515.localdomain sudo[219737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:17 np0005538515.localdomain python3.9[219739]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:17 np0005538515.localdomain sudo[219737]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:17 np0005538515.localdomain sudo[219847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmsxkqsduujokxiturzpudlqkcsysnyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322277.6671212-2247-275820321509848/AnsiballZ_file.py
Nov 28 09:31:17 np0005538515.localdomain sudo[219847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:18 np0005538515.localdomain python3.9[219849]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:18 np0005538515.localdomain sudo[219847]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:18 np0005538515.localdomain sudo[219957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toachhjeizbbeugdaaualulxipoixwtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322278.334658-2247-40180874641823/AnsiballZ_file.py
Nov 28 09:31:18 np0005538515.localdomain sudo[219957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:18 np0005538515.localdomain python3.9[219959]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:18 np0005538515.localdomain sudo[219957]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:19 np0005538515.localdomain sudo[220067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yragisgiiwcirthymusfpipvbojelxnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322279.0518482-2417-196972568630536/AnsiballZ_file.py
Nov 28 09:31:19 np0005538515.localdomain sudo[220067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:19 np0005538515.localdomain python3.9[220069]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:19 np0005538515.localdomain sudo[220067]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:19 np0005538515.localdomain sudo[220177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iptloqqhzogdrlwtrwjwluyympufptub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322279.6628306-2417-129166132443840/AnsiballZ_file.py
Nov 28 09:31:19 np0005538515.localdomain sudo[220177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:20 np0005538515.localdomain python3.9[220179]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:20 np0005538515.localdomain sudo[220177]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:20 np0005538515.localdomain sudo[220287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgtymojznouvgjuygxoglahbpfiesref ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322280.28295-2417-210803626362765/AnsiballZ_file.py
Nov 28 09:31:20 np0005538515.localdomain sudo[220287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54608 DF PROTO=TCP SPT=59018 DPT=9100 SEQ=2143755290 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD0F7BA0000000001030307) 
Nov 28 09:31:20 np0005538515.localdomain python3.9[220289]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:20 np0005538515.localdomain sudo[220287]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:21 np0005538515.localdomain sudo[220397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrikqgzhlhywolevuskdokafgaygqvrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322281.546832-2417-107609508434909/AnsiballZ_file.py
Nov 28 09:31:21 np0005538515.localdomain sudo[220397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:22 np0005538515.localdomain python3.9[220399]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:22 np0005538515.localdomain sudo[220397]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:22 np0005538515.localdomain sudo[220507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zisbjokteioulmsioerpmeqhjfsipbsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322282.1791668-2417-79303507331023/AnsiballZ_file.py
Nov 28 09:31:22 np0005538515.localdomain sudo[220507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:22 np0005538515.localdomain python3.9[220509]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:22 np0005538515.localdomain sudo[220507]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:23 np0005538515.localdomain sudo[220617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byyrknexgajpbzfiswywzjnpywdtjdwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322282.7897062-2417-143502788960826/AnsiballZ_file.py
Nov 28 09:31:23 np0005538515.localdomain sudo[220617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:23 np0005538515.localdomain python3.9[220619]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:23 np0005538515.localdomain sudo[220617]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:24 np0005538515.localdomain sudo[220727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfppkqqsdecerepecbgvpzdoxyoxhitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322284.0390139-2417-172087247315602/AnsiballZ_file.py
Nov 28 09:31:24 np0005538515.localdomain sudo[220727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:24 np0005538515.localdomain python3.9[220729]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:24 np0005538515.localdomain sudo[220727]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:24 np0005538515.localdomain sudo[220837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmxexhpmirspenibbpnuuguwoeotrphx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322284.684782-2417-122677002278285/AnsiballZ_file.py
Nov 28 09:31:24 np0005538515.localdomain sudo[220837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:25 np0005538515.localdomain python3.9[220839]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:31:25 np0005538515.localdomain sudo[220837]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:25 np0005538515.localdomain sudo[220947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfmgzfulhwdrxvhpxvdpqentoeuibzmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322285.5293152-2592-11736539741978/AnsiballZ_command.py
Nov 28 09:31:25 np0005538515.localdomain sudo[220947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:31:25 np0005538515.localdomain podman[220950]: 2025-11-28 09:31:25.93288713 +0000 UTC m=+0.083973498 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:31:25 np0005538515.localdomain podman[220950]: 2025-11-28 09:31:25.972964475 +0000 UTC m=+0.124050903 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:31:25 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:31:26 np0005538515.localdomain python3.9[220949]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:26 np0005538515.localdomain sudo[220947]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:31:26 np0005538515.localdomain python3.9[221086]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:31:26 np0005538515.localdomain podman[221087]: 2025-11-28 09:31:26.963263375 +0000 UTC m=+0.076141605 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:31:26 np0005538515.localdomain podman[221087]: 2025-11-28 09:31:26.968336462 +0000 UTC m=+0.081214693 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 28 09:31:26 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:31:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48797 DF PROTO=TCP SPT=34422 DPT=9105 SEQ=603387921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD112F30000000001030307) 
Nov 28 09:31:27 np0005538515.localdomain sudo[221214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rezrcyrizcreynmuajtwmaynvjzoqfcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322287.2716544-2645-54212574858802/AnsiballZ_systemd_service.py
Nov 28 09:31:27 np0005538515.localdomain sudo[221214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:27 np0005538515.localdomain python3.9[221216]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:31:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:31:28 np0005538515.localdomain systemd-rc-local-generator[221241]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:31:28 np0005538515.localdomain systemd-sysv-generator[221246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:31:28 np0005538515.localdomain sudo[221214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48798 DF PROTO=TCP SPT=34422 DPT=9105 SEQ=603387921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD116FA0000000001030307) 
Nov 28 09:31:28 np0005538515.localdomain sudo[221359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlcqcejceggmnaafmymvawyhnazzrjbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322288.4583092-2669-29485602653706/AnsiballZ_command.py
Nov 28 09:31:28 np0005538515.localdomain sudo[221359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34434 DF PROTO=TCP SPT=36436 DPT=9882 SEQ=1351299922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1181C0000000001030307) 
Nov 28 09:31:28 np0005538515.localdomain python3.9[221361]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:28 np0005538515.localdomain sudo[221359]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:29 np0005538515.localdomain sudo[221470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tosklvnlucvhxpaugwigjqvocrhdbaex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322289.0630155-2669-2331955596420/AnsiballZ_command.py
Nov 28 09:31:29 np0005538515.localdomain sudo[221470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:29 np0005538515.localdomain python3.9[221472]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:29 np0005538515.localdomain sudo[221470]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:30 np0005538515.localdomain sudo[221581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvlodelflwrkuqseuuplwtpztxeaozwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322289.7518206-2669-124328221044788/AnsiballZ_command.py
Nov 28 09:31:30 np0005538515.localdomain sudo[221581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:30 np0005538515.localdomain python3.9[221583]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:30 np0005538515.localdomain sudo[221581]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:30 np0005538515.localdomain sudo[221692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqtrabnayowfabexhchtlnppajvpbeuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322290.3943102-2669-24445115298148/AnsiballZ_command.py
Nov 28 09:31:30 np0005538515.localdomain sudo[221692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:30 np0005538515.localdomain python3.9[221694]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3068 DF PROTO=TCP SPT=44944 DPT=9105 SEQ=1053215145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD122FA0000000001030307) 
Nov 28 09:31:31 np0005538515.localdomain sudo[221692]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:32 np0005538515.localdomain sudo[221803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtpqlmnhzkvjpnvqfyqfpffhrsxtxxjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322292.2739942-2669-10570563916808/AnsiballZ_command.py
Nov 28 09:31:32 np0005538515.localdomain sudo[221803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:32 np0005538515.localdomain python3.9[221805]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:32 np0005538515.localdomain sudo[221803]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:33 np0005538515.localdomain sudo[221914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtwgqnqbetdqfwhlpltfwujckhidebeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322292.9082513-2669-272272277094178/AnsiballZ_command.py
Nov 28 09:31:33 np0005538515.localdomain sudo[221914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:33 np0005538515.localdomain python3.9[221916]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:33 np0005538515.localdomain sudo[221914]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:33 np0005538515.localdomain sudo[222025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohgytgguxuvmokqnemafapkagygatpij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322293.5521944-2669-116933704328556/AnsiballZ_command.py
Nov 28 09:31:33 np0005538515.localdomain sudo[222025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:34 np0005538515.localdomain python3.9[222027]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:34 np0005538515.localdomain sudo[222025]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48800 DF PROTO=TCP SPT=34422 DPT=9105 SEQ=603387921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD12EBA0000000001030307) 
Nov 28 09:31:35 np0005538515.localdomain sudo[222136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsaeqcynzmjrjfeowcnpjseyzyepzjhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322295.245537-2669-180157944073959/AnsiballZ_command.py
Nov 28 09:31:35 np0005538515.localdomain sudo[222136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:35 np0005538515.localdomain python3.9[222138]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:31:35 np0005538515.localdomain sudo[222136]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:37 np0005538515.localdomain sudo[222247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zteeupydtefyuciqhjrufdxmhhpmtqmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322297.5127177-2877-76908318264181/AnsiballZ_file.py
Nov 28 09:31:37 np0005538515.localdomain sudo[222247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:37 np0005538515.localdomain python3.9[222249]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:37 np0005538515.localdomain sudo[222247]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:38 np0005538515.localdomain sudo[222357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckpcjffzqbzfxjuzelbcjslwpxyuknrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322298.1272235-2877-268962876141124/AnsiballZ_file.py
Nov 28 09:31:38 np0005538515.localdomain sudo[222357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:38 np0005538515.localdomain python3.9[222359]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:38 np0005538515.localdomain sudo[222357]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:39 np0005538515.localdomain sudo[222467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuckzmfdjgnacdfjduhnlueihbcsahmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322298.7611897-2877-153153053329798/AnsiballZ_file.py
Nov 28 09:31:39 np0005538515.localdomain sudo[222467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14237 DF PROTO=TCP SPT=44678 DPT=9102 SEQ=3061132303 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1403A0000000001030307) 
Nov 28 09:31:39 np0005538515.localdomain python3.9[222469]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:39 np0005538515.localdomain sudo[222467]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:39 np0005538515.localdomain sudo[222577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkwiadvmypqsltapdelfwltuaoivbdvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322299.4825222-2943-266767084147659/AnsiballZ_file.py
Nov 28 09:31:39 np0005538515.localdomain sudo[222577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:39 np0005538515.localdomain python3.9[222579]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:39 np0005538515.localdomain sudo[222577]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:40 np0005538515.localdomain sshd[222597]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:31:40 np0005538515.localdomain sudo[222689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfzgrrlnmdgjtczncugjilqzvqmenxez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322300.1918743-2943-31371311255400/AnsiballZ_file.py
Nov 28 09:31:40 np0005538515.localdomain sudo[222689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:40 np0005538515.localdomain sshd[222597]: Invalid user support from 78.128.112.74 port 50206
Nov 28 09:31:40 np0005538515.localdomain python3.9[222691]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:40 np0005538515.localdomain sudo[222689]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:40 np0005538515.localdomain sshd[222597]: Connection closed by invalid user support 78.128.112.74 port 50206 [preauth]
Nov 28 09:31:41 np0005538515.localdomain sudo[222799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjyvnxpbqqrwehdjvieowrukljyusckk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322300.8178978-2943-276534727698083/AnsiballZ_file.py
Nov 28 09:31:41 np0005538515.localdomain sudo[222799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:41 np0005538515.localdomain python3.9[222801]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:41 np0005538515.localdomain sudo[222799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:41 np0005538515.localdomain sudo[222909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpidyfnlcpfrfvbtrfpgytaekaaetzdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322301.3822753-2943-136260790693154/AnsiballZ_file.py
Nov 28 09:31:41 np0005538515.localdomain sudo[222909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:41 np0005538515.localdomain python3.9[222911]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:41 np0005538515.localdomain sudo[222909]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:42 np0005538515.localdomain sudo[223019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gebcgpzzhhzbnnfbndwghletcpuybgqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322301.9848378-2943-41874857641353/AnsiballZ_file.py
Nov 28 09:31:42 np0005538515.localdomain sudo[223019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:42 np0005538515.localdomain python3.9[223021]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:42 np0005538515.localdomain sudo[223019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:42 np0005538515.localdomain sudo[223129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfvdrjjlqczusoiozshsqjynhhsioizj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322302.6109414-2943-49035815293918/AnsiballZ_file.py
Nov 28 09:31:42 np0005538515.localdomain sudo[223129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48801 DF PROTO=TCP SPT=34422 DPT=9105 SEQ=603387921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD14EFA0000000001030307) 
Nov 28 09:31:43 np0005538515.localdomain python3.9[223131]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:43 np0005538515.localdomain sudo[223129]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:43 np0005538515.localdomain sudo[223239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbhslrrifktlokcfqwbsbynuguijgisq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322303.2546325-2943-118737502771778/AnsiballZ_file.py
Nov 28 09:31:43 np0005538515.localdomain sudo[223239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:43 np0005538515.localdomain python3.9[223241]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:43 np0005538515.localdomain sudo[223239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35641 DF PROTO=TCP SPT=42412 DPT=9101 SEQ=267168259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD153F00000000001030307) 
Nov 28 09:31:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:31:44 np0005538515.localdomain podman[223259]: 2025-11-28 09:31:44.993344495 +0000 UTC m=+0.094640199 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 28 09:31:45 np0005538515.localdomain podman[223259]: 2025-11-28 09:31:45.035627718 +0000 UTC m=+0.136923472 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:31:45 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:31:45 np0005538515.localdomain sudo[223278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:31:45 np0005538515.localdomain sudo[223278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:45 np0005538515.localdomain sudo[223278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:45 np0005538515.localdomain sudo[223296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:31:45 np0005538515.localdomain sudo[223296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:46 np0005538515.localdomain sudo[223296]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:46 np0005538515.localdomain sudo[223335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:31:46 np0005538515.localdomain sudo[223335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:46 np0005538515.localdomain sudo[223335]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:46 np0005538515.localdomain sudo[223353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:31:46 np0005538515.localdomain sudo[223353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10326 DF PROTO=TCP SPT=45948 DPT=9100 SEQ=3223047178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD15D3A0000000001030307) 
Nov 28 09:31:47 np0005538515.localdomain sudo[223353]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:47 np0005538515.localdomain sudo[223403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:31:47 np0005538515.localdomain sudo[223403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:31:47 np0005538515.localdomain sudo[223403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10327 DF PROTO=TCP SPT=45948 DPT=9100 SEQ=3223047178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD16CFA0000000001030307) 
Nov 28 09:31:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:31:50.811 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:31:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:31:50.812 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:31:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:31:50.812 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:31:50 np0005538515.localdomain sudo[223511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blgiivwrfwdkhdkfsufcmefvuakbbxlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322310.467872-3267-217990571021591/AnsiballZ_getent.py
Nov 28 09:31:50 np0005538515.localdomain sudo[223511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:51 np0005538515.localdomain python3.9[223513]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 09:31:51 np0005538515.localdomain sudo[223511]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:51 np0005538515.localdomain sudo[223622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xikkwjcoeuvlcfhvxyxnnxjfltwpgnlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322311.3906965-3291-220647829211487/AnsiballZ_group.py
Nov 28 09:31:51 np0005538515.localdomain sudo[223622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:51 np0005538515.localdomain python3.9[223624]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:31:52 np0005538515.localdomain groupadd[223625]: group added to /etc/group: name=nova, GID=42436
Nov 28 09:31:52 np0005538515.localdomain groupadd[223625]: group added to /etc/gshadow: name=nova
Nov 28 09:31:52 np0005538515.localdomain groupadd[223625]: new group: name=nova, GID=42436
Nov 28 09:31:52 np0005538515.localdomain sudo[223622]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:52 np0005538515.localdomain sudo[223738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoudejrlgsnbvifcwcgwqypuzoaeedcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322312.3337548-3315-54762191913947/AnsiballZ_user.py
Nov 28 09:31:52 np0005538515.localdomain sudo[223738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:31:53 np0005538515.localdomain python3.9[223740]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538515.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 09:31:53 np0005538515.localdomain useradd[223742]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 28 09:31:53 np0005538515.localdomain useradd[223742]: add 'nova' to group 'libvirt'
Nov 28 09:31:53 np0005538515.localdomain useradd[223742]: add 'nova' to shadow group 'libvirt'
Nov 28 09:31:53 np0005538515.localdomain sudo[223738]: pam_unix(sudo:session): session closed for user root
Nov 28 09:31:54 np0005538515.localdomain sshd[223766]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:31:54 np0005538515.localdomain sshd[223766]: Accepted publickey for zuul from 192.168.122.30 port 43690 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:31:54 np0005538515.localdomain systemd-logind[763]: New session 54 of user zuul.
Nov 28 09:31:54 np0005538515.localdomain systemd[1]: Started Session 54 of User zuul.
Nov 28 09:31:54 np0005538515.localdomain sshd[223766]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:31:54 np0005538515.localdomain sshd[223769]: Received disconnect from 192.168.122.30 port 43690:11: disconnected by user
Nov 28 09:31:54 np0005538515.localdomain sshd[223769]: Disconnected from user zuul 192.168.122.30 port 43690
Nov 28 09:31:54 np0005538515.localdomain sshd[223766]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:31:54 np0005538515.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Nov 28 09:31:54 np0005538515.localdomain systemd-logind[763]: Session 54 logged out. Waiting for processes to exit.
Nov 28 09:31:54 np0005538515.localdomain systemd-logind[763]: Removed session 54.
Nov 28 09:31:55 np0005538515.localdomain python3.9[223877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:55 np0005538515.localdomain python3.9[223963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322314.5895195-3391-223551265961520/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:56 np0005538515.localdomain python3.9[224071]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:56 np0005538515.localdomain python3.9[224126]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:31:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:31:57 np0005538515.localdomain podman[224144]: 2025-11-28 09:31:57.022662359 +0000 UTC m=+0.113123192 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:31:57 np0005538515.localdomain podman[224144]: 2025-11-28 09:31:57.103322304 +0000 UTC m=+0.193783117 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:31:57 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:31:57 np0005538515.localdomain podman[224199]: 2025-11-28 09:31:57.155128891 +0000 UTC m=+0.123678910 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 09:31:57 np0005538515.localdomain podman[224199]: 2025-11-28 09:31:57.185670379 +0000 UTC m=+0.154220398 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:31:57 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:31:57 np0005538515.localdomain python3.9[224278]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3401 DF PROTO=TCP SPT=33760 DPT=9105 SEQ=3839620824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD188220000000001030307) 
Nov 28 09:31:57 np0005538515.localdomain python3.9[224364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322316.9653697-3391-101791380511549/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3402 DF PROTO=TCP SPT=33760 DPT=9105 SEQ=3839620824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD18C3A0000000001030307) 
Nov 28 09:31:58 np0005538515.localdomain python3.9[224472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:31:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10328 DF PROTO=TCP SPT=45948 DPT=9100 SEQ=3223047178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD18CFA0000000001030307) 
Nov 28 09:31:59 np0005538515.localdomain python3.9[224558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322318.1478288-3391-31584114663146/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=ea203e550d6f82354ff814f038f2bcabd98eed86 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:31:59 np0005538515.localdomain python3.9[224666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:01 np0005538515.localdomain python3.9[224752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322319.262768-3391-142197283468942/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:01 np0005538515.localdomain python3.9[224860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64744 DF PROTO=TCP SPT=51316 DPT=9882 SEQ=2403100779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1993A0000000001030307) 
Nov 28 09:32:03 np0005538515.localdomain python3.9[224946]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322321.4034064-3391-133399573480838/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:03 np0005538515.localdomain sudo[225054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tedkjtanbvnrrpwdvngjyeikdcxjkere ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322323.4950037-3641-158549950183479/AnsiballZ_file.py
Nov 28 09:32:03 np0005538515.localdomain sudo[225054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:04 np0005538515.localdomain python3.9[225056]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:04 np0005538515.localdomain sudo[225054]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3404 DF PROTO=TCP SPT=33760 DPT=9105 SEQ=3839620824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1A3FB0000000001030307) 
Nov 28 09:32:04 np0005538515.localdomain sudo[225164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imfjvikvmzgnwcmietqcymkzszdsdajp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322324.5729578-3664-132358974331447/AnsiballZ_copy.py
Nov 28 09:32:04 np0005538515.localdomain sudo[225164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:05 np0005538515.localdomain python3.9[225166]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:05 np0005538515.localdomain sudo[225164]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:05 np0005538515.localdomain sudo[225274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eunbxkrlgpunucqxqzspgvnzkhypqhkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322325.302804-3689-230293829043292/AnsiballZ_stat.py
Nov 28 09:32:05 np0005538515.localdomain sudo[225274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:05 np0005538515.localdomain python3.9[225276]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:05 np0005538515.localdomain sudo[225274]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:06 np0005538515.localdomain sudo[225386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvwwncbzikstgtnwvuqwqqssbxlkylus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322326.071341-3715-3767188749390/AnsiballZ_file.py
Nov 28 09:32:06 np0005538515.localdomain sudo[225386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:06 np0005538515.localdomain python3.9[225388]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:06 np0005538515.localdomain sudo[225386]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:07 np0005538515.localdomain python3.9[225496]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:08 np0005538515.localdomain python3.9[225606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:08 np0005538515.localdomain python3.9[225692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322327.572156-3767-247361820802228/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33309 DF PROTO=TCP SPT=55346 DPT=9102 SEQ=777400748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1B57B0000000001030307) 
Nov 28 09:32:09 np0005538515.localdomain python3.9[225800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:32:09 np0005538515.localdomain python3.9[225886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322328.8346283-3810-206243369475966/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:32:10 np0005538515.localdomain sudo[225994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awfcqkcprpksqzjdqnrjexqgiizyjksq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322330.2431552-3861-227727046302075/AnsiballZ_container_config_data.py
Nov 28 09:32:10 np0005538515.localdomain sudo[225994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:10 np0005538515.localdomain python3.9[225996]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 09:32:10 np0005538515.localdomain sudo[225994]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:11 np0005538515.localdomain sudo[226104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcvccqfhrwflcbpegtygwbnxakgqntwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322331.084522-3889-165366555963918/AnsiballZ_container_config_hash.py
Nov 28 09:32:11 np0005538515.localdomain sudo[226104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:11 np0005538515.localdomain python3.9[226106]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:32:11 np0005538515.localdomain sudo[226104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:12 np0005538515.localdomain sudo[226214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fydntzogamkydiwmibqpexnbalumtdue ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322331.9208336-3919-211649676164602/AnsiballZ_edpm_container_manage.py
Nov 28 09:32:12 np0005538515.localdomain sudo[226214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:12 np0005538515.localdomain python3[226216]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:32:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3405 DF PROTO=TCP SPT=33760 DPT=9105 SEQ=3839620824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1C4FA0000000001030307) 
Nov 28 09:32:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64746 DF PROTO=TCP SPT=51316 DPT=9882 SEQ=2403100779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1C8FB0000000001030307) 
Nov 28 09:32:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:32:15 np0005538515.localdomain podman[226242]: 2025-11-28 09:32:15.951241298 +0000 UTC m=+0.061676693 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 09:32:15 np0005538515.localdomain podman[226242]: 2025-11-28 09:32:15.960268799 +0000 UTC m=+0.070704204 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:32:15 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:32:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7859 DF PROTO=TCP SPT=59650 DPT=9100 SEQ=275072454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1D23A0000000001030307) 
Nov 28 09:32:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7860 DF PROTO=TCP SPT=59650 DPT=9100 SEQ=275072454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1E1FB0000000001030307) 
Nov 28 09:32:23 np0005538515.localdomain podman[226229]: 2025-11-28 09:32:12.555855097 +0000 UTC m=+0.047273458 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:23 np0005538515.localdomain podman[226309]: 
Nov 28 09:32:23 np0005538515.localdomain podman[226309]: 2025-11-28 09:32:23.299556215 +0000 UTC m=+0.142311863 container create acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, io.buildah.version=1.41.3)
Nov 28 09:32:23 np0005538515.localdomain podman[226309]: 2025-11-28 09:32:23.206246266 +0000 UTC m=+0.049001944 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:23 np0005538515.localdomain python3[226216]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 28 09:32:23 np0005538515.localdomain sudo[226214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:23 np0005538515.localdomain sudo[226455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omefnfimgkusjyoccfvxjorzrhqonrai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322343.7138937-3943-122735285494282/AnsiballZ_stat.py
Nov 28 09:32:23 np0005538515.localdomain sudo[226455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:24 np0005538515.localdomain python3.9[226457]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:24 np0005538515.localdomain sudo[226455]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:25 np0005538515.localdomain sudo[226567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vspgeldbzwnohdymvatuveobxstshrmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322344.8982017-3979-582641972488/AnsiballZ_container_config_data.py
Nov 28 09:32:25 np0005538515.localdomain sudo[226567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:25 np0005538515.localdomain python3.9[226569]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 09:32:25 np0005538515.localdomain sudo[226567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:25 np0005538515.localdomain sudo[226677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adfozrxehulyhgmqrmhgwginnmomjkzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322345.74427-4005-34978185449416/AnsiballZ_container_config_hash.py
Nov 28 09:32:25 np0005538515.localdomain sudo[226677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:26 np0005538515.localdomain python3.9[226679]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:32:26 np0005538515.localdomain sudo[226677]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:26 np0005538515.localdomain sudo[226787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlbodbnijcfxwvljqyslxhvotgqfzfpo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322346.6219742-4035-55708424591638/AnsiballZ_edpm_container_manage.py
Nov 28 09:32:26 np0005538515.localdomain sudo[226787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:27 np0005538515.localdomain python3[226789]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:32:27 np0005538515.localdomain python3[226789]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",
                                                                    "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:36:07.10279245Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211782527,
                                                                    "VirtualSize": 1211782527,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",
                                                                              "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.237322707Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.688296939Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:21.069367201Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:46.989417927Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:54.535170465Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:24.828469773Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.089054875Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.610811813Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.099939071Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.100032994Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:14.509959241Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:32:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:32:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25472 DF PROTO=TCP SPT=59640 DPT=9105 SEQ=2968183367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD1FD530000000001030307) 
Nov 28 09:32:27 np0005538515.localdomain podman[226838]: 2025-11-28 09:32:27.602124147 +0000 UTC m=+0.125591405 container remove ae9e8db2a854119ed082566caf78eaf6c9d895e78df0b69420b5a74f73b528b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18a2751501986164e709168f53ab57c8-bbb5ea37891e3118676a78b59837de90'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute)
Nov 28 09:32:27 np0005538515.localdomain python3[226789]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Nov 28 09:32:27 np0005538515.localdomain systemd[1]: tmp-crun.M7P2if.mount: Deactivated successfully.
Nov 28 09:32:27 np0005538515.localdomain podman[226851]: 2025-11-28 09:32:27.709437761 +0000 UTC m=+0.144695266 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:32:27 np0005538515.localdomain podman[226850]: 2025-11-28 09:32:27.687979747 +0000 UTC m=+0.125290007 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:32:27 np0005538515.localdomain podman[226851]: 2025-11-28 09:32:27.743516825 +0000 UTC m=+0.178774280 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:32:27 np0005538515.localdomain podman[226875]: 
Nov 28 09:32:27 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:32:27 np0005538515.localdomain podman[226850]: 2025-11-28 09:32:27.772607464 +0000 UTC m=+0.209917724 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:32:27 np0005538515.localdomain podman[226875]: 2025-11-28 09:32:27.775489671 +0000 UTC m=+0.147451411 container create 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:32:27 np0005538515.localdomain podman[226875]: 2025-11-28 09:32:27.731392149 +0000 UTC m=+0.103353939 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:32:27 np0005538515.localdomain python3[226789]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 28 09:32:27 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:32:27 np0005538515.localdomain sudo[226787]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:28 np0005538515.localdomain sudo[227038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwukewzcxisqavzspalhcohchakmghcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322348.2317088-4061-53328975450905/AnsiballZ_stat.py
Nov 28 09:32:28 np0005538515.localdomain sudo[227038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25473 DF PROTO=TCP SPT=59640 DPT=9105 SEQ=2968183367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2017A0000000001030307) 
Nov 28 09:32:28 np0005538515.localdomain python3.9[227040]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:28 np0005538515.localdomain sudo[227038]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42259 DF PROTO=TCP SPT=58618 DPT=9882 SEQ=1004386776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2027C0000000001030307) 
Nov 28 09:32:29 np0005538515.localdomain sudo[227150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugveoapjjdosjovwyjopotpfpecmgqhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.0744307-4086-209553762341893/AnsiballZ_file.py
Nov 28 09:32:29 np0005538515.localdomain sudo[227150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:29 np0005538515.localdomain python3.9[227152]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:29 np0005538515.localdomain sudo[227150]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:30 np0005538515.localdomain sudo[227261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkvgzjosychfxzkbyabcgrugokwtaage ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.7524283-4086-230659389946330/AnsiballZ_copy.py
Nov 28 09:32:30 np0005538515.localdomain sudo[227261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:30 np0005538515.localdomain python3.9[227263]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322349.7524283-4086-230659389946330/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:32:30 np0005538515.localdomain sudo[227261]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:30 np0005538515.localdomain sudo[227316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfcmpbgflftetrxupostzqvxhgtxpvar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.7524283-4086-230659389946330/AnsiballZ_systemd.py
Nov 28 09:32:30 np0005538515.localdomain sudo[227316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:30 np0005538515.localdomain python3.9[227318]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:32:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:32:31 np0005538515.localdomain systemd-sysv-generator[227352]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:32:31 np0005538515.localdomain systemd-rc-local-generator[227349]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:31 np0005538515.localdomain sudo[227316]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:31 np0005538515.localdomain sudo[227406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgfjwehqvdiaopomwgpvsrxgrnsnmtkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322349.7524283-4086-230659389946330/AnsiballZ_systemd.py
Nov 28 09:32:31 np0005538515.localdomain sudo[227406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48803 DF PROTO=TCP SPT=34422 DPT=9105 SEQ=603387921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD20CFA0000000001030307) 
Nov 28 09:32:31 np0005538515.localdomain python3.9[227408]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:32:31 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:32:32 np0005538515.localdomain systemd-rc-local-generator[227439]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:32:32 np0005538515.localdomain systemd-sysv-generator[227442]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: Starting nova_compute container...
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:32:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:32 np0005538515.localdomain podman[227450]: 2025-11-28 09:32:32.419234432 +0000 UTC m=+0.142982231 container init 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:32:32 np0005538515.localdomain podman[227450]: 2025-11-28 09:32:32.429827155 +0000 UTC m=+0.153574954 container start 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:32:32 np0005538515.localdomain podman[227450]: nova_compute
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + sudo -E kolla_set_configs
Nov 28 09:32:32 np0005538515.localdomain systemd[1]: Started nova_compute container.
Nov 28 09:32:32 np0005538515.localdomain sudo[227406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Validating config file
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying service configuration files
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Deleting /etc/ceph
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Creating directory /etc/ceph
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Writing out command to execute
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: ++ cat /run_command
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + CMD=nova-compute
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + ARGS=
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + sudo kolla_copy_cacerts
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + [[ ! -n '' ]]
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + . kolla_extend_start
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: Running command: 'nova-compute'
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + umask 0022
Nov 28 09:32:32 np0005538515.localdomain nova_compute[227465]: + exec nova-compute
Nov 28 09:32:33 np0005538515.localdomain python3.9[227585]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.205 227469 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.206 227469 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.206 227469 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.206 227469 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.322 227469 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.344 227469 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.344 227469 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 09:32:34 np0005538515.localdomain python3.9[227695]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.711 227469 INFO nova.virt.driver [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 09:32:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25475 DF PROTO=TCP SPT=59640 DPT=9105 SEQ=2968183367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2193A0000000001030307) 
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.836 227469 INFO nova.compute.provider_config [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.858 227469 WARNING nova.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.858 227469 DEBUG oslo_concurrency.lockutils [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.859 227469 DEBUG oslo_concurrency.lockutils [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.859 227469 DEBUG oslo_concurrency.lockutils [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.859 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.859 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.860 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.860 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.860 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.860 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.860 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.860 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.861 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] console_host                   = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.862 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.863 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.864 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.865 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.865 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.865 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.865 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.865 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.865 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.866 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.867 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.868 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.869 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.870 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.871 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.872 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.873 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.874 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.875 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.876 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.877 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.878 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.879 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.879 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.879 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.879 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.879 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.879 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.880 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.881 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.882 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.883 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.884 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.885 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.886 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.887 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.888 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.889 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.890 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.891 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.892 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.893 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.894 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.895 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.896 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.897 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.898 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.899 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.900 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.901 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.902 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.903 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.904 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.905 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.906 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.907 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.907 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.907 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.907 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.907 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.908 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.908 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.908 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.908 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.908 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.908 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.909 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.910 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.911 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.911 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.911 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.911 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.911 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.911 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.912 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.913 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.914 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.915 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.915 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.915 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.915 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.915 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.916 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.917 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.918 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.919 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.920 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.921 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.922 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.922 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.922 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.922 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.922 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.922 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.923 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.924 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.924 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.924 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.924 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.924 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.924 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.925 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.926 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.926 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.926 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.926 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.926 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.926 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.927 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.927 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.927 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.927 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.927 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.927 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 WARNING oslo_config.cfg [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: and ``live_migration_inbound_addr`` respectively.
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: ).  Its value may be silently ignored in the future.
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.928 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.929 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.930 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.930 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.930 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.930 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.930 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.930 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rbd_secret_uuid        = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.931 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.932 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.932 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.932 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.932 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.932 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.932 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.933 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.933 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.933 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.933 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.933 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.933 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.934 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.934 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.934 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.934 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.934 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.934 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.935 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.936 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.937 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.938 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.939 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.940 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.940 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.940 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.940 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.940 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.940 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.941 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.942 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.943 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.944 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.945 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.946 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.947 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.948 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.948 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.948 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.948 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.948 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.948 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.949 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.950 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.951 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.952 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.953 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.954 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.954 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.954 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.954 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.954 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.955 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.956 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.956 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.956 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.956 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.956 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.956 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.957 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.958 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.959 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.960 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.961 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.962 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.963 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.963 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.963 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.963 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.963 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.963 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.964 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.965 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.966 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.967 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.968 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.968 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.968 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.968 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.968 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.968 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.969 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.970 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.971 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.972 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.973 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.974 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.975 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.976 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.976 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.976 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.976 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.976 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.976 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.977 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.978 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.979 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.980 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.981 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.982 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.983 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.984 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.984 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.984 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.984 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.984 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.984 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.985 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.986 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.987 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.988 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.988 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.988 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.988 227469 DEBUG oslo_service.service [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:32:34 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:34.989 227469 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.007 227469 INFO nova.virt.node [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Determined node identity 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from /var/lib/nova/compute_id
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.007 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.008 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.008 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.008 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 09:32:35 np0005538515.localdomain systemd[1]: Started libvirt QEMU daemon.
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.069 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7faf38eb7310> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.071 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7faf38eb7310> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.072 227469 INFO nova.virt.libvirt.driver [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Connection event '1' reason 'None'
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.084 227469 DEBUG nova.virt.libvirt.volume.mount [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.986 227469 INFO nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:   <host>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <uuid>4c358f0e-7e15-44e5-bde2-714780d05a92</uuid>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <cpu>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <arch>x86_64</arch>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <model>EPYC-Rome-v4</model>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <vendor>AMD</vendor>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <microcode version='16777317'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <signature family='23' model='49' stepping='0'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='x2apic'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='tsc-deadline'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='osxsave'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='hypervisor'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='tsc_adjust'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='spec-ctrl'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='stibp'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='arch-capabilities'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='ssbd'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='cmp_legacy'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='topoext'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='virt-ssbd'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='lbrv'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='tsc-scale'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='vmcb-clean'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='pause-filter'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='pfthreshold'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='svme-addr-chk'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='rdctl-no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='mds-no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <feature name='pschange-mc-no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <pages unit='KiB' size='4'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <pages unit='KiB' size='2048'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <pages unit='KiB' size='1048576'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </cpu>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <power_management>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <suspend_mem/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <suspend_disk/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <suspend_hybrid/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </power_management>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <iommu support='no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <migration_features>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <live/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <uri_transports>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:         <uri_transport>tcp</uri_transport>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:         <uri_transport>rdma</uri_transport>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       </uri_transports>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </migration_features>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <topology>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <cells num='1'>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:         <cell id='0'>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           <memory unit='KiB'>16116612</memory>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           <distances>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <sibling id='0' value='10'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           </distances>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           <cpus num='8'>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:           </cpus>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:         </cell>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       </cells>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </topology>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <cache>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </cache>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <secmodel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <model>selinux</model>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <doi>0</doi>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </secmodel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <secmodel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <model>dac</model>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <doi>0</doi>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </secmodel>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:   </host>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:   <guest>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <os_type>hvm</os_type>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <arch name='i686'>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <wordsize>32</wordsize>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <domain type='qemu'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <domain type='kvm'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </arch>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <features>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <pae/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <nonpae/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <apic default='on' toggle='no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <cpuselection/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <deviceboot/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <externalSnapshot/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </features>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:   </guest>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:   <guest>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <os_type>hvm</os_type>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <arch name='x86_64'>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <wordsize>64</wordsize>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <domain type='qemu'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <domain type='kvm'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </arch>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     <features>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <apic default='on' toggle='no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <cpuselection/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <deviceboot/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:       <externalSnapshot/>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:     </features>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]:   </guest>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: </capabilities>
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 
Nov 28 09:32:35 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:35.996 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.020 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: <domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <domain>kvm</domain>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <arch>i686</arch>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <vcpu max='1024'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <iothreads supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <os supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='firmware'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <loader supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>rom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pflash</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='readonly'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>yes</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='secure'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </loader>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </os>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='maximumMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <vendor>AMD</vendor>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='succor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='custom' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-128'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-256'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-512'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <memoryBacking supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='sourceType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>anonymous</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>memfd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </memoryBacking>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <disk supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='diskDevice'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>disk</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cdrom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>floppy</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>lun</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>fdc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>sata</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </disk>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <graphics supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vnc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egl-headless</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </graphics>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <video supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='modelType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vga</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cirrus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>none</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>bochs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ramfb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </video>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hostdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='mode'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>subsystem</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='startupPolicy'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>mandatory</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>requisite</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>optional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='subsysType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pci</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='capsType'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='pciBackend'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hostdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <rng supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>random</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </rng>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <filesystem supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='driverType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>path</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>handle</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtiofs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </filesystem>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <tpm supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-tis</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-crb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emulator</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>external</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendVersion'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>2.0</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </tpm>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <redirdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </redirdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <channel supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </channel>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <crypto supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </crypto>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <interface supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>passt</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </interface>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <panic supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>isa</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>hyperv</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </panic>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <console supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>null</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dev</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pipe</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stdio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>udp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tcp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu-vdagent</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </console>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <gic supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <genid supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backup supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <async-teardown supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <ps2 supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sev supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sgx supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hyperv supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='features'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>relaxed</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vapic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>spinlocks</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vpindex</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>runtime</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>synic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stimer</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reset</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vendor_id</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>frequencies</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reenlightenment</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tlbflush</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ipi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>avic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emsr_bitmap</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>xmm_input</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hyperv>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <launchSecurity supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='sectype'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tdx</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </launchSecurity>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: </domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.028 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: <domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <domain>kvm</domain>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <arch>i686</arch>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <vcpu max='240'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <iothreads supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <os supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='firmware'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <loader supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>rom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pflash</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='readonly'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>yes</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='secure'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </loader>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </os>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='maximumMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <vendor>AMD</vendor>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='succor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='custom' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-128'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-256'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-512'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <memoryBacking supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='sourceType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>anonymous</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>memfd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </memoryBacking>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <disk supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='diskDevice'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>disk</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cdrom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>floppy</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>lun</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ide</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>fdc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>sata</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </disk>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <graphics supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vnc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egl-headless</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </graphics>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <video supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='modelType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vga</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cirrus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>none</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>bochs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ramfb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </video>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hostdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='mode'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>subsystem</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='startupPolicy'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>mandatory</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>requisite</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>optional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='subsysType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pci</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='capsType'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='pciBackend'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hostdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <rng supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>random</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </rng>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <filesystem supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='driverType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>path</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>handle</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtiofs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </filesystem>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <tpm supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-tis</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-crb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emulator</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>external</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendVersion'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>2.0</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </tpm>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <redirdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </redirdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <channel supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </channel>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <crypto supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </crypto>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <interface supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>passt</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </interface>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <panic supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>isa</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>hyperv</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </panic>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <console supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>null</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dev</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pipe</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stdio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>udp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tcp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu-vdagent</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </console>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <gic supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <genid supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backup supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <async-teardown supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <ps2 supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sev supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sgx supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hyperv supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='features'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>relaxed</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vapic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>spinlocks</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vpindex</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>runtime</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>synic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stimer</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reset</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vendor_id</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>frequencies</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reenlightenment</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tlbflush</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ipi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>avic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emsr_bitmap</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>xmm_input</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hyperv>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <launchSecurity supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='sectype'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tdx</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </launchSecurity>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: </domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.074 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.080 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: <domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <domain>kvm</domain>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <arch>x86_64</arch>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <vcpu max='1024'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <iothreads supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <os supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='firmware'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>efi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <loader supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>rom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pflash</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='readonly'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>yes</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='secure'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>yes</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </loader>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </os>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='maximumMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <vendor>AMD</vendor>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='succor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='custom' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-128'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-256'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-512'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <memoryBacking supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='sourceType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>anonymous</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>memfd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </memoryBacking>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <disk supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='diskDevice'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>disk</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cdrom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>floppy</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>lun</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>fdc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>sata</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </disk>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <graphics supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vnc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egl-headless</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </graphics>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <video supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='modelType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vga</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cirrus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>none</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>bochs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ramfb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </video>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hostdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='mode'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>subsystem</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='startupPolicy'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>mandatory</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>requisite</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>optional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='subsysType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pci</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='capsType'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='pciBackend'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hostdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <rng supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>random</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </rng>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <filesystem supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='driverType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>path</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>handle</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtiofs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </filesystem>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <tpm supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-tis</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-crb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emulator</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>external</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendVersion'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>2.0</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </tpm>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <redirdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </redirdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <channel supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </channel>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <crypto supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </crypto>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <interface supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>passt</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </interface>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <panic supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>isa</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>hyperv</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </panic>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <console supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>null</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dev</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pipe</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stdio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>udp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tcp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu-vdagent</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </console>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <gic supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <genid supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backup supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <async-teardown supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <ps2 supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sev supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sgx supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hyperv supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='features'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>relaxed</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vapic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>spinlocks</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vpindex</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>runtime</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>synic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stimer</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reset</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vendor_id</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>frequencies</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reenlightenment</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tlbflush</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ipi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>avic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emsr_bitmap</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>xmm_input</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hyperv>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <launchSecurity supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='sectype'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tdx</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </launchSecurity>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: </domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.127 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: <domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <domain>kvm</domain>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <arch>x86_64</arch>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <vcpu max='240'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <iothreads supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <os supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='firmware'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <loader supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>rom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pflash</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='readonly'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>yes</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='secure'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>no</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </loader>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </os>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='maximumMigratable'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>on</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>off</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <vendor>AMD</vendor>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='succor'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <mode name='custom' supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Denverton-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='auto-ibrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amd-psfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='stibp-always-on'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='EPYC-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-128'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-256'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx10-512'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='prefetchiti'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Haswell-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512er'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512pf'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fma4'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tbm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xop'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='amx-tile'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-bf16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-fp16'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bitalg'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrc'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fzrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='la57'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='taa-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xfd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ifma'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cmpccxadd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fbsdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='fsrs'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ibrs-all'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mcdt-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pbrsb-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='psdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='serialize'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vaes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='hle'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='rtm'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512bw'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512cd'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512dq'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512f'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='avx512vl'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='invpcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pcid'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='pku'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='mpx'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='core-capability'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='split-lock-detect'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='cldemote'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='erms'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='gfni'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdir64b'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='movdiri'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='xsaves'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='athlon-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='core2duo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='coreduo-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='n270-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='ss'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <blockers model='phenom-v1'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnow'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <feature name='3dnowext'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </blockers>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </mode>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </cpu>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <memoryBacking supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <enum name='sourceType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>anonymous</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <value>memfd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </memoryBacking>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <disk supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='diskDevice'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>disk</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cdrom</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>floppy</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>lun</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ide</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>fdc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>sata</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </disk>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <graphics supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vnc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egl-headless</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </graphics>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <video supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='modelType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vga</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>cirrus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>none</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>bochs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ramfb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </video>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hostdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='mode'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>subsystem</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='startupPolicy'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>mandatory</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>requisite</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>optional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='subsysType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pci</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>scsi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='capsType'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='pciBackend'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hostdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <rng supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtio-non-transitional</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>random</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>egd</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </rng>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <filesystem supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='driverType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>path</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>handle</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>virtiofs</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </filesystem>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <tpm supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-tis</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tpm-crb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emulator</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>external</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendVersion'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>2.0</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </tpm>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <redirdev supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='bus'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>usb</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </redirdev>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <channel supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </channel>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <crypto supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendModel'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>builtin</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </crypto>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <interface supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='backendType'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>default</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>passt</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </interface>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <panic supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='model'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>isa</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>hyperv</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </panic>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <console supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='type'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>null</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vc</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pty</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dev</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>file</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>pipe</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stdio</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>udp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tcp</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>unix</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>qemu-vdagent</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>dbus</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </console>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </devices>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   <features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <gic supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <genid supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <backup supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <async-teardown supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <ps2 supported='yes'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sev supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <sgx supported='no'/>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <hyperv supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='features'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>relaxed</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vapic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>spinlocks</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vpindex</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>runtime</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>synic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>stimer</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reset</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>vendor_id</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>frequencies</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>reenlightenment</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tlbflush</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>ipi</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>avic</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>emsr_bitmap</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>xmm_input</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </defaults>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </hyperv>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     <launchSecurity supported='yes'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       <enum name='sectype'>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:         <value>tdx</value>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:       </enum>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:     </launchSecurity>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:   </features>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: </domainCapabilities>
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.167 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.167 227469 INFO nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Secure Boot support detected
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.170 227469 INFO nova.virt.libvirt.driver [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.181 227469 DEBUG nova.virt.libvirt.driver [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.196 227469 INFO nova.virt.node [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Determined node identity 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from /var/lib/nova/compute_id
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.211 227469 DEBUG nova.compute.manager [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Verified node 72fba1ca-0d86-48af-8a3d-510284dfd0e0 matches my host np0005538515.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.229 227469 INFO nova.compute.manager [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.666 227469 INFO nova.service [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Updating service version for nova-compute on np0005538515.localdomain from 57 to 66
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.697 227469 DEBUG oslo_concurrency.lockutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.697 227469 DEBUG oslo_concurrency.lockutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.697 227469 DEBUG oslo_concurrency.lockutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.697 227469 DEBUG nova.compute.resource_tracker [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:32:36 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:36.698 227469 DEBUG oslo_concurrency.processutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.135 227469 DEBUG oslo_concurrency.processutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:37 np0005538515.localdomain systemd[1]: Started libvirt nodedev daemon.
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.454 227469 WARNING nova.virt.libvirt.driver [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.457 227469 DEBUG nova.compute.resource_tracker [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13629MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.457 227469 DEBUG oslo_concurrency.lockutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.458 227469 DEBUG oslo_concurrency.lockutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.576 227469 DEBUG nova.compute.resource_tracker [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.577 227469 DEBUG nova.compute.resource_tracker [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.638 227469 DEBUG nova.scheduler.client.report [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.669 227469 DEBUG nova.scheduler.client.report [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.670 227469 DEBUG nova.compute.provider_tree [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.684 227469 DEBUG nova.scheduler.client.report [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.704 227469 DEBUG nova.scheduler.client.report [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: HW_CPU_X86_SSE4A,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_FMA3,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AMD_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SHA,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_F16C _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:32:37 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:37.726 227469 DEBUG oslo_concurrency.processutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.198 227469 DEBUG oslo_concurrency.processutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.204 227469 DEBUG nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.205 227469 INFO nova.virt.libvirt.host [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] kernel doesn't support AMD SEV
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.207 227469 DEBUG nova.compute.provider_tree [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.207 227469 DEBUG nova.virt.libvirt.driver [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.233 227469 DEBUG nova.scheduler.client.report [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.303 227469 DEBUG nova.compute.provider_tree [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Updating resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 generation from 2 to 3 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.336 227469 DEBUG nova.compute.resource_tracker [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.337 227469 DEBUG oslo_concurrency.lockutils [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.338 227469 DEBUG nova.service [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.387 227469 DEBUG nova.service [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 09:32:38 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:38.388 227469 DEBUG nova.servicegroup.drivers.db [None req-b811f39d-3152-4ae0-8dd6-76baa64b5d28 - - - - - -] DB_Driver: join new ServiceGroup member np0005538515.localdomain to the compute group, service = <Service: host=np0005538515.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 09:32:38 np0005538515.localdomain python3.9[228190]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:32:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29098 DF PROTO=TCP SPT=54226 DPT=9102 SEQ=3263504921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD22A7A0000000001030307) 
Nov 28 09:32:39 np0005538515.localdomain sudo[228300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzazjavwdkridhtmxllydoxylseyokxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322359.4358513-4267-32014041138664/AnsiballZ_podman_container.py
Nov 28 09:32:39 np0005538515.localdomain sudo[228300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:40 np0005538515.localdomain python3.9[228302]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:32:40 np0005538515.localdomain sudo[228300]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:40 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 121.3 (404 of 333 items), suggesting rotation.
Nov 28 09:32:40 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:32:40 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:32:40 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:32:41 np0005538515.localdomain sudo[228434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igficqsirvnmwbdqkbfxojmikkyhmnke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322360.9502766-4292-144344444280983/AnsiballZ_systemd.py
Nov 28 09:32:41 np0005538515.localdomain sudo[228434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:41 np0005538515.localdomain python3.9[228436]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:32:41 np0005538515.localdomain systemd[1]: Stopping nova_compute container...
Nov 28 09:32:42 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:42.510 227469 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 28 09:32:42 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:42.512 227469 DEBUG oslo_concurrency.lockutils [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:32:42 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:42.512 227469 DEBUG oslo_concurrency.lockutils [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:32:42 np0005538515.localdomain nova_compute[227465]: 2025-11-28 09:32:42.513 227469 DEBUG oslo_concurrency.lockutils [None req-10c42a77-220c-4c24-9ffc-bdb7a8461c78 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:32:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25476 DF PROTO=TCP SPT=59640 DPT=9105 SEQ=2968183367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD238FA0000000001030307) 
Nov 28 09:32:42 np0005538515.localdomain virtqemud[227736]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 09:32:42 np0005538515.localdomain virtqemud[227736]: hostname: np0005538515.localdomain
Nov 28 09:32:42 np0005538515.localdomain virtqemud[227736]: End of file while reading data: Input/output error
Nov 28 09:32:42 np0005538515.localdomain systemd[1]: libpod-1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e.scope: Deactivated successfully.
Nov 28 09:32:42 np0005538515.localdomain systemd[1]: libpod-1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e.scope: Consumed 3.832s CPU time.
Nov 28 09:32:42 np0005538515.localdomain podman[228440]: 2025-11-28 09:32:42.884742718 +0000 UTC m=+1.286433831 container died 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, container_name=nova_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 28 09:32:42 np0005538515.localdomain systemd[1]: tmp-crun.qDM6aX.mount: Deactivated successfully.
Nov 28 09:32:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e-userdata-shm.mount: Deactivated successfully.
Nov 28 09:32:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942-merged.mount: Deactivated successfully.
Nov 28 09:32:42 np0005538515.localdomain podman[228440]: 2025-11-28 09:32:42.94714612 +0000 UTC m=+1.348837263 container cleanup 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:32:42 np0005538515.localdomain podman[228440]: nova_compute
Nov 28 09:32:43 np0005538515.localdomain podman[228480]: error opening file `/run/crun/1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e/status`: No such file or directory
Nov 28 09:32:43 np0005538515.localdomain podman[228468]: 2025-11-28 09:32:43.044649401 +0000 UTC m=+0.064516609 container cleanup 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:32:43 np0005538515.localdomain podman[228468]: nova_compute
Nov 28 09:32:43 np0005538515.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 09:32:43 np0005538515.localdomain systemd[1]: Stopped nova_compute container.
Nov 28 09:32:43 np0005538515.localdomain systemd[1]: Starting nova_compute container...
Nov 28 09:32:43 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:32:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:43 np0005538515.localdomain podman[228482]: 2025-11-28 09:32:43.193391926 +0000 UTC m=+0.109785372 container init 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 09:32:43 np0005538515.localdomain podman[228482]: 2025-11-28 09:32:43.203369413 +0000 UTC m=+0.119762869 container start 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:32:43 np0005538515.localdomain podman[228482]: nova_compute
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + sudo -E kolla_set_configs
Nov 28 09:32:43 np0005538515.localdomain systemd[1]: Started nova_compute container.
Nov 28 09:32:43 np0005538515.localdomain sudo[228434]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Validating config file
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying service configuration files
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /etc/ceph
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Creating directory /etc/ceph
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Writing out command to execute
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: ++ cat /run_command
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + CMD=nova-compute
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + ARGS=
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + sudo kolla_copy_cacerts
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + [[ ! -n '' ]]
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + . kolla_extend_start
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: Running command: 'nova-compute'
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + umask 0022
Nov 28 09:32:43 np0005538515.localdomain nova_compute[228497]: + exec nova-compute
Nov 28 09:32:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59858 DF PROTO=TCP SPT=36440 DPT=9101 SEQ=279068341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD23E500000000001030307) 
Nov 28 09:32:44 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:44.929 228501 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:44 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:44.930 228501 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:44 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:44.930 228501 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:32:44 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:44.930 228501 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.044 228501 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.065 228501 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.065 228501 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.444 228501 INFO nova.virt.driver [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.550 228501 INFO nova.compute.provider_config [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.569 228501 WARNING nova.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.569 228501 DEBUG oslo_concurrency.lockutils [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.569 228501 DEBUG oslo_concurrency.lockutils [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.570 228501 DEBUG oslo_concurrency.lockutils [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.570 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.570 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.571 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.571 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.571 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.571 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.571 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.572 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.572 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.572 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.572 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.572 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.572 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.573 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.573 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.573 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.573 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.573 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.573 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.574 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] console_host                   = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.574 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.574 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.574 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.574 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.575 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.575 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.575 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.575 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.575 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.576 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.576 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.576 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.576 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.576 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.577 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.577 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.577 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.577 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.577 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.578 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.578 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.578 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.578 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.578 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.579 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.579 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.579 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.579 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.579 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.580 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.580 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.580 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.580 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.580 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.581 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.581 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.581 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.581 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.581 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.581 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.582 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.582 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.582 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.582 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.582 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.583 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.583 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.583 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.583 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.583 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.583 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.584 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.584 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.584 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.584 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.584 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.585 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.585 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.585 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.585 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.585 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.585 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.586 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.586 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.586 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.586 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.586 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.587 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.587 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.587 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.587 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.587 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.588 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.588 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.588 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.588 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.588 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.588 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.589 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.589 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.589 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.589 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.589 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.590 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.590 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.590 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.590 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.590 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.590 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.591 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.591 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.591 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.591 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.591 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.592 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.592 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.592 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.592 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.592 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.592 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.593 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.593 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.593 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.593 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.593 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.594 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.594 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.594 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.594 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.594 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.594 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.595 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.595 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.595 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.595 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.595 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.596 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.596 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.596 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.596 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.596 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.597 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.597 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.597 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.597 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.597 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.597 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.598 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.598 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.598 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.598 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.598 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.599 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.600 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.601 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.602 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.603 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.604 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.605 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.606 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.607 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.608 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.609 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.609 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.609 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.609 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.609 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.609 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.610 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.611 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.611 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.611 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.611 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.611 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.611 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.612 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.612 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.612 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.612 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.612 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.612 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.613 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.613 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.613 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.613 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.613 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.613 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.614 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.614 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.614 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.614 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.614 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.614 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.615 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.616 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.617 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.618 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.619 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.619 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.619 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.619 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.619 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.619 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.620 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.620 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.620 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.620 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.620 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.620 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.621 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.621 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.621 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.621 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.622 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.622 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.622 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.622 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.622 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.623 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.623 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.623 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.623 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.623 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.623 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.624 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.624 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.624 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.624 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.624 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.624 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.625 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.625 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.625 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.625 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.625 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.625 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.626 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.626 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.626 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.626 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.626 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.626 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.627 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.627 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.627 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.627 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.627 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.628 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.628 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.628 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.628 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.628 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.628 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.629 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.629 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.629 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.629 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.629 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.630 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.630 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.630 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.630 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.630 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.630 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.631 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.631 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.631 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.631 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.631 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.631 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.632 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.632 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.632 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.632 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.632 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.633 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.633 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.633 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.633 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.633 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.633 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.634 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.634 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.634 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.634 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.634 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.634 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.635 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.635 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.635 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.635 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.635 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.635 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.636 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.637 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.638 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.639 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.640 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.641 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.642 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.643 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.644 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.644 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.644 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.644 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.644 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.644 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.645 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.645 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.645 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.645 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.645 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.645 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.646 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.647 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.648 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.649 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.650 228501 WARNING oslo_config.cfg [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: and ``live_migration_inbound_addr`` respectively.
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: ).  Its value may be silently ignored in the future.
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.650 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.650 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.650 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.650 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.650 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.651 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.652 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rbd_secret_uuid        = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.653 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.654 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.654 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.654 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.654 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.654 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.654 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.655 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.656 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.657 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.658 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.659 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.660 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.661 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.662 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.663 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.664 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.665 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.666 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.667 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.668 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.669 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.669 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.669 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.669 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.669 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.669 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.670 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.671 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.672 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.673 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.674 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.675 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.675 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.675 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.675 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.675 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.675 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.676 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.677 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.678 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.678 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.678 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.678 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.678 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.678 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.679 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.680 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.681 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.682 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.683 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.684 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.685 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.685 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.685 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.685 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.685 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.685 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.686 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.687 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.688 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.689 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.690 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.691 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.692 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.693 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.694 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.695 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.696 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.697 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.698 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.699 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.700 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.701 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.702 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.703 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.704 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.705 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.706 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.707 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.708 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.708 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.708 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.708 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.708 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.708 228501 DEBUG oslo_service.service [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.710 228501 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.726 228501 INFO nova.virt.node [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Determined node identity 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from /var/lib/nova/compute_id
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.727 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.727 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.728 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.728 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.737 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5c8487c910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.739 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5c8487c910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.740 228501 INFO nova.virt.libvirt.driver [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Connection event '1' reason 'None'
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.744 228501 INFO nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <host>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <uuid>4c358f0e-7e15-44e5-bde2-714780d05a92</uuid>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <arch>x86_64</arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model>EPYC-Rome-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <vendor>AMD</vendor>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <microcode version='16777317'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <signature family='23' model='49' stepping='0'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='x2apic'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='tsc-deadline'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='osxsave'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='hypervisor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='tsc_adjust'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='spec-ctrl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='stibp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='arch-capabilities'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='cmp_legacy'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='topoext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='virt-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='lbrv'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='tsc-scale'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='vmcb-clean'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='pause-filter'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='pfthreshold'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='svme-addr-chk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='rdctl-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='mds-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature name='pschange-mc-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <pages unit='KiB' size='4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <pages unit='KiB' size='2048'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <pages unit='KiB' size='1048576'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <power_management>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <suspend_mem/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <suspend_disk/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <suspend_hybrid/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </power_management>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <iommu support='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <migration_features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <live/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <uri_transports>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <uri_transport>tcp</uri_transport>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <uri_transport>rdma</uri_transport>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </uri_transports>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </migration_features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <topology>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <cells num='1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <cell id='0'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           <memory unit='KiB'>16116612</memory>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           <distances>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <sibling id='0' value='10'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           </distances>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           <cpus num='8'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:           </cpus>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         </cell>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </cells>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </topology>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <cache>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </cache>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <secmodel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model>selinux</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <doi>0</doi>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </secmodel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <secmodel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model>dac</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <doi>0</doi>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </secmodel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </host>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <guest>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <os_type>hvm</os_type>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <arch name='i686'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <wordsize>32</wordsize>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <domain type='qemu'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <domain type='kvm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <pae/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <nonpae/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <apic default='on' toggle='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <cpuselection/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <deviceboot/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <externalSnapshot/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </guest>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <guest>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <os_type>hvm</os_type>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <arch name='x86_64'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <wordsize>64</wordsize>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <domain type='qemu'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <domain type='kvm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <acpi default='on' toggle='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <apic default='on' toggle='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <cpuselection/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <deviceboot/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <externalSnapshot/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </guest>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: </capabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.749 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.752 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: <domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <domain>kvm</domain>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <arch>i686</arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <vcpu max='1024'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <iothreads supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <os supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='firmware'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <loader supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>rom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pflash</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='readonly'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>yes</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='secure'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </loader>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </os>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='maximumMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <vendor>AMD</vendor>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='succor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='custom' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-128'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-256'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-512'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <memoryBacking supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='sourceType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>anonymous</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>memfd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </memoryBacking>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <disk supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='diskDevice'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>disk</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cdrom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>floppy</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>lun</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>fdc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>sata</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </disk>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <graphics supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vnc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egl-headless</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </graphics>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <video supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='modelType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vga</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cirrus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>none</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>bochs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ramfb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </video>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hostdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='mode'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>subsystem</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='startupPolicy'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>mandatory</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>requisite</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>optional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='subsysType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pci</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='capsType'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='pciBackend'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hostdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <rng supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>random</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </rng>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <filesystem supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='driverType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>path</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>handle</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtiofs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </filesystem>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <tpm supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-tis</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-crb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emulator</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>external</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendVersion'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>2.0</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </tpm>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <redirdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </redirdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <channel supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </channel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <crypto supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </crypto>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <interface supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>passt</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </interface>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <panic supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>isa</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>hyperv</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </panic>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <console supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>null</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dev</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pipe</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stdio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>udp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tcp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu-vdagent</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </console>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <gic supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <genid supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backup supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <async-teardown supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <ps2 supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sev supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sgx supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hyperv supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='features'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>relaxed</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vapic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>spinlocks</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vpindex</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>runtime</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>synic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stimer</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reset</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vendor_id</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>frequencies</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reenlightenment</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tlbflush</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ipi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>avic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emsr_bitmap</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>xmm_input</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hyperv>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <launchSecurity supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='sectype'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tdx</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </launchSecurity>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: </domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.755 228501 DEBUG nova.virt.libvirt.volume.mount [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.759 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: <domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <domain>kvm</domain>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <arch>i686</arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <vcpu max='240'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <iothreads supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <os supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='firmware'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <loader supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>rom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pflash</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='readonly'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>yes</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='secure'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </loader>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </os>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='maximumMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <vendor>AMD</vendor>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='succor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='custom' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-128'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-256'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-512'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <memoryBacking supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='sourceType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>anonymous</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>memfd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </memoryBacking>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <disk supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='diskDevice'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>disk</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cdrom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>floppy</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>lun</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ide</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>fdc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>sata</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </disk>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <graphics supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vnc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egl-headless</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </graphics>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <video supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='modelType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vga</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cirrus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>none</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>bochs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ramfb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </video>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hostdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='mode'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>subsystem</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='startupPolicy'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>mandatory</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>requisite</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>optional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='subsysType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pci</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='capsType'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='pciBackend'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hostdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <rng supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>random</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </rng>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <filesystem supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='driverType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>path</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>handle</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtiofs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </filesystem>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <tpm supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-tis</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-crb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emulator</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>external</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendVersion'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>2.0</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </tpm>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <redirdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </redirdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <channel supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </channel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <crypto supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </crypto>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <interface supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>passt</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </interface>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <panic supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>isa</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>hyperv</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </panic>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <console supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>null</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dev</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pipe</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stdio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>udp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tcp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu-vdagent</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </console>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <gic supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <genid supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backup supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <async-teardown supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <ps2 supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sev supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sgx supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hyperv supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='features'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>relaxed</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vapic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>spinlocks</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vpindex</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>runtime</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>synic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stimer</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reset</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vendor_id</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>frequencies</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reenlightenment</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tlbflush</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ipi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>avic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emsr_bitmap</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>xmm_input</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hyperv>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <launchSecurity supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='sectype'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tdx</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </launchSecurity>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: </domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.781 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.786 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: <domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <domain>kvm</domain>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <arch>x86_64</arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <vcpu max='1024'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <iothreads supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <os supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='firmware'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>efi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <loader supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>rom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pflash</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='readonly'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>yes</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='secure'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>yes</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </loader>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </os>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='maximumMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <vendor>AMD</vendor>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='succor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='custom' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-128'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-256'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-512'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <memoryBacking supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='sourceType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>anonymous</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>memfd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </memoryBacking>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <disk supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='diskDevice'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>disk</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cdrom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>floppy</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>lun</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>fdc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>sata</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </disk>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <graphics supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vnc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egl-headless</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </graphics>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <video supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='modelType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vga</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cirrus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>none</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>bochs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ramfb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </video>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hostdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='mode'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>subsystem</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='startupPolicy'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>mandatory</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>requisite</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>optional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='subsysType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pci</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='capsType'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='pciBackend'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hostdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <rng supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>random</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </rng>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <filesystem supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='driverType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>path</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>handle</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtiofs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </filesystem>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <tpm supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-tis</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-crb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emulator</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>external</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendVersion'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>2.0</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </tpm>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <redirdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </redirdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <channel supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </channel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <crypto supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </crypto>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <interface supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>passt</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </interface>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <panic supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>isa</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>hyperv</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </panic>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <console supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>null</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dev</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pipe</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stdio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>udp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tcp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu-vdagent</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </console>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <gic supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <genid supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backup supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <async-teardown supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <ps2 supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sev supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sgx supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hyperv supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='features'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>relaxed</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vapic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>spinlocks</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vpindex</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>runtime</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>synic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stimer</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reset</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vendor_id</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>frequencies</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reenlightenment</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tlbflush</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ipi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>avic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emsr_bitmap</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>xmm_input</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hyperv>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <launchSecurity supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='sectype'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tdx</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </launchSecurity>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: </domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.837 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: <domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <domain>kvm</domain>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <arch>x86_64</arch>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <vcpu max='240'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <iothreads supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <os supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='firmware'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <loader supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>rom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pflash</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='readonly'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>yes</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='secure'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>no</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </loader>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </os>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='maximum' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='maximumMigratable'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>on</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>off</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='host-model' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <vendor>AMD</vendor>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='x2apic'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='stibp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='succor'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lbrv'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <mode name='custom' supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Broadwell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Cooperlake-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Denverton-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Dhyana-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='auto-ibrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amd-psfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='no-nested-data-bp'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='null-sel-clr-base'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='stibp-always-on'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='EPYC-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-128'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-256'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx10-512'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='prefetchiti'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Haswell-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='IvyBridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='KnightsMill-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4fmaps'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-4vnniw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512er'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512pf'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fma4'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tbm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xop'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='amx-tile'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-bf16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-fp16'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bitalg'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vbmi2'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrc'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fzrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='la57'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='taa-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='tsx-ldtrk'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xfd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='SierraForest-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ifma'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-ne-convert'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx-vnni-int8'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='bus-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cmpccxadd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fbsdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='fsrs'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ibrs-all'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mcdt-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pbrsb-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='psdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='serialize'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vaes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='vpclmulqdq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='hle'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='rtm'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512bw'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512cd'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512dq'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512f'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='avx512vl'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='invpcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pcid'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='pku'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='mpx'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v2'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v3'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='core-capability'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='split-lock-detect'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='Snowridge-v4'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='cldemote'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='erms'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='gfni'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdir64b'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='movdiri'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='xsaves'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='athlon-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='core2duo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='coreduo-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='n270-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='ss'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <blockers model='phenom-v1'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnow'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <feature name='3dnowext'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </blockers>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </mode>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </cpu>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <memoryBacking supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <enum name='sourceType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>anonymous</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <value>memfd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </memoryBacking>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <disk supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='diskDevice'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>disk</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cdrom</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>floppy</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>lun</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ide</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>fdc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>sata</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </disk>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <graphics supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vnc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egl-headless</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </graphics>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <video supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='modelType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vga</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>cirrus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>none</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>bochs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ramfb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </video>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hostdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='mode'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>subsystem</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='startupPolicy'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>mandatory</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>requisite</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>optional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='subsysType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pci</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>scsi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='capsType'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='pciBackend'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hostdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <rng supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtio-non-transitional</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>random</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>egd</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </rng>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <filesystem supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='driverType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>path</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>handle</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>virtiofs</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </filesystem>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <tpm supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-tis</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tpm-crb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emulator</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>external</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendVersion'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>2.0</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </tpm>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <redirdev supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='bus'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>usb</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </redirdev>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <channel supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </channel>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <crypto supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendModel'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>builtin</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </crypto>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <interface supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='backendType'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>default</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>passt</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </interface>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <panic supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='model'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>isa</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>hyperv</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </panic>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <console supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='type'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>null</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vc</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pty</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dev</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>file</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>pipe</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stdio</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>udp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tcp</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>unix</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>qemu-vdagent</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>dbus</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </console>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </devices>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   <features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <gic supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <vmcoreinfo supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <genid supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backingStoreInput supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <backup supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <async-teardown supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <ps2 supported='yes'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sev supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <sgx supported='no'/>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <hyperv supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='features'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>relaxed</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vapic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>spinlocks</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vpindex</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>runtime</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>synic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>stimer</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reset</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>vendor_id</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>frequencies</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>reenlightenment</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tlbflush</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>ipi</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>avic</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>emsr_bitmap</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>xmm_input</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <spinlocks>4095</spinlocks>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <stimer_direct>on</stimer_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </defaults>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </hyperv>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     <launchSecurity supported='yes'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       <enum name='sectype'>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:         <value>tdx</value>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:       </enum>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:     </launchSecurity>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:   </features>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: </domainCapabilities>
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.886 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.886 228501 INFO nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Secure Boot support detected
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.889 228501 INFO nova.virt.libvirt.driver [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.889 228501 INFO nova.virt.libvirt.driver [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.898 228501 DEBUG nova.virt.libvirt.driver [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.915 228501 INFO nova.virt.node [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Determined node identity 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from /var/lib/nova/compute_id
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.931 228501 DEBUG nova.compute.manager [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Verified node 72fba1ca-0d86-48af-8a3d-510284dfd0e0 matches my host np0005538515.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 28 09:32:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:45.955 228501 INFO nova.compute.manager [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.023 228501 DEBUG oslo_concurrency.lockutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.024 228501 DEBUG oslo_concurrency.lockutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.024 228501 DEBUG oslo_concurrency.lockutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.024 228501 DEBUG nova.compute.resource_tracker [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.024 228501 DEBUG oslo_concurrency.processutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.428 228501 DEBUG oslo_concurrency.processutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6001 DF PROTO=TCP SPT=42800 DPT=9100 SEQ=1532807494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2477A0000000001030307) 
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.690 228501 WARNING nova.virt.libvirt.driver [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.691 228501 DEBUG nova.compute.resource_tracker [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13611MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.691 228501 DEBUG oslo_concurrency.lockutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.692 228501 DEBUG oslo_concurrency.lockutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.805 228501 DEBUG nova.compute.resource_tracker [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.805 228501 DEBUG nova.compute.resource_tracker [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.828 228501 DEBUG nova.scheduler.client.report [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:32:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.901 228501 DEBUG nova.scheduler.client.report [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.901 228501 DEBUG nova.compute.provider_tree [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.925 228501 DEBUG nova.scheduler.client.report [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.952 228501 DEBUG nova.scheduler.client.report [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:32:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:46.975 228501 DEBUG oslo_concurrency.processutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:32:46 np0005538515.localdomain podman[228574]: 2025-11-28 09:32:46.99841 +0000 UTC m=+0.099269681 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 09:32:47 np0005538515.localdomain podman[228574]: 2025-11-28 09:32:47.066636197 +0000 UTC m=+0.167495898 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:32:47 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.455 228501 DEBUG oslo_concurrency.processutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.462 228501 DEBUG nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.463 228501 INFO nova.virt.libvirt.host [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] kernel doesn't support AMD SEV
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.464 228501 DEBUG nova.compute.provider_tree [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.464 228501 DEBUG nova.virt.libvirt.driver [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.493 228501 DEBUG nova.scheduler.client.report [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.515 228501 DEBUG nova.compute.resource_tracker [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.515 228501 DEBUG oslo_concurrency.lockutils [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.515 228501 DEBUG nova.service [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.544 228501 DEBUG nova.service [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 09:32:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:32:47.544 228501 DEBUG nova.servicegroup.drivers.db [None req-413b9c7d-c524-48d4-b5b8-346a93ed4d5d - - - - - -] DB_Driver: join new ServiceGroup member np0005538515.localdomain to the compute group, service = <Service: host=np0005538515.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 09:32:47 np0005538515.localdomain sudo[228615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:32:47 np0005538515.localdomain sudo[228615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:32:47 np0005538515.localdomain sudo[228615]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:48 np0005538515.localdomain sudo[228649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:32:48 np0005538515.localdomain sudo[228649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:32:48 np0005538515.localdomain sudo[228649]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:48 np0005538515.localdomain sudo[228772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xybprmysrhqgfdlbbgkwerdefnrpiqez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322367.9310384-4317-79700879152292/AnsiballZ_podman_container.py
Nov 28 09:32:48 np0005538515.localdomain sudo[228772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:49 np0005538515.localdomain python3.9[228774]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None 
preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:32:49 np0005538515.localdomain sudo[228799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:32:49 np0005538515.localdomain sudo[228799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:32:49 np0005538515.localdomain sudo[228799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:49 np0005538515.localdomain systemd[1]: Started libpod-conmon-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d.scope.
Nov 28 09:32:49 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:32:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 09:32:49 np0005538515.localdomain podman[228798]: 2025-11-28 09:32:49.431744221 +0000 UTC m=+0.142565399 container init acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:32:49 np0005538515.localdomain podman[228798]: 2025-11-28 09:32:49.442509199 +0000 UTC m=+0.153330377 container start acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:32:49 np0005538515.localdomain python3.9[228774]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/469bc4441baff9216df986857f9ff45dbf25965a8d2f755a6449ac2645cb7191
Nov 28 09:32:49 np0005538515.localdomain nova_compute_init[228837]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 09:32:49 np0005538515.localdomain systemd[1]: libpod-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d.scope: Deactivated successfully.
Nov 28 09:32:49 np0005538515.localdomain podman[228838]: 2025-11-28 09:32:49.51496645 +0000 UTC m=+0.052755144 container died acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible)
Nov 28 09:32:49 np0005538515.localdomain podman[228850]: 2025-11-28 09:32:49.597762178 +0000 UTC m=+0.076205002 container cleanup acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:32:49 np0005538515.localdomain systemd[1]: libpod-conmon-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d.scope: Deactivated successfully.
Nov 28 09:32:49 np0005538515.localdomain sudo[228772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:32:50 np0005538515.localdomain sshd[206830]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:32:50 np0005538515.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Nov 28 09:32:50 np0005538515.localdomain systemd[1]: session-53.scope: Consumed 2min 14.190s CPU time.
Nov 28 09:32:50 np0005538515.localdomain systemd-logind[763]: Session 53 logged out. Waiting for processes to exit.
Nov 28 09:32:50 np0005538515.localdomain systemd-logind[763]: Removed session 53.
Nov 28 09:32:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79-merged.mount: Deactivated successfully.
Nov 28 09:32:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d-userdata-shm.mount: Deactivated successfully.
Nov 28 09:32:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6002 DF PROTO=TCP SPT=42800 DPT=9100 SEQ=1532807494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2573B0000000001030307) 
Nov 28 09:32:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:32:50.812 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:32:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:32:50.813 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:32:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:32:50.813 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:32:56 np0005538515.localdomain sshd[228894]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:32:56 np0005538515.localdomain sshd[228894]: Accepted publickey for zuul from 192.168.122.30 port 38672 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:32:56 np0005538515.localdomain systemd-logind[763]: New session 55 of user zuul.
Nov 28 09:32:56 np0005538515.localdomain systemd[1]: Started Session 55 of User zuul.
Nov 28 09:32:56 np0005538515.localdomain sshd[228894]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:32:57 np0005538515.localdomain python3.9[229005]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:32:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=585 DF PROTO=TCP SPT=50378 DPT=9105 SEQ=3544091224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD272830000000001030307) 
Nov 28 09:32:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:32:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:32:57 np0005538515.localdomain systemd[1]: tmp-crun.gIgpU9.mount: Deactivated successfully.
Nov 28 09:32:57 np0005538515.localdomain podman[229026]: 2025-11-28 09:32:57.970844146 +0000 UTC m=+0.077783384 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:32:58 np0005538515.localdomain podman[229028]: 2025-11-28 09:32:58.029723474 +0000 UTC m=+0.133729334 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:32:58 np0005538515.localdomain podman[229026]: 2025-11-28 09:32:58.061094274 +0000 UTC m=+0.168033522 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:32:58 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:32:58 np0005538515.localdomain podman[229028]: 2025-11-28 09:32:58.115312976 +0000 UTC m=+0.219318796 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:32:58 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:32:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=586 DF PROTO=TCP SPT=50378 DPT=9105 SEQ=3544091224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2767B0000000001030307) 
Nov 28 09:32:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6003 DF PROTO=TCP SPT=42800 DPT=9100 SEQ=1532807494 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD276FA0000000001030307) 
Nov 28 09:32:58 np0005538515.localdomain sudo[229158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqrtnofrswzzjxgwgkdrhxleflwdpuqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322378.2930417-70-210317093403011/AnsiballZ_systemd_service.py
Nov 28 09:32:58 np0005538515.localdomain sudo[229158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:32:59 np0005538515.localdomain python3.9[229160]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:32:59 np0005538515.localdomain systemd-rc-local-generator[229185]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:32:59 np0005538515.localdomain systemd-sysv-generator[229190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:32:59 np0005538515.localdomain sudo[229158]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:00 np0005538515.localdomain python3.9[229303]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:33:00 np0005538515.localdomain network[229320]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:33:00 np0005538515.localdomain network[229321]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:33:00 np0005538515.localdomain network[229322]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:33:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3407 DF PROTO=TCP SPT=33760 DPT=9105 SEQ=3839620824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD282FA0000000001030307) 
Nov 28 09:33:04 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:04.546 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:04 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:04.582 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=588 DF PROTO=TCP SPT=50378 DPT=9105 SEQ=3544091224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD28E3A0000000001030307) 
Nov 28 09:33:06 np0005538515.localdomain sudo[229555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbsnknvvesmwudxhylidyiactakixdyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322386.3825953-127-217059136708718/AnsiballZ_systemd_service.py
Nov 28 09:33:06 np0005538515.localdomain sudo[229555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:07 np0005538515.localdomain python3.9[229557]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:33:07 np0005538515.localdomain sudo[229555]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:08 np0005538515.localdomain sudo[229666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drcjbzkuaihypbzxyubzksdifadwsauo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322387.6089926-158-1525979913008/AnsiballZ_file.py
Nov 28 09:33:08 np0005538515.localdomain sudo[229666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:08 np0005538515.localdomain python3.9[229668]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:08 np0005538515.localdomain sudo[229666]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:08 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Nov 28 09:33:08 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:33:08 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:33:08 np0005538515.localdomain sudo[229777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfghbuwwpzujzgrfqmukbkkriuqpmbbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322388.5220141-182-215050364105499/AnsiballZ_file.py
Nov 28 09:33:08 np0005538515.localdomain sudo[229777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:09 np0005538515.localdomain python3.9[229779]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:09 np0005538515.localdomain sudo[229777]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54533 DF PROTO=TCP SPT=34982 DPT=9102 SEQ=4045366205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD29FBA0000000001030307) 
Nov 28 09:33:09 np0005538515.localdomain sudo[229887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpvmrzvbtjibzbxrdesirzyqrhluiooj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322389.4008784-208-41949056848125/AnsiballZ_command.py
Nov 28 09:33:09 np0005538515.localdomain sudo[229887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:10 np0005538515.localdomain python3.9[229889]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:33:10 np0005538515.localdomain sudo[229887]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:10 np0005538515.localdomain python3.9[229999]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:33:12 np0005538515.localdomain sudo[230107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgilaiikxbxscubgdfcvinvvbicghfoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322391.7552717-264-124682784306730/AnsiballZ_systemd_service.py
Nov 28 09:33:12 np0005538515.localdomain sudo[230107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:12 np0005538515.localdomain python3.9[230109]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:33:12 np0005538515.localdomain systemd-sysv-generator[230141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:12 np0005538515.localdomain systemd-rc-local-generator[230138]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:12 np0005538515.localdomain sudo[230107]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=589 DF PROTO=TCP SPT=50378 DPT=9105 SEQ=3544091224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2AEFA0000000001030307) 
Nov 28 09:33:13 np0005538515.localdomain sudo[230254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gysowspdgbitaiwukqtnyejmcdtnwfbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322392.908329-288-164589158898991/AnsiballZ_command.py
Nov 28 09:33:13 np0005538515.localdomain sudo[230254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:13 np0005538515.localdomain python3.9[230256]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:33:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43220 DF PROTO=TCP SPT=54008 DPT=9882 SEQ=4224566225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2B2FA0000000001030307) 
Nov 28 09:33:14 np0005538515.localdomain sudo[230254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:15 np0005538515.localdomain sudo[230365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfghnogbupnkkjwndzcfmpbuhbbbkjvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322394.7274003-313-104679530946103/AnsiballZ_file.py
Nov 28 09:33:15 np0005538515.localdomain sudo[230365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:15 np0005538515.localdomain python3.9[230367]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:15 np0005538515.localdomain sudo[230365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:15 np0005538515.localdomain python3.9[230475]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57904 DF PROTO=TCP SPT=54348 DPT=9100 SEQ=3096449806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2BCBA0000000001030307) 
Nov 28 09:33:16 np0005538515.localdomain python3.9[230585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:17 np0005538515.localdomain python3.9[230671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322396.2710052-362-63714341438964/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=2879a2396d2687409963cab2311faed024e34763 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:33:18 np0005538515.localdomain podman[230710]: 2025-11-28 09:33:18.011017949 +0000 UTC m=+0.106546351 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:33:18 np0005538515.localdomain podman[230710]: 2025-11-28 09:33:18.050223369 +0000 UTC m=+0.145751721 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:33:18 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:33:18 np0005538515.localdomain sudo[230797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwhojzpcqjgsxvbrkbrxpblvvqlcfdwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322397.785752-407-34718627705672/AnsiballZ_group.py
Nov 28 09:33:18 np0005538515.localdomain sudo[230797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:18 np0005538515.localdomain python3.9[230800]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 28 09:33:18 np0005538515.localdomain sudo[230797]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:19 np0005538515.localdomain sudo[230908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbhpiebxuyzroutltxrofyjylqtxspxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322398.832908-440-224085073487333/AnsiballZ_getent.py
Nov 28 09:33:19 np0005538515.localdomain sudo[230908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:19 np0005538515.localdomain python3.9[230910]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 28 09:33:19 np0005538515.localdomain sudo[230908]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:19 np0005538515.localdomain sudo[231019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmnppzywogppreabqmclgwxkyelrzirj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322399.6946301-463-231880377081432/AnsiballZ_group.py
Nov 28 09:33:19 np0005538515.localdomain sudo[231019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:20 np0005538515.localdomain python3.9[231021]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 09:33:20 np0005538515.localdomain groupadd[231022]: group added to /etc/group: name=ceilometer, GID=42405
Nov 28 09:33:20 np0005538515.localdomain groupadd[231022]: group added to /etc/gshadow: name=ceilometer
Nov 28 09:33:20 np0005538515.localdomain groupadd[231022]: new group: name=ceilometer, GID=42405
Nov 28 09:33:20 np0005538515.localdomain sudo[231019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57905 DF PROTO=TCP SPT=54348 DPT=9100 SEQ=3096449806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2CC7B0000000001030307) 
Nov 28 09:33:20 np0005538515.localdomain sudo[231135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xczkfdpnfyhbegdejmcbkeqrnxbkyvlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322400.5058672-488-59869956691527/AnsiballZ_user.py
Nov 28 09:33:20 np0005538515.localdomain sudo[231135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:21 np0005538515.localdomain python3.9[231137]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538515.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 09:33:21 np0005538515.localdomain useradd[231139]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 28 09:33:21 np0005538515.localdomain useradd[231139]: add 'ceilometer' to group 'libvirt'
Nov 28 09:33:21 np0005538515.localdomain useradd[231139]: add 'ceilometer' to shadow group 'libvirt'
Nov 28 09:33:21 np0005538515.localdomain sudo[231135]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:22 np0005538515.localdomain python3.9[231253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:23 np0005538515.localdomain python3.9[231339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322402.1703568-566-98328576520128/.source.conf _original_basename=ceilometer.conf follow=False checksum=e4f5a0d8a335534158f72dc0bd2ff76fd1e29e2d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:23 np0005538515.localdomain python3.9[231447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:24 np0005538515.localdomain python3.9[231533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322403.2735925-566-53101525248345/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:25 np0005538515.localdomain python3.9[231641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:25 np0005538515.localdomain python3.9[231727]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322404.845922-566-53873045232368/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:27 np0005538515.localdomain python3.9[231835]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54147 DF PROTO=TCP SPT=56796 DPT=9105 SEQ=2235505020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2E7B20000000001030307) 
Nov 28 09:33:27 np0005538515.localdomain python3.9[231943]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:28 np0005538515.localdomain python3.9[232051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54148 DF PROTO=TCP SPT=56796 DPT=9105 SEQ=2235505020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2EBBB0000000001030307) 
Nov 28 09:33:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:33:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:33:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16878 DF PROTO=TCP SPT=49914 DPT=9882 SEQ=202554163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2ECDC0000000001030307) 
Nov 28 09:33:28 np0005538515.localdomain python3.9[232137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322407.9801822-743-64086649190568/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:28 np0005538515.localdomain podman[232139]: 2025-11-28 09:33:28.956425337 +0000 UTC m=+0.063851114 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:33:28 np0005538515.localdomain podman[232139]: 2025-11-28 09:33:28.96174248 +0000 UTC m=+0.069168247 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:33:28 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:33:29 np0005538515.localdomain systemd[1]: tmp-crun.fGAcs8.mount: Deactivated successfully.
Nov 28 09:33:29 np0005538515.localdomain podman[232138]: 2025-11-28 09:33:29.022596173 +0000 UTC m=+0.128663859 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:33:29 np0005538515.localdomain podman[232138]: 2025-11-28 09:33:29.080757343 +0000 UTC m=+0.186825029 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 09:33:29 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:33:29 np0005538515.localdomain python3.9[232287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:29 np0005538515.localdomain python3.9[232342]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:30 np0005538515.localdomain python3.9[232450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:31 np0005538515.localdomain python3.9[232536]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322410.0726619-743-105798245900071/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:31 np0005538515.localdomain python3.9[232644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16880 DF PROTO=TCP SPT=49914 DPT=9882 SEQ=202554163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD2F8FA0000000001030307) 
Nov 28 09:33:32 np0005538515.localdomain python3.9[232730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322411.1797762-743-60880135725985/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:32 np0005538515.localdomain python3.9[232838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:33 np0005538515.localdomain python3.9[232924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322412.3166602-743-82576824235030/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:33 np0005538515.localdomain python3.9[233032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:34 np0005538515.localdomain python3.9[233118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322413.4155815-743-117337531990705/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54150 DF PROTO=TCP SPT=56796 DPT=9105 SEQ=2235505020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3037A0000000001030307) 
Nov 28 09:33:35 np0005538515.localdomain python3.9[233226]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:35 np0005538515.localdomain python3.9[233312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322414.5669847-743-124438703405987/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:36 np0005538515.localdomain python3.9[233420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:36 np0005538515.localdomain python3.9[233506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322415.751184-743-64197244377666/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:38 np0005538515.localdomain python3.9[233614]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:38 np0005538515.localdomain python3.9[233700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322416.9033146-743-17101585493052/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31912 DF PROTO=TCP SPT=50028 DPT=9102 SEQ=3314951117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD314FA0000000001030307) 
Nov 28 09:33:39 np0005538515.localdomain python3.9[233808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:40 np0005538515.localdomain python3.9[233894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322418.8352146-743-247817568612191/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:41 np0005538515.localdomain python3.9[234002]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:41 np0005538515.localdomain python3.9[234088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322420.8453689-743-248790475964933/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:42 np0005538515.localdomain sudo[234196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwhkpsheholioskugngdpjmhrreufvfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322422.3113039-1208-130433037042832/AnsiballZ_file.py
Nov 28 09:33:42 np0005538515.localdomain sudo[234196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54151 DF PROTO=TCP SPT=56796 DPT=9105 SEQ=2235505020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD322FB0000000001030307) 
Nov 28 09:33:42 np0005538515.localdomain python3.9[234198]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:42 np0005538515.localdomain sudo[234196]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:43 np0005538515.localdomain sudo[234306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lstgpvvodfpgzuggehdtnzkjitpnzxol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322423.02922-1232-180507532043497/AnsiballZ_systemd_service.py
Nov 28 09:33:43 np0005538515.localdomain sudo[234306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:43 np0005538515.localdomain python3.9[234308]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:33:43 np0005538515.localdomain systemd-rc-local-generator[234334]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:43 np0005538515.localdomain systemd-sysv-generator[234338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:44 np0005538515.localdomain systemd[1]: Listening on Podman API Socket.
Nov 28 09:33:44 np0005538515.localdomain sudo[234306]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42018 DF PROTO=TCP SPT=53686 DPT=9101 SEQ=3661804958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD328B00000000001030307) 
Nov 28 09:33:44 np0005538515.localdomain sudo[234456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-malqqqszhpiefzhzgzufjeflummgcmqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.4426816-1258-216671780328919/AnsiballZ_stat.py
Nov 28 09:33:44 np0005538515.localdomain sudo[234456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:44 np0005538515.localdomain python3.9[234458]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:44 np0005538515.localdomain sudo[234456]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.076 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.077 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.077 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.098 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.099 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.100 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.100 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.101 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.101 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.101 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.102 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.102 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.123 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.124 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.124 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.124 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.125 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:33:45 np0005538515.localdomain sudo[234545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gicwgrdbiopculsekkvhexxcasxbfxuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.4426816-1258-216671780328919/AnsiballZ_copy.py
Nov 28 09:33:45 np0005538515.localdomain sudo[234545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:45 np0005538515.localdomain python3.9[234548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322424.4426816-1258-216671780328919/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:45 np0005538515.localdomain sudo[234545]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.606 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.841 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:33:45 np0005538515.localdomain sudo[234621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwsrrbbmyqjiujexpbrnivxhsvzuyovu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.4426816-1258-216671780328919/AnsiballZ_stat.py
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.843 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13617MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.844 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.844 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:33:45 np0005538515.localdomain sudo[234621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.907 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.907 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:33:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:45.926 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:33:46 np0005538515.localdomain python3.9[234623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:33:46 np0005538515.localdomain sudo[234621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:46.366 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:33:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:46.376 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:33:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:46.394 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:33:46 np0005538515.localdomain sudo[234731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvdzlagljlogpcvqbcgtgxwyqvyfmwmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322424.4426816-1258-216671780328919/AnsiballZ_copy.py
Nov 28 09:33:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:46.397 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:33:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:33:46.398 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:33:46 np0005538515.localdomain sudo[234731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46511 DF PROTO=TCP SPT=45814 DPT=9100 SEQ=533774416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD331FA0000000001030307) 
Nov 28 09:33:46 np0005538515.localdomain python3.9[234733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322424.4426816-1258-216671780328919/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:33:46 np0005538515.localdomain sudo[234731]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:47 np0005538515.localdomain sudo[234841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inzriikqngaerrjtafodgtpxgtlxmnje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322427.038278-1343-220517729000639/AnsiballZ_container_config_data.py
Nov 28 09:33:47 np0005538515.localdomain sudo[234841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:47 np0005538515.localdomain python3.9[234843]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 28 09:33:47 np0005538515.localdomain sudo[234841]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:48 np0005538515.localdomain sudo[234951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kokblwhtvliadkwcojmvtmzrfcjcsfka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322428.0162492-1369-25763764249823/AnsiballZ_container_config_hash.py
Nov 28 09:33:48 np0005538515.localdomain sudo[234951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:33:48 np0005538515.localdomain podman[234953]: 2025-11-28 09:33:48.581887272 +0000 UTC m=+0.090202001 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:33:48 np0005538515.localdomain podman[234953]: 2025-11-28 09:33:48.621562366 +0000 UTC m=+0.129877065 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:33:48 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:33:48 np0005538515.localdomain python3.9[234954]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:33:48 np0005538515.localdomain sudo[234951]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:49 np0005538515.localdomain sudo[235028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:33:49 np0005538515.localdomain sudo[235028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:49 np0005538515.localdomain sudo[235028]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:49 np0005538515.localdomain sudo[235060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:33:49 np0005538515.localdomain sudo[235060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:50 np0005538515.localdomain sudo[235192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xffsxdbntwgsyxnhtzpfxvfoghckfhew ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322429.1634035-1399-142785476401854/AnsiballZ_edpm_container_manage.py
Nov 28 09:33:50 np0005538515.localdomain sudo[235192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:50 np0005538515.localdomain systemd[1]: tmp-crun.yTAtw2.mount: Deactivated successfully.
Nov 28 09:33:50 np0005538515.localdomain podman[235188]: 2025-11-28 09:33:50.435277608 +0000 UTC m=+0.114838365 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True)
Nov 28 09:33:50 np0005538515.localdomain podman[235188]: 2025-11-28 09:33:50.564574654 +0000 UTC m=+0.244135381 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, version=7, RELEASE=main)
Nov 28 09:33:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46512 DF PROTO=TCP SPT=45814 DPT=9100 SEQ=533774416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD341BA0000000001030307) 
Nov 28 09:33:50 np0005538515.localdomain python3[235204]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:33:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:33:50.813 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:33:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:33:50.813 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:33:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:33:50.814 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:33:50 np0005538515.localdomain sudo[235060]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:50 np0005538515.localdomain python3[235204]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f",
                                                                    "Digest": "sha256:ba8d4a4e89620dec751cb5de5631f858557101d862972a8e817b82e4e10180a1",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:ba8d4a4e89620dec751cb5de5631f858557101d862972a8e817b82e4e10180a1"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:26:47.510377458Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 505178369,
                                                                    "VirtualSize": 505178369,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:df29e1f065b3ca62a976bd39a05f70336eee2ae6be8f0f1548e8c749ab2e29f2",
                                                                              "sha256:23884b48504b714fa8c89fa23b204d39c39cc69fece546e604d8bd0566e4fb11"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:17:11.648903438Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:17:14.841832772Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:18:00.567980594Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:18:03.88569442Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:26:11.053013113Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:26:47.509622089Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:26:54.939484291Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 09:33:50 np0005538515.localdomain sudo[235307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:33:50 np0005538515.localdomain sudo[235307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:50 np0005538515.localdomain sudo[235307]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:50 np0005538515.localdomain podman[235306]: 2025-11-28 09:33:50.982469382 +0000 UTC m=+0.083563558 container remove d2588e20d5c6110b484bc7ffe641a1caa93791775223f93ea6db4e71c80aa6ff (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '185ba876a5902dbf87b8591344afd39d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 09:33:50 np0005538515.localdomain python3[235204]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Nov 28 09:33:51 np0005538515.localdomain sudo[235339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:33:51 np0005538515.localdomain sudo[235339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:51 np0005538515.localdomain podman[235347]: 
Nov 28 09:33:51 np0005538515.localdomain podman[235347]: 2025-11-28 09:33:51.091610632 +0000 UTC m=+0.091164081 container create 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125)
Nov 28 09:33:51 np0005538515.localdomain podman[235347]: 2025-11-28 09:33:51.04874779 +0000 UTC m=+0.048301359 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 09:33:51 np0005538515.localdomain python3[235204]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 28 09:33:51 np0005538515.localdomain sudo[235192]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:51 np0005538515.localdomain sudo[235339]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:51 np0005538515.localdomain sudo[235534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpcissrtwiduqqtktfojadrvrjniqawp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322431.4845784-1423-92375422195765/AnsiballZ_stat.py
Nov 28 09:33:51 np0005538515.localdomain sudo[235534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:51 np0005538515.localdomain python3.9[235536]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:33:51 np0005538515.localdomain sudo[235534]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:52 np0005538515.localdomain sudo[235539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:33:52 np0005538515.localdomain sudo[235539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:33:52 np0005538515.localdomain sudo[235539]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:52 np0005538515.localdomain sudo[235664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-holnltmrpquqolfklfamzeljomjpylmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322432.550426-1450-155843610197875/AnsiballZ_file.py
Nov 28 09:33:52 np0005538515.localdomain sudo[235664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:53 np0005538515.localdomain python3.9[235666]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:53 np0005538515.localdomain sudo[235664]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:53 np0005538515.localdomain sudo[235773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkrgfqhzyywycdaoqkcakovbmsvjyygl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322433.1180668-1450-120375643203824/AnsiballZ_copy.py
Nov 28 09:33:53 np0005538515.localdomain sudo[235773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:53 np0005538515.localdomain python3.9[235775]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322433.1180668-1450-120375643203824/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:33:53 np0005538515.localdomain sudo[235773]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:54 np0005538515.localdomain sudo[235828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovtvdmxoyobvryhkhhbsucbojgggcnrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322433.1180668-1450-120375643203824/AnsiballZ_systemd.py
Nov 28 09:33:54 np0005538515.localdomain sudo[235828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:54 np0005538515.localdomain python3.9[235830]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:33:54 np0005538515.localdomain systemd-rc-local-generator[235855]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:54 np0005538515.localdomain systemd-sysv-generator[235862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:54 np0005538515.localdomain sudo[235828]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:55 np0005538515.localdomain sudo[235921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwomwfatezkdgtyvzadqjfpuuavisimk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322433.1180668-1450-120375643203824/AnsiballZ_systemd.py
Nov 28 09:33:55 np0005538515.localdomain sudo[235921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:55 np0005538515.localdomain python3.9[235923]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:33:55 np0005538515.localdomain systemd-sysv-generator[235956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:33:55 np0005538515.localdomain systemd-rc-local-generator[235953]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:33:55 np0005538515.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: tmp-crun.n2Vm4M.mount: Deactivated successfully.
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:33:56 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79357f94827d1342ad406bdeb4e36d95f97d18c5be3690def0d45192dec0b1fd/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:56 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79357f94827d1342ad406bdeb4e36d95f97d18c5be3690def0d45192dec0b1fd/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:33:56 np0005538515.localdomain podman[235964]: 2025-11-28 09:33:56.090230083 +0000 UTC m=+0.151960811 container init 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + sudo -E kolla_set_configs
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:33:56 np0005538515.localdomain podman[235964]: 2025-11-28 09:33:56.121368686 +0000 UTC m=+0.183099414 container start 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 09:33:56 np0005538515.localdomain podman[235964]: ceilometer_agent_compute
Nov 28 09:33:56 np0005538515.localdomain sudo[235984]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:56 np0005538515.localdomain sudo[235984]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 28 09:33:56 np0005538515.localdomain sudo[235984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:56 np0005538515.localdomain sudo[235921]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Validating config file
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Copying service configuration files
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: INFO:__main__:Writing out command to execute
Nov 28 09:33:56 np0005538515.localdomain sudo[235984]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: ++ cat /run_command
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + ARGS=
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + sudo kolla_copy_cacerts
Nov 28 09:33:56 np0005538515.localdomain podman[235985]: 2025-11-28 09:33:56.218264571 +0000 UTC m=+0.092989306 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:33:56 np0005538515.localdomain sudo[236010]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:56 np0005538515.localdomain sudo[236010]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:56 np0005538515.localdomain sudo[236010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:56 np0005538515.localdomain sudo[236010]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + [[ ! -n '' ]]
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + . kolla_extend_start
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + umask 0022
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:56 np0005538515.localdomain podman[235985]: 2025-11-28 09:33:56.25154275 +0000 UTC m=+0.126267465 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:33:56 np0005538515.localdomain podman[235985]: unhealthy
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:33:56 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.906 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.907 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.908 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.909 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.910 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.911 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.912 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.913 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.914 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.915 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.916 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.917 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.918 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.919 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.920 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.937 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.938 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 09:33:56 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:56.939 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.015 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.076 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.076 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.076 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.076 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.077 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.078 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.079 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.080 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.081 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.085 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.086 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.089 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.090 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.094 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.094 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.097 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.106 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:57.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:33:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63961 DF PROTO=TCP SPT=35358 DPT=9105 SEQ=1280533386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD35CE30000000001030307) 
Nov 28 09:33:58 np0005538515.localdomain sudo[236241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgqaforytlgmrjrbqtylcylcpdzyhdfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322438.2298903-1523-31526952497494/AnsiballZ_systemd.py
Nov 28 09:33:58 np0005538515.localdomain sudo[236241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:33:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63962 DF PROTO=TCP SPT=35358 DPT=9105 SEQ=1280533386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD360FA0000000001030307) 
Nov 28 09:33:58 np0005538515.localdomain python3.9[236243]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:33:58 np0005538515.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Nov 28 09:33:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49539 DF PROTO=TCP SPT=49276 DPT=9882 SEQ=3920736790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3620C0000000001030307) 
Nov 28 09:33:58 np0005538515.localdomain systemd[1]: tmp-crun.jvXDop.mount: Deactivated successfully.
Nov 28 09:33:58 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:58.915 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:59.017 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:59.017 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:59.017 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 28 09:33:59 np0005538515.localdomain virtqemud[227736]: End of file while reading data: Input/output error
Nov 28 09:33:59 np0005538515.localdomain virtqemud[227736]: End of file while reading data: Input/output error
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[235978]: 2025-11-28 09:33:59.028 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: libpod-783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.scope: Deactivated successfully.
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: libpod-783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.scope: Consumed 1.155s CPU time.
Nov 28 09:33:59 np0005538515.localdomain podman[236304]: 2025-11-28 09:33:59.172134301 +0000 UTC m=+0.336641272 container died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.timer: Deactivated successfully.
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:33:59 np0005538515.localdomain podman[236325]: 2025-11-28 09:33:59.254113441 +0000 UTC m=+0.063067702 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:33:59 np0005538515.localdomain podman[236326]: 2025-11-28 09:33:59.264690974 +0000 UTC m=+0.069150827 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:33:59 np0005538515.localdomain podman[236326]: 2025-11-28 09:33:59.296304362 +0000 UTC m=+0.100764225 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:33:59 np0005538515.localdomain podman[236325]: 2025-11-28 09:33:59.313396185 +0000 UTC m=+0.122350436 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:33:59 np0005538515.localdomain podman[236304]: 2025-11-28 09:33:59.389327828 +0000 UTC m=+0.553834749 container cleanup 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:33:59 np0005538515.localdomain podman[236304]: ceilometer_agent_compute
Nov 28 09:33:59 np0005538515.localdomain podman[236372]: 2025-11-28 09:33:59.492483405 +0000 UTC m=+0.072958504 container cleanup 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:33:59 np0005538515.localdomain podman[236372]: ceilometer_agent_compute
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:33:59 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79357f94827d1342ad406bdeb4e36d95f97d18c5be3690def0d45192dec0b1fd/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:59 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79357f94827d1342ad406bdeb4e36d95f97d18c5be3690def0d45192dec0b1fd/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:33:59 np0005538515.localdomain podman[236386]: 2025-11-28 09:33:59.64232584 +0000 UTC m=+0.119852739 container init 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + sudo -E kolla_set_configs
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:59 np0005538515.localdomain sudo[236406]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 09:33:59 np0005538515.localdomain sudo[236406]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:59 np0005538515.localdomain sudo[236406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:33:59 np0005538515.localdomain podman[236386]: 2025-11-28 09:33:59.685108759 +0000 UTC m=+0.162635678 container start 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:33:59 np0005538515.localdomain podman[236386]: ceilometer_agent_compute
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: Started ceilometer_agent_compute container.
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Validating config file
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Copying service configuration files
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: INFO:__main__:Writing out command to execute
Nov 28 09:33:59 np0005538515.localdomain sudo[236406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: ++ cat /run_command
Nov 28 09:33:59 np0005538515.localdomain sudo[236241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + ARGS=
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + sudo kolla_copy_cacerts
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: sudo: unable to send audit message: Operation not permitted
Nov 28 09:33:59 np0005538515.localdomain sudo[236421]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 09:33:59 np0005538515.localdomain sudo[236421]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 09:33:59 np0005538515.localdomain sudo[236421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 28 09:33:59 np0005538515.localdomain sudo[236421]: pam_unix(sudo:session): session closed for user root
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + [[ ! -n '' ]]
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + . kolla_extend_start
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + umask 0022
Nov 28 09:33:59 np0005538515.localdomain ceilometer_agent_compute[236400]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 09:33:59 np0005538515.localdomain podman[236409]: 2025-11-28 09:33:59.760056132 +0000 UTC m=+0.080551436 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 09:33:59 np0005538515.localdomain podman[236409]: 2025-11-28 09:33:59.79264006 +0000 UTC m=+0.113135354 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:33:59 np0005538515.localdomain podman[236409]: unhealthy
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:33:59 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:34:00 np0005538515.localdomain rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 09:34:00 np0005538515.localdomain sudo[236540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umneklcyxzksvwqtegovibcdnfubfnth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322439.95991-1547-65483499259578/AnsiballZ_stat.py
Nov 28 09:34:00 np0005538515.localdomain sudo[236540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:00 np0005538515.localdomain rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 09:34:00 np0005538515.localdomain python3.9[236542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.430 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.430 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.430 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.430 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain sudo[236540]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.431 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.432 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.433 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.434 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.435 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.436 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.437 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.438 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.439 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.440 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.441 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.442 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.443 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.458 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.459 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.460 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.471 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.583 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.583 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.583 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.583 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.584 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.585 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.586 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.587 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.588 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.589 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.590 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.591 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.592 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.593 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.594 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.595 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.596 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.597 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.598 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.599 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.600 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.601 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.605 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.612 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:34:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:34:00 np0005538515.localdomain sudo[236635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxotxwtxtbxpyzfsgbmllohhikvcsvuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322439.95991-1547-65483499259578/AnsiballZ_copy.py
Nov 28 09:34:00 np0005538515.localdomain sudo[236635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:00 np0005538515.localdomain python3.9[236637]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322439.95991-1547-65483499259578/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:34:00 np0005538515.localdomain sudo[236635]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=591 DF PROTO=TCP SPT=50378 DPT=9105 SEQ=3544091224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD36CFA0000000001030307) 
Nov 28 09:34:01 np0005538515.localdomain sudo[236745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzmbunydzhpgbnkrgalhkgxbejcvxwmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322441.4538271-1598-148076999888033/AnsiballZ_container_config_data.py
Nov 28 09:34:01 np0005538515.localdomain sudo[236745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:01 np0005538515.localdomain python3.9[236747]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 28 09:34:01 np0005538515.localdomain sudo[236745]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:02 np0005538515.localdomain sudo[236855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcuhvbpezrmyofkvmueupsmzvckoklcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322442.2670956-1625-62717720348178/AnsiballZ_container_config_hash.py
Nov 28 09:34:02 np0005538515.localdomain sudo[236855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:02 np0005538515.localdomain python3.9[236857]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:34:02 np0005538515.localdomain sudo[236855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:03 np0005538515.localdomain sudo[236965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edflspgdlgvwzassodhsqzmjknwjdxai ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322443.5455573-1654-32754868528632/AnsiballZ_edpm_container_manage.py
Nov 28 09:34:03 np0005538515.localdomain sudo[236965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:04 np0005538515.localdomain python3[236967]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:34:04 np0005538515.localdomain podman[237005]: 
Nov 28 09:34:04 np0005538515.localdomain podman[237005]: 2025-11-28 09:34:04.549909887 +0000 UTC m=+0.062206574 container create 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:04 np0005538515.localdomain podman[237005]: 2025-11-28 09:34:04.522859249 +0000 UTC m=+0.035155926 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 28 09:34:04 np0005538515.localdomain python3[236967]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 28 09:34:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63964 DF PROTO=TCP SPT=35358 DPT=9105 SEQ=1280533386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD378BA0000000001030307) 
Nov 28 09:34:04 np0005538515.localdomain sudo[236965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:05 np0005538515.localdomain sudo[237150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzcvwpoxqgepwrcsloflkwldkhczgylj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322445.5828695-1678-118318227220223/AnsiballZ_stat.py
Nov 28 09:34:05 np0005538515.localdomain sudo[237150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:06 np0005538515.localdomain python3.9[237152]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:34:06 np0005538515.localdomain sudo[237150]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:06 np0005538515.localdomain sudo[237262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deadrhrxialsnioitzsmvllvnjtogokn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322446.4718332-1706-12002284692325/AnsiballZ_file.py
Nov 28 09:34:06 np0005538515.localdomain sudo[237262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:06 np0005538515.localdomain python3.9[237264]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:06 np0005538515.localdomain sudo[237262]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:07 np0005538515.localdomain sudo[237371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmaozjhtctjxzpgfhwzigdcfqhgpqcti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322447.0152671-1706-48334662590504/AnsiballZ_copy.py
Nov 28 09:34:07 np0005538515.localdomain sudo[237371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:07 np0005538515.localdomain python3.9[237373]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322447.0152671-1706-48334662590504/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:07 np0005538515.localdomain sudo[237371]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:07 np0005538515.localdomain sudo[237426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoagkefpqoaamejnagynysrumbentqxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322447.0152671-1706-48334662590504/AnsiballZ_systemd.py
Nov 28 09:34:07 np0005538515.localdomain sudo[237426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:08 np0005538515.localdomain python3.9[237428]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:34:08 np0005538515.localdomain systemd-rc-local-generator[237455]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:08 np0005538515.localdomain systemd-sysv-generator[237459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:08 np0005538515.localdomain sudo[237426]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:08 np0005538515.localdomain sudo[237517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgfzhvmdyinlkfpuhoithggkvssqxqfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322447.0152671-1706-48334662590504/AnsiballZ_systemd.py
Nov 28 09:34:08 np0005538515.localdomain sudo[237517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:09 np0005538515.localdomain python3.9[237519]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:34:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54587 DF PROTO=TCP SPT=55084 DPT=9102 SEQ=285978181 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD38A3A0000000001030307) 
Nov 28 09:34:09 np0005538515.localdomain systemd-rc-local-generator[237547]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:09 np0005538515.localdomain systemd-sysv-generator[237551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: Starting node_exporter container...
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:34:09 np0005538515.localdomain podman[237559]: 2025-11-28 09:34:09.659189929 +0000 UTC m=+0.157672064 container init 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.675Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.675Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.675Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.675Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.675Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.675Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.676Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 09:34:09 np0005538515.localdomain node_exporter[237574]: ts=2025-11-28T09:34:09.677Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:34:09 np0005538515.localdomain podman[237559]: 2025-11-28 09:34:09.69856168 +0000 UTC m=+0.197043785 container start 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:09 np0005538515.localdomain podman[237559]: node_exporter
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: Started node_exporter container.
Nov 28 09:34:09 np0005538515.localdomain sudo[237517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:09 np0005538515.localdomain podman[237583]: 2025-11-28 09:34:09.770839872 +0000 UTC m=+0.071493888 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:09 np0005538515.localdomain podman[237583]: 2025-11-28 09:34:09.778689937 +0000 UTC m=+0.079343913 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:09 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:34:10 np0005538515.localdomain sudo[237713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjabkalsgbjhdchvedbwtaooetvndckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322450.110441-1779-90889241601860/AnsiballZ_systemd.py
Nov 28 09:34:10 np0005538515.localdomain sudo[237713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:10 np0005538515.localdomain python3.9[237715]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:34:10 np0005538515.localdomain systemd[1]: Stopping node_exporter container...
Nov 28 09:34:10 np0005538515.localdomain systemd[1]: libpod-56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.scope: Deactivated successfully.
Nov 28 09:34:10 np0005538515.localdomain podman[237719]: 2025-11-28 09:34:10.863387211 +0000 UTC m=+0.068592437 container died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:34:10 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.timer: Deactivated successfully.
Nov 28 09:34:10 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:34:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686-userdata-shm.mount: Deactivated successfully.
Nov 28 09:34:10 np0005538515.localdomain podman[237719]: 2025-11-28 09:34:10.906724876 +0000 UTC m=+0.111930092 container cleanup 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:34:10 np0005538515.localdomain podman[237719]: node_exporter
Nov 28 09:34:10 np0005538515.localdomain systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 09:34:10 np0005538515.localdomain podman[237744]: 2025-11-28 09:34:10.996591098 +0000 UTC m=+0.052265946 container cleanup 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:10 np0005538515.localdomain podman[237744]: node_exporter
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: Stopped node_exporter container.
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: Starting node_exporter container...
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:34:11 np0005538515.localdomain podman[237755]: 2025-11-28 09:34:11.173217713 +0000 UTC m=+0.136569493 container init 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.188Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.188Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.188Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.188Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.190Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 09:34:11 np0005538515.localdomain node_exporter[237770]: ts=2025-11-28T09:34:11.190Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:34:11 np0005538515.localdomain podman[237755]: 2025-11-28 09:34:11.229277957 +0000 UTC m=+0.192629737 container start 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:34:11 np0005538515.localdomain podman[237755]: node_exporter
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: Started node_exporter container.
Nov 28 09:34:11 np0005538515.localdomain sudo[237713]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:11 np0005538515.localdomain podman[237779]: 2025-11-28 09:34:11.291352509 +0000 UTC m=+0.078914100 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:11 np0005538515.localdomain podman[237779]: 2025-11-28 09:34:11.297268585 +0000 UTC m=+0.084830156 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:34:11 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:34:11 np0005538515.localdomain sudo[237909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fegwzuasjdbenltcfluegwykanwpkuev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322451.4644952-1802-17870683863356/AnsiballZ_stat.py
Nov 28 09:34:11 np0005538515.localdomain sudo[237909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:11 np0005538515.localdomain python3.9[237911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:34:11 np0005538515.localdomain sudo[237909]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:12 np0005538515.localdomain sudo[237997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyqqxwuwgvqcejmuphxqcgnhaqfoxmwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322451.4644952-1802-17870683863356/AnsiballZ_copy.py
Nov 28 09:34:12 np0005538515.localdomain sudo[237997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:12 np0005538515.localdomain python3.9[237999]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322451.4644952-1802-17870683863356/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:34:12 np0005538515.localdomain sudo[237997]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63965 DF PROTO=TCP SPT=35358 DPT=9105 SEQ=1280533386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD398FA0000000001030307) 
Nov 28 09:34:13 np0005538515.localdomain sudo[238107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxplodqhkhnbnqwkdbcvsdizzqveghfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322452.9845386-1853-266006072478255/AnsiballZ_container_config_data.py
Nov 28 09:34:13 np0005538515.localdomain sudo[238107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:13 np0005538515.localdomain python3.9[238109]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 28 09:34:13 np0005538515.localdomain sudo[238107]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:14 np0005538515.localdomain sudo[238217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmvydonmodhgqverdpfmbplhpoxfzlxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322453.7882473-1880-249892852716319/AnsiballZ_container_config_hash.py
Nov 28 09:34:14 np0005538515.localdomain sudo[238217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59816 DF PROTO=TCP SPT=33896 DPT=9101 SEQ=2320142314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD39DE10000000001030307) 
Nov 28 09:34:14 np0005538515.localdomain python3.9[238219]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:34:14 np0005538515.localdomain sudo[238217]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:14 np0005538515.localdomain sudo[238327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaumjlemtpuissqwvcvhnzswcejhavlv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322454.7068775-1910-20821828799649/AnsiballZ_edpm_container_manage.py
Nov 28 09:34:14 np0005538515.localdomain sudo[238327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:15 np0005538515.localdomain python3[238329]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:34:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53415 DF PROTO=TCP SPT=48794 DPT=9100 SEQ=2158328996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3A6FA0000000001030307) 
Nov 28 09:34:17 np0005538515.localdomain podman[238343]: 2025-11-28 09:34:15.348547865 +0000 UTC m=+0.029961629 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 28 09:34:17 np0005538515.localdomain podman[238414]: 
Nov 28 09:34:17 np0005538515.localdomain podman[238414]: 2025-11-28 09:34:17.373427351 +0000 UTC m=+0.084557397 container create d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:34:17 np0005538515.localdomain podman[238414]: 2025-11-28 09:34:17.334879475 +0000 UTC m=+0.046009521 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 28 09:34:17 np0005538515.localdomain python3[238329]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 28 09:34:17 np0005538515.localdomain sudo[238327]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:18 np0005538515.localdomain sudo[238560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjcwikybpwupbrmmsaaovtsyothxjyhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322458.4593596-1934-162022909311024/AnsiballZ_stat.py
Nov 28 09:34:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:34:18 np0005538515.localdomain sudo[238560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:18 np0005538515.localdomain systemd[1]: tmp-crun.HyI5Da.mount: Deactivated successfully.
Nov 28 09:34:18 np0005538515.localdomain podman[238562]: 2025-11-28 09:34:18.890773449 +0000 UTC m=+0.089972225 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:34:18 np0005538515.localdomain podman[238562]: 2025-11-28 09:34:18.905735098 +0000 UTC m=+0.104933824 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:34:18 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:34:18 np0005538515.localdomain python3.9[238563]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:34:19 np0005538515.localdomain sudo[238560]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:19 np0005538515.localdomain sudo[238692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtmlbncwmpyzaanfpdnfgsihpchgztmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.321024-1960-75439462166156/AnsiballZ_file.py
Nov 28 09:34:19 np0005538515.localdomain sudo[238692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:19 np0005538515.localdomain python3.9[238694]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:19 np0005538515.localdomain sudo[238692]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:20 np0005538515.localdomain sudo[238801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myjwsawhqocxnbtwemydlirwoctizvrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.8480713-1960-68330817643105/AnsiballZ_copy.py
Nov 28 09:34:20 np0005538515.localdomain sudo[238801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:20 np0005538515.localdomain python3.9[238803]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322459.8480713-1960-68330817643105/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:34:20 np0005538515.localdomain sudo[238801]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53416 DF PROTO=TCP SPT=48794 DPT=9100 SEQ=2158328996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3B6BA0000000001030307) 
Nov 28 09:34:20 np0005538515.localdomain sudo[238856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgjmizqduphmkugkmonkfyzbcrhwlvux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.8480713-1960-68330817643105/AnsiballZ_systemd.py
Nov 28 09:34:20 np0005538515.localdomain sudo[238856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:21 np0005538515.localdomain python3.9[238858]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:34:21 np0005538515.localdomain systemd-rc-local-generator[238881]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:21 np0005538515.localdomain systemd-sysv-generator[238888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:21 np0005538515.localdomain sudo[238856]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:21 np0005538515.localdomain sudo[238946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkqqnkavtfdipjrewskjeilhvczrhvqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322459.8480713-1960-68330817643105/AnsiballZ_systemd.py
Nov 28 09:34:21 np0005538515.localdomain sudo[238946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:22 np0005538515.localdomain python3.9[238948]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:34:22 np0005538515.localdomain systemd-rc-local-generator[238972]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:34:22 np0005538515.localdomain systemd-sysv-generator[238975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Starting podman_exporter container...
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:34:22 np0005538515.localdomain podman[238988]: 2025-11-28 09:34:22.619052395 +0000 UTC m=+0.176736600 container init d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:22 np0005538515.localdomain podman_exporter[239000]: ts=2025-11-28T09:34:22.640Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 09:34:22 np0005538515.localdomain podman_exporter[239000]: ts=2025-11-28T09:34:22.640Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 09:34:22 np0005538515.localdomain podman_exporter[239000]: ts=2025-11-28T09:34:22.640Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 09:34:22 np0005538515.localdomain podman_exporter[239000]: ts=2025-11-28T09:34:22.640Z caller=handler.go:105 level=info collector=container
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:34:22 np0005538515.localdomain podman[238988]: 2025-11-28 09:34:22.664061563 +0000 UTC m=+0.221745768 container start d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:22 np0005538515.localdomain podman[238988]: podman_exporter
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Starting Podman API Service...
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Started Podman API Service.
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: Started podman_exporter container.
Nov 28 09:34:22 np0005538515.localdomain sudo[238946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:22Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 28 09:34:22 np0005538515.localdomain podman[239011]: 2025-11-28 09:34:22.755784083 +0000 UTC m=+0.086074984 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:22 np0005538515.localdomain podman[239011]: 2025-11-28 09:34:22.767577442 +0000 UTC m=+0.097868353 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:34:22 np0005538515.localdomain podman[239011]: unhealthy
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:34:22 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:22Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:22Z" level=info msg="Setting parallel job count to 25"
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:22Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:22Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:34:22 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 09:34:22 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:22Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:34:23 np0005538515.localdomain sudo[239157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twvwhceyanubkxbjtfpdjlngkiwkyimn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322462.9522288-2033-156183732257/AnsiballZ_systemd.py
Nov 28 09:34:23 np0005538515.localdomain sudo[239157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:23 np0005538515.localdomain python3.9[239159]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:34:23 np0005538515.localdomain systemd[1]: Stopping podman_exporter container...
Nov 28 09:34:23 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:34:22 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 0 "" "Go-http-client/1.1"
Nov 28 09:34:23 np0005538515.localdomain systemd[1]: libpod-d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.scope: Deactivated successfully.
Nov 28 09:34:23 np0005538515.localdomain podman[239163]: 2025-11-28 09:34:23.642587235 +0000 UTC m=+0.050892453 container died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:23 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.timer: Deactivated successfully.
Nov 28 09:34:23 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:34:23 np0005538515.localdomain systemd[1]: tmp-crun.YKwvLr.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7-userdata-shm.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3c501e2e0af2cd58357bae9f85ce4f24459167783b2ef7b06790cc9f118dfa87-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538515.localdomain podman[239163]: 2025-11-28 09:34:27.004985745 +0000 UTC m=+3.413290993 container cleanup d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:34:27 np0005538515.localdomain podman[239163]: podman_exporter
Nov 28 09:34:27 np0005538515.localdomain podman[239175]: 2025-11-28 09:34:27.021195291 +0000 UTC m=+3.359906752 container cleanup d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:34:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35887 DF PROTO=TCP SPT=38958 DPT=9105 SEQ=1038688251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3D2130000000001030307) 
Nov 28 09:34:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:27 np0005538515.localdomain systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 09:34:28 np0005538515.localdomain podman[239190]: 2025-11-28 09:34:28.029691801 +0000 UTC m=+0.070845627 container cleanup d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:34:28 np0005538515.localdomain podman[239190]: podman_exporter
Nov 28 09:34:28 np0005538515.localdomain systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 28 09:34:28 np0005538515.localdomain systemd[1]: Stopped podman_exporter container.
Nov 28 09:34:28 np0005538515.localdomain systemd[1]: Starting podman_exporter container...
Nov 28 09:34:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35888 DF PROTO=TCP SPT=38958 DPT=9105 SEQ=1038688251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3D63A0000000001030307) 
Nov 28 09:34:28 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:34:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:34:28 np0005538515.localdomain podman[239204]: 2025-11-28 09:34:28.73861763 +0000 UTC m=+0.299033856 container init d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:34:28 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:34:28.755Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 09:34:28 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:34:28.755Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 09:34:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:34:28 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 09:34:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:34:28 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:34:28.755Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 09:34:28 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:34:28.755Z caller=handler.go:105 level=info collector=container
Nov 28 09:34:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:34:28 np0005538515.localdomain podman[239204]: 2025-11-28 09:34:28.770202258 +0000 UTC m=+0.330618474 container start d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:34:28 np0005538515.localdomain podman[239204]: podman_exporter
Nov 28 09:34:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53417 DF PROTO=TCP SPT=48794 DPT=9100 SEQ=2158328996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3D6FA0000000001030307) 
Nov 28 09:34:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:34:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:34:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:34:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:30 np0005538515.localdomain systemd[1]: Started podman_exporter container.
Nov 28 09:34:30 np0005538515.localdomain sudo[239157]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:30 np0005538515.localdomain podman[239229]: 2025-11-28 09:34:30.54116939 +0000 UTC m=+1.766452903 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:34:30 np0005538515.localdomain podman[239229]: 2025-11-28 09:34:30.585452236 +0000 UTC m=+1.810735729 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:34:30 np0005538515.localdomain podman[239229]: unhealthy
Nov 28 09:34:30 np0005538515.localdomain podman[239243]: 2025-11-28 09:34:30.628587915 +0000 UTC m=+0.985809861 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 09:34:30 np0005538515.localdomain podman[239263]: 2025-11-28 09:34:30.640562259 +0000 UTC m=+0.745282986 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:34:30 np0005538515.localdomain podman[239242]: 2025-11-28 09:34:30.596181322 +0000 UTC m=+0.957832176 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:34:30 np0005538515.localdomain podman[239243]: 2025-11-28 09:34:30.665574832 +0000 UTC m=+1.022796728 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:34:30 np0005538515.localdomain podman[239263]: 2025-11-28 09:34:30.672402296 +0000 UTC m=+0.777123053 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:34:30 np0005538515.localdomain podman[239263]: unhealthy
Nov 28 09:34:30 np0005538515.localdomain podman[239242]: 2025-11-28 09:34:30.724799746 +0000 UTC m=+1.086450600 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 09:34:31 np0005538515.localdomain sudo[239417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqoutghevkfcdxkbfnonpvqgutxooken ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322470.7686832-2056-190678931330741/AnsiballZ_stat.py
Nov 28 09:34:31 np0005538515.localdomain sudo[239417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:31 np0005538515.localdomain python3.9[239419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:34:31 np0005538515.localdomain sudo[239417]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:31 np0005538515.localdomain sudo[239505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsibkhrcaghqzctxpdyymivnloovkhje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322470.7686832-2056-190678931330741/AnsiballZ_copy.py
Nov 28 09:34:31 np0005538515.localdomain sudo[239505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:31 np0005538515.localdomain python3.9[239507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322470.7686832-2056-190678931330741/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:34:31 np0005538515.localdomain sudo[239505]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45666 DF PROTO=TCP SPT=50444 DPT=9882 SEQ=1755845825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3E33A0000000001030307) 
Nov 28 09:34:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538515.localdomain sudo[239615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmfgcqruwlyzgygbabjvuzjiohkskixf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322472.4589553-2108-130373697177958/AnsiballZ_container_config_data.py
Nov 28 09:34:32 np0005538515.localdomain sudo[239615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:32 np0005538515.localdomain python3.9[239617]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 28 09:34:32 np0005538515.localdomain sudo[239615]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:34:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:33 np0005538515.localdomain sudo[239726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivdvcbonotgliczgibkucnrpdlctxpeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322473.2694314-2135-126147729448510/AnsiballZ_container_config_hash.py
Nov 28 09:34:33 np0005538515.localdomain sudo[239726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:33 np0005538515.localdomain python3.9[239728]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:34:33 np0005538515.localdomain sudo[239726]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:34 np0005538515.localdomain sudo[239836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbqibswgxdniptsxortpjqujsjpgabdx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322474.2878008-2164-231810124452312/AnsiballZ_edpm_container_manage.py
Nov 28 09:34:34 np0005538515.localdomain sudo[239836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:34:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35890 DF PROTO=TCP SPT=38958 DPT=9105 SEQ=1038688251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3EDFA0000000001030307) 
Nov 28 09:34:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:34 np0005538515.localdomain python3[239838]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:34:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:34:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:34:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44339 DF PROTO=TCP SPT=42168 DPT=9102 SEQ=4122354115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD3FF3A0000000001030307) 
Nov 28 09:34:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:34:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3584b97526356ef5b6642175730f52565e80737460bbd74a6e31729b79699070-merged.mount: Deactivated successfully.
Nov 28 09:34:41 np0005538515.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:41 np0005538515.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:41 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:41Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/merged: invalid argument"
Nov 28 09:34:41 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:41Z" level=error msg="Getting root fs size for \"2a044fcb236ee7f5542f44e64fe793d77e8763efce6900a93b0314c6ec72e94b\": getting diffsize of layer \"e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": creating overlay mount to /var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/YXI27YR2A6GRINAQ3FVOLZKW4B:/var/lib/containers/storage/overlay/l/5NF4JPSMHF7QZY565YBUJ3HOZG:/var/lib/containers/storage/overlay/l/4XMODH4ZBUQAKWFSLQ6LSJ6RSJ:/var/lib/containers/storage/overlay/l/XRMYXCJ3MORFVFM2M5OYFMELKD,upperdir=/var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/diff,workdir=/var/lib/containers/storage/overlay/e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28/work,nodev,metacopy=on\": no such file or directory"
Nov 28 09:34:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:34:42 np0005538515.localdomain podman[239876]: 2025-11-28 09:34:42.155008639 +0000 UTC m=+0.248791584 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:34:42 np0005538515.localdomain podman[239876]: 2025-11-28 09:34:42.171535536 +0000 UTC m=+0.265318531 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:34:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35891 DF PROTO=TCP SPT=38958 DPT=9105 SEQ=1038688251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD40EFA0000000001030307) 
Nov 28 09:34:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-25ffb2916f839ff1700adaff5cb29e97302d49ba8ff980d3124f389d659473a3-merged.mount: Deactivated successfully.
Nov 28 09:34:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:44 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45668 DF PROTO=TCP SPT=50444 DPT=9882 SEQ=1755845825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD412FA0000000001030307) 
Nov 28 09:34:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:44 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:34:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:46 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:46.391 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:46.407 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:46.407 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:46.407 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:46.407 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30037 DF PROTO=TCP SPT=48708 DPT=9100 SEQ=849604836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD41C3A0000000001030307) 
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.073 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.073 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.086 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.086 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.086 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.086 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.086 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.104 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.104 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.104 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.105 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.105 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:34:47 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:47 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:34:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c8beb7bd0728a5185ae08edcb0afeede0750b5c1acd8c5a453f776b712778919-merged.mount: Deactivated successfully.
Nov 28 09:34:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.560 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:34:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.754 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.756 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13208MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.756 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.757 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.823 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:34:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:47.823 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:34:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:48.065 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:34:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:48.533 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:34:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:48.539 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:34:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:48.560 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:34:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:48.562 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:34:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:34:48.562 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:34:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:34:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:49 np0005538515.localdomain systemd[1]: tmp-crun.RYxLPG.mount: Deactivated successfully.
Nov 28 09:34:49 np0005538515.localdomain podman[239955]: 2025-11-28 09:34:49.503339934 +0000 UTC m=+0.115423451 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 09:34:49 np0005538515.localdomain podman[239955]: 2025-11-28 09:34:49.539166035 +0000 UTC m=+0.151249542 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:34:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:34:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30038 DF PROTO=TCP SPT=48708 DPT=9100 SEQ=849604836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD42BFA0000000001030307) 
Nov 28 09:34:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:34:50.814 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:34:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:34:50.814 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:34:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:34:50.814 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:34:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3584b97526356ef5b6642175730f52565e80737460bbd74a6e31729b79699070-merged.mount: Deactivated successfully.
Nov 28 09:34:51 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:51 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:34:52 np0005538515.localdomain sudo[239986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:34:52 np0005538515.localdomain sudo[239986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:34:52 np0005538515.localdomain sudo[239986]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:52 np0005538515.localdomain sudo[240004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:34:52 np0005538515.localdomain sudo[240004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:34:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:34:53 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:53 np0005538515.localdomain podman[239851]: 2025-11-28 09:34:35.234439484 +0000 UTC m=+0.052966588 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 28 09:34:54 np0005538515.localdomain sudo[240004]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:54 np0005538515.localdomain sudo[240053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:34:54 np0005538515.localdomain sudo[240053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:34:54 np0005538515.localdomain sudo[240053]: pam_unix(sudo:session): session closed for user root
Nov 28 09:34:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:34:55 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538515.localdomain podman[239012]: time="2025-11-28T09:34:57Z" level=error msg="Getting root fs size for \"2e97d8a51625064e6f06a470ec8e1c443497ab99753302611140ab63dcf05711\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Nov 28 09:34:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24827 DF PROTO=TCP SPT=47076 DPT=9105 SEQ=3152718205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD447430000000001030307) 
Nov 28 09:34:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:34:57 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:57 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:34:57 np0005538515.localdomain podman[240089]: 2025-11-28 09:34:57.122951766 +0000 UTC m=+0.038284749 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 28 09:34:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24828 DF PROTO=TCP SPT=47076 DPT=9105 SEQ=3152718205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD44B3A0000000001030307) 
Nov 28 09:34:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50502 DF PROTO=TCP SPT=57680 DPT=9882 SEQ=3572996585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD44C6C0000000001030307) 
Nov 28 09:34:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037-merged.mount: Deactivated successfully.
Nov 28 09:35:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037-merged.mount: Deactivated successfully.
Nov 28 09:35:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:35:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-25ffb2916f839ff1700adaff5cb29e97302d49ba8ff980d3124f389d659473a3-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63967 DF PROTO=TCP SPT=35358 DPT=9105 SEQ=1280533386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD456FA0000000001030307) 
Nov 28 09:35:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-77782651c01fa3d8af8a79c02d3312e7fed09a9087964da1a7c959a65a9214b8-merged.mount: Deactivated successfully.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:35:03 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:03 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:03 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:03 np0005538515.localdomain podman[240104]: 2025-11-28 09:35:03.370984519 +0000 UTC m=+0.187758385 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:35:03 np0005538515.localdomain podman[240104]: 2025-11-28 09:35:03.382514209 +0000 UTC m=+0.199288005 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 28 09:35:03 np0005538515.localdomain podman[240102]: 2025-11-28 09:35:03.354546744 +0000 UTC m=+0.172265439 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:35:03 np0005538515.localdomain podman[240102]: 2025-11-28 09:35:03.440494173 +0000 UTC m=+0.258212848 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:35:03 np0005538515.localdomain podman[240102]: unhealthy
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c8beb7bd0728a5185ae08edcb0afeede0750b5c1acd8c5a453f776b712778919-merged.mount: Deactivated successfully.
Nov 28 09:35:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24830 DF PROTO=TCP SPT=47076 DPT=9105 SEQ=3152718205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD462FA0000000001030307) 
Nov 28 09:35:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:05 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:35:05 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:05 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:05 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:05 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:35:05 np0005538515.localdomain podman[240103]: 2025-11-28 09:35:05.998959002 +0000 UTC m=+2.815830081 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:35:06 np0005538515.localdomain podman[240103]: 2025-11-28 09:35:06.037829228 +0000 UTC m=+2.854700277 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:35:06 np0005538515.localdomain podman[240105]: 2025-11-28 09:35:06.055095718 +0000 UTC m=+2.866883928 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:35:06 np0005538515.localdomain podman[240105]: 2025-11-28 09:35:06.067510607 +0000 UTC m=+2.879298837 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:35:06 np0005538515.localdomain podman[240105]: unhealthy
Nov 28 09:35:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:07 np0005538515.localdomain podman[240089]: 
Nov 28 09:35:07 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:35:07 np0005538515.localdomain podman[240089]: 2025-11-28 09:35:07.846289295 +0000 UTC m=+10.761622238 container create 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:35:07 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:07 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:35:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-92a670f87546e9222dc3530777cbcbb6bd2a424665ad22aef150e174bea9c765-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:08 np0005538515.localdomain python3[239838]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 28 09:35:08 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60077 DF PROTO=TCP SPT=41392 DPT=9102 SEQ=2698449811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4747A0000000001030307) 
Nov 28 09:35:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:35:09 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:10 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:10 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:10 np0005538515.localdomain sudo[239836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:10 np0005538515.localdomain sudo[240311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saplweijowsxxnvsyonscmbcmzvqoonb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322510.7131011-2190-117107368722100/AnsiballZ_stat.py
Nov 28 09:35:10 np0005538515.localdomain sudo[240311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:11 np0005538515.localdomain python3.9[240313]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:35:11 np0005538515.localdomain sudo[240311]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:35:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38-merged.mount: Deactivated successfully.
Nov 28 09:35:11 np0005538515.localdomain sudo[240423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waxjxgykinhblqwexemvbwmhcdquiuem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322511.5889275-2215-41136068204465/AnsiballZ_file.py
Nov 28 09:35:11 np0005538515.localdomain sudo[240423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:12 np0005538515.localdomain python3.9[240425]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:35:12 np0005538515.localdomain sudo[240423]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:12 np0005538515.localdomain sudo[240532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvovmovplbpqvizrnkpcydlmdsxwksza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322512.1301556-2215-188458169425784/AnsiballZ_copy.py
Nov 28 09:35:12 np0005538515.localdomain sudo[240532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:12 np0005538515.localdomain python3.9[240534]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322512.1301556-2215-188458169425784/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:35:12 np0005538515.localdomain sudo[240532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24831 DF PROTO=TCP SPT=47076 DPT=9105 SEQ=3152718205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD482FA0000000001030307) 
Nov 28 09:35:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-58fd8127cd9da7f4875f0be3b8ee189ddf406ac3663dca02ef65d88f989bc037-merged.mount: Deactivated successfully.
Nov 28 09:35:13 np0005538515.localdomain sudo[240587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygxfkyqipehizurfiwrxbrzfholfixvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322512.1301556-2215-188458169425784/AnsiballZ_systemd.py
Nov 28 09:35:13 np0005538515.localdomain sudo[240587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:13 np0005538515.localdomain python3.9[240589]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:35:13 np0005538515.localdomain systemd-rc-local-generator[240611]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:35:13 np0005538515.localdomain systemd-sysv-generator[240618]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:13 np0005538515.localdomain sudo[240587]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:13 np0005538515.localdomain sudo[240677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kogjbjbdrpqgroiknfxhroaplfhiacsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322512.1301556-2215-188458169425784/AnsiballZ_systemd.py
Nov 28 09:35:13 np0005538515.localdomain sudo[240677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42877 DF PROTO=TCP SPT=39300 DPT=9101 SEQ=2931178254 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD488410000000001030307) 
Nov 28 09:35:14 np0005538515.localdomain python3.9[240679]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:35:14 np0005538515.localdomain systemd-sysv-generator[240707]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:35:14 np0005538515.localdomain systemd-rc-local-generator[240704]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: Starting openstack_network_exporter container...
Nov 28 09:35:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:14 np0005538515.localdomain podman[240719]: 2025-11-28 09:35:14.800477957 +0000 UTC m=+0.143516234 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:35:14 np0005538515.localdomain podman[240719]: 2025-11-28 09:35:14.839513357 +0000 UTC m=+0.182551654 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:35:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:16 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:35:16 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:35:16 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5be0dafad38c026533af5bb8627b1fd421c08a935c7735a8b660de0bf6d8ee84/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:16 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5be0dafad38c026533af5bb8627b1fd421c08a935c7735a8b660de0bf6d8ee84/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:35:16 np0005538515.localdomain podman[240721]: 2025-11-28 09:35:16.257698362 +0000 UTC m=+1.594410230 container init 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *bridge.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *coverage.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *datapath.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *iface.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *memory.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *ovnnorthd.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *ovn.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *ovsdbserver.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *pmd_perf.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *pmd_rxq.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: INFO    09:35:16 main.go:48: registering *vswitch.Collector
Nov 28 09:35:16 np0005538515.localdomain openstack_network_exporter[240755]: NOTICE  09:35:16 main.go:82: listening on http://:9105/metrics
Nov 28 09:35:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:35:16 np0005538515.localdomain podman[240721]: 2025-11-28 09:35:16.297112645 +0000 UTC m=+1.633824533 container start 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 28 09:35:16 np0005538515.localdomain podman[240721]: openstack_network_exporter
Nov 28 09:35:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63619 DF PROTO=TCP SPT=42538 DPT=9100 SEQ=3505838984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4917A0000000001030307) 
Nov 28 09:35:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:17 np0005538515.localdomain systemd[1]: Started openstack_network_exporter container.
Nov 28 09:35:17 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:17 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:17 np0005538515.localdomain sudo[240677]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:17 np0005538515.localdomain podman[240765]: 2025-11-28 09:35:17.343574195 +0000 UTC m=+1.043112636 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:35:17 np0005538515.localdomain podman[240765]: 2025-11-28 09:35:17.367006084 +0000 UTC m=+1.066544515 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:35:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:18 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:35:18 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:18 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-77782651c01fa3d8af8a79c02d3312e7fed09a9087964da1a7c959a65a9214b8-merged.mount: Deactivated successfully.
Nov 28 09:35:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63620 DF PROTO=TCP SPT=42538 DPT=9100 SEQ=3505838984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4A13A0000000001030307) 
Nov 28 09:35:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-92a670f87546e9222dc3530777cbcbb6bd2a424665ad22aef150e174bea9c765-merged.mount: Deactivated successfully.
Nov 28 09:35:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538515.localdomain sudo[240895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrwkotaodaxxvdiqahwwebwuduojwyfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322521.173941-2288-41467056061979/AnsiballZ_systemd.py
Nov 28 09:35:21 np0005538515.localdomain sudo[240895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:35:21 np0005538515.localdomain podman[240898]: 2025-11-28 09:35:21.604254136 +0000 UTC m=+0.098752894 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:35:21 np0005538515.localdomain podman[240898]: 2025-11-28 09:35:21.622695437 +0000 UTC m=+0.117194175 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 09:35:21 np0005538515.localdomain python3.9[240897]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:35:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0-merged.mount: Deactivated successfully.
Nov 28 09:35:21 np0005538515.localdomain systemd[1]: Stopping openstack_network_exporter container...
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: libpod-6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.scope: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain podman[240917]: 2025-11-28 09:35:22.382131571 +0000 UTC m=+0.527957489 container stop 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:35:22 np0005538515.localdomain podman[240917]: 2025-11-28 09:35:22.412780997 +0000 UTC m=+0.558606935 container died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.timer: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf-userdata-shm.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538515.localdomain podman[240917]: 2025-11-28 09:35:23.320506365 +0000 UTC m=+1.466332263 container cleanup 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 28 09:35:23 np0005538515.localdomain podman[240917]: openstack_network_exporter
Nov 28 09:35:23 np0005538515.localdomain podman[240932]: 2025-11-28 09:35:23.341333582 +0000 UTC m=+0.946285224 container cleanup 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:35:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5be0dafad38c026533af5bb8627b1fd421c08a935c7735a8b660de0bf6d8ee84-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:35:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-78fa405dc0dc1392ab1503db8559712d0c057956ae06af0d32d5b9d343fe4a38-merged.mount: Deactivated successfully.
Nov 28 09:35:24 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:24 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:24 np0005538515.localdomain systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 09:35:24 np0005538515.localdomain podman[240946]: 2025-11-28 09:35:24.1072361 +0000 UTC m=+0.052923169 container cleanup 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:35:24 np0005538515.localdomain podman[240946]: openstack_network_exporter
Nov 28 09:35:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:26 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:26 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:26 np0005538515.localdomain systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 28 09:35:26 np0005538515.localdomain systemd[1]: Stopped openstack_network_exporter container.
Nov 28 09:35:26 np0005538515.localdomain systemd[1]: Starting openstack_network_exporter container...
Nov 28 09:35:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49613 DF PROTO=TCP SPT=54566 DPT=9105 SEQ=3331576412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4BC730000000001030307) 
Nov 28 09:35:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49614 DF PROTO=TCP SPT=54566 DPT=9105 SEQ=3331576412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4C07A0000000001030307) 
Nov 28 09:35:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63621 DF PROTO=TCP SPT=42538 DPT=9100 SEQ=3505838984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4C0FA0000000001030307) 
Nov 28 09:35:28 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:35:28 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5be0dafad38c026533af5bb8627b1fd421c08a935c7735a8b660de0bf6d8ee84/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:28 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5be0dafad38c026533af5bb8627b1fd421c08a935c7735a8b660de0bf6d8ee84/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 09:35:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:35:28 np0005538515.localdomain podman[240959]: 2025-11-28 09:35:28.872802883 +0000 UTC m=+2.150699512 container init 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter)
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *bridge.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *coverage.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *datapath.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *iface.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *memory.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *ovnnorthd.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *ovn.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *ovsdbserver.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *pmd_perf.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *pmd_rxq.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: INFO    09:35:28 main.go:48: registering *vswitch.Collector
Nov 28 09:35:28 np0005538515.localdomain openstack_network_exporter[240973]: NOTICE  09:35:28 main.go:82: listening on http://:9105/metrics
Nov 28 09:35:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:35:28 np0005538515.localdomain podman[240959]: 2025-11-28 09:35:28.924113171 +0000 UTC m=+2.202009770 container start 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm)
Nov 28 09:35:28 np0005538515.localdomain podman[240959]: openstack_network_exporter
Nov 28 09:35:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:35:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-59a9aab44f9035abb8c665d33ff9cced94beb2960f79a3a6871873d8e649ac58-merged.mount: Deactivated successfully.
Nov 28 09:35:29 np0005538515.localdomain systemd[1]: Started openstack_network_exporter container.
Nov 28 09:35:29 np0005538515.localdomain sudo[240895]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:30 np0005538515.localdomain podman[240983]: 2025-11-28 09:35:30.000432232 +0000 UTC m=+1.082058113 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Nov 28 09:35:30 np0005538515.localdomain podman[240983]: 2025-11-28 09:35:30.009131936 +0000 UTC m=+1.090757797 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 28 09:35:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:30 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:35:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35893 DF PROTO=TCP SPT=38958 DPT=9105 SEQ=1038688251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4CCFA0000000001030307) 
Nov 28 09:35:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538515.localdomain sudo[241109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tawdrcnncajchxmxybbldbqyjbqubmly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322533.2055457-2312-101445637908082/AnsiballZ_find.py
Nov 28 09:35:33 np0005538515.localdomain sudo[241109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:35:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-95337b0ee1bc2060bec425b9be63b35b01d68f1de2bac6065e353d72be5388e0-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:33 np0005538515.localdomain python3.9[241111]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:35:33 np0005538515.localdomain sudo[241109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:35:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:35:34 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:34 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49616 DF PROTO=TCP SPT=54566 DPT=9105 SEQ=3331576412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4D83A0000000001030307) 
Nov 28 09:35:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:35:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:35:36 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:36 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:36 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:35:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:35:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-93bf0314bbd4063198be021c760bb47b8172c6cfa3163da2b90a6f202605824f-merged.mount: Deactivated successfully.
Nov 28 09:35:36 np0005538515.localdomain systemd[1]: tmp-crun.C6sC7G.mount: Deactivated successfully.
Nov 28 09:35:36 np0005538515.localdomain podman[241130]: 2025-11-28 09:35:36.503806843 +0000 UTC m=+0.102317935 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 28 09:35:36 np0005538515.localdomain podman[241130]: 2025-11-28 09:35:36.537544077 +0000 UTC m=+0.136055139 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 09:35:36 np0005538515.localdomain podman[241130]: unhealthy
Nov 28 09:35:36 np0005538515.localdomain podman[241131]: 2025-11-28 09:35:36.553262672 +0000 UTC m=+0.150806904 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:35:36 np0005538515.localdomain podman[241131]: 2025-11-28 09:35:36.583208986 +0000 UTC m=+0.180753208 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:35:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:35:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:35:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:38 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:38 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:35:38 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:35:38 np0005538515.localdomain podman[241163]: 2025-11-28 09:35:38.767167336 +0000 UTC m=+0.864998352 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 09:35:38 np0005538515.localdomain podman[241164]: 2025-11-28 09:35:38.843185912 +0000 UTC m=+0.939135699 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:35:38 np0005538515.localdomain podman[241163]: 2025-11-28 09:35:38.850315757 +0000 UTC m=+0.948146733 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 28 09:35:38 np0005538515.localdomain podman[241164]: 2025-11-28 09:35:38.906929081 +0000 UTC m=+1.002878918 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:35:38 np0005538515.localdomain podman[241164]: unhealthy
Nov 28 09:35:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37191 DF PROTO=TCP SPT=49690 DPT=9102 SEQ=2840877608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4E9BD0000000001030307) 
Nov 28 09:35:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:35:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-59a9aab44f9035abb8c665d33ff9cced94beb2960f79a3a6871873d8e649ac58-merged.mount: Deactivated successfully.
Nov 28 09:35:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:41 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:35:41 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:35:41 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:35:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49617 DF PROTO=TCP SPT=54566 DPT=9105 SEQ=3331576412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4F8FA0000000001030307) 
Nov 28 09:35:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:35:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12054 DF PROTO=TCP SPT=39340 DPT=9882 SEQ=4066141312 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD4FCFA0000000001030307) 
Nov 28 09:35:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:35:46 np0005538515.localdomain podman[241208]: 2025-11-28 09:35:46.499887373 +0000 UTC m=+0.104763943 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:35:46 np0005538515.localdomain podman[241208]: 2025-11-28 09:35:46.529990041 +0000 UTC m=+0.134866591 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:35:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:46.550 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53433 DF PROTO=TCP SPT=40716 DPT=9100 SEQ=3842616379 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD506BA0000000001030307) 
Nov 28 09:35:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Nov 28 09:35:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-93bf0314bbd4063198be021c760bb47b8172c6cfa3163da2b90a6f202605824f-merged.mount: Deactivated successfully.
Nov 28 09:35:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.097 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.097 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.097 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.098 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.098 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:35:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:35:47 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.567 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.755 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.756 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13211MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.756 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.756 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.836 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.836 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:35:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:47.861 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:35:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:48.331 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:35:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:48.338 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:35:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:48.363 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:35:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:48.366 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:35:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:48.366 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:35:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:35:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:35:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.361 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.362 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.362 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.362 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.375 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.375 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.375 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:35:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:35:49.376 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:35:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:35:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53434 DF PROTO=TCP SPT=40716 DPT=9100 SEQ=3842616379 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5167A0000000001030307) 
Nov 28 09:35:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:35:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:35:50.815 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:35:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:35:50.815 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:35:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:35:50.816 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:35:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:35:51 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:35:52 np0005538515.localdomain systemd[1]: tmp-crun.Z9mRrW.mount: Deactivated successfully.
Nov 28 09:35:52 np0005538515.localdomain podman[241274]: 2025-11-28 09:35:52.978980639 +0000 UTC m=+0.087138307 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 09:35:52 np0005538515.localdomain podman[241274]: 2025-11-28 09:35:52.989967015 +0000 UTC m=+0.098124653 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 09:35:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:54 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:35:54 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:54 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:35:54 np0005538515.localdomain sudo[241292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:35:54 np0005538515.localdomain sudo[241292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:35:54 np0005538515.localdomain sudo[241292]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:54 np0005538515.localdomain sudo[241310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:35:54 np0005538515.localdomain sudo[241310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:35:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:35:56 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:56 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:35:56 np0005538515.localdomain sudo[241310]: pam_unix(sudo:session): session closed for user root
Nov 28 09:35:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:35:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24064 DF PROTO=TCP SPT=33840 DPT=9105 SEQ=2313935200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD531A20000000001030307) 
Nov 28 09:35:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:35:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24065 DF PROTO=TCP SPT=33840 DPT=9105 SEQ=2313935200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD535BA0000000001030307) 
Nov 28 09:35:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29863 DF PROTO=TCP SPT=36170 DPT=9882 SEQ=314207492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD536CC0000000001030307) 
Nov 28 09:35:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c649278c2e5a474424d7d5698a840ae7cdf6b8243f9150d8a362719bce70699a-merged.mount: Deactivated successfully.
Nov 28 09:35:59 np0005538515.localdomain sudo[241359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:35:59 np0005538515.localdomain sudo[241359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:35:59 np0005538515.localdomain sudo[241359]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.613 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.613 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.613 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.613 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:36:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:36:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:36:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b290307da7690cf991f1186b07b34a264d1d07b861913129e99370229181e3a2-merged.mount: Deactivated successfully.
Nov 28 09:36:00 np0005538515.localdomain podman[241377]: 2025-11-28 09:36:00.9393327 +0000 UTC m=+0.086599701 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Nov 28 09:36:00 np0005538515.localdomain podman[241377]: 2025-11-28 09:36:00.953453495 +0000 UTC m=+0.100720486 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64)
Nov 28 09:36:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:36:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:36:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2-merged.mount: Deactivated successfully.
Nov 28 09:36:01 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:36:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24833 DF PROTO=TCP SPT=47076 DPT=9105 SEQ=3152718205 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD540FA0000000001030307) 
Nov 28 09:36:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:36:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:36:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:36:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:36:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:36:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d-merged.mount: Deactivated successfully.
Nov 28 09:36:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24067 DF PROTO=TCP SPT=33840 DPT=9105 SEQ=2313935200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD54D7A0000000001030307) 
Nov 28 09:36:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:36:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:36:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b2363e42c8cc93f560c242c278a1b76f810df60301763e880790aefc5b17b52f-merged.mount: Deactivated successfully.
Nov 28 09:36:08 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:36:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:36:08 np0005538515.localdomain podman[241397]: 2025-11-28 09:36:08.980144805 +0000 UTC m=+0.093393815 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:36:09 np0005538515.localdomain podman[241396]: 2025-11-28 09:36:09.046493147 +0000 UTC m=+0.163025419 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:36:09 np0005538515.localdomain podman[241397]: 2025-11-28 09:36:09.064110962 +0000 UTC m=+0.177360011 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:36:09 np0005538515.localdomain podman[241396]: 2025-11-28 09:36:09.076226424 +0000 UTC m=+0.192758716 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:36:09 np0005538515.localdomain podman[241396]: unhealthy
Nov 28 09:36:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57651 DF PROTO=TCP SPT=35162 DPT=9102 SEQ=3386133729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD55EFA0000000001030307) 
Nov 28 09:36:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:36:11 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:11 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:11 np0005538515.localdomain podman[241431]: 2025-11-28 09:36:11.276051254 +0000 UTC m=+0.082514192 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:36:11 np0005538515.localdomain podman[241431]: 2025-11-28 09:36:11.334491696 +0000 UTC m=+0.140954684 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:36:11 np0005538515.localdomain podman[241432]: 2025-11-28 09:36:11.339283026 +0000 UTC m=+0.143299877 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:36:11 np0005538515.localdomain podman[241432]: 2025-11-28 09:36:11.423605624 +0000 UTC m=+0.227622465 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:36:11 np0005538515.localdomain podman[241432]: unhealthy
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24068 DF PROTO=TCP SPT=33840 DPT=9105 SEQ=2313935200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD56CFA0000000001030307) 
Nov 28 09:36:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:13 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:36:13 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:13 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:13 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:13 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:36:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9554 DF PROTO=TCP SPT=44628 DPT=9101 SEQ=769887207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD572A00000000001030307) 
Nov 28 09:36:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:36:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b290307da7690cf991f1186b07b34a264d1d07b861913129e99370229181e3a2-merged.mount: Deactivated successfully.
Nov 28 09:36:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55048 DF PROTO=TCP SPT=37938 DPT=9100 SEQ=3728342459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD57BFB0000000001030307) 
Nov 28 09:36:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:36:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:17 np0005538515.localdomain systemd[1]: tmp-crun.VnhZuq.mount: Deactivated successfully.
Nov 28 09:36:18 np0005538515.localdomain podman[241476]: 2025-11-28 09:36:18.0057864 +0000 UTC m=+0.111287178 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:36:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:36:18 np0005538515.localdomain podman[241476]: 2025-11-28 09:36:18.041581989 +0000 UTC m=+0.147082807 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:36:18 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:36:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8-merged.mount: Deactivated successfully.
Nov 28 09:36:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:36:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55049 DF PROTO=TCP SPT=37938 DPT=9100 SEQ=3728342459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD58BBB0000000001030307) 
Nov 28 09:36:20 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:20 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:36:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:36:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b2363e42c8cc93f560c242c278a1b76f810df60301763e880790aefc5b17b52f-merged.mount: Deactivated successfully.
Nov 28 09:36:24 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:24 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:24 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:36:24 np0005538515.localdomain podman[241499]: 2025-11-28 09:36:24.556675659 +0000 UTC m=+0.076974668 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 09:36:24 np0005538515.localdomain podman[241499]: 2025-11-28 09:36:24.567967154 +0000 UTC m=+0.088266233 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 28 09:36:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:26 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:36:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10883 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2277982890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5A6D30000000001030307) 
Nov 28 09:36:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10884 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2277982890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5AAFA0000000001030307) 
Nov 28 09:36:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:36:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:28 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62685 DF PROTO=TCP SPT=43380 DPT=9882 SEQ=503015964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5ABFC0000000001030307) 
Nov 28 09:36:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7c29bfa5a0679179b90046634e87037ab6ff6f22b5fa7106d9841b0f8caae33b-merged.mount: Deactivated successfully.
Nov 28 09:36:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:36:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49619 DF PROTO=TCP SPT=54566 DPT=9105 SEQ=3331576412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5B6FB0000000001030307) 
Nov 28 09:36:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:36:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2465f602934c9b49eec4e0598b6266084474df0f2da0f1de92a72390c7a9be21-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2465f602934c9b49eec4e0598b6266084474df0f2da0f1de92a72390c7a9be21-merged.mount: Deactivated successfully.
Nov 28 09:36:31 np0005538515.localdomain podman[241516]: 2025-11-28 09:36:31.880156853 +0000 UTC m=+0.087859689 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:36:31 np0005538515.localdomain podman[241516]: 2025-11-28 09:36:31.91011141 +0000 UTC m=+0.117814226 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Nov 28 09:36:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0f50e6e193badeb95447e2c9ef73121ac91dbd5780ab99ca29933bd60e5eb8a8-merged.mount: Deactivated successfully.
Nov 28 09:36:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:34 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:36:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10886 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2277982890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5C2BA0000000001030307) 
Nov 28 09:36:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:36:36 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:36 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:36:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:36:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2810 DF PROTO=TCP SPT=59150 DPT=9102 SEQ=2295612562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5D3FA0000000001030307) 
Nov 28 09:36:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0818f18800ef844a6a48c8a7ece9ae523d5aa6f809095a5eb180d408e8d636c4-merged.mount: Deactivated successfully.
Nov 28 09:36:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0818f18800ef844a6a48c8a7ece9ae523d5aa6f809095a5eb180d408e8d636c4-merged.mount: Deactivated successfully.
Nov 28 09:36:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:36:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:36:41 np0005538515.localdomain podman[241533]: 2025-11-28 09:36:41.316855912 +0000 UTC m=+0.075258962 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:36:41 np0005538515.localdomain podman[241533]: 2025-11-28 09:36:41.405566668 +0000 UTC m=+0.163969668 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:36:41 np0005538515.localdomain podman[241533]: unhealthy
Nov 28 09:36:41 np0005538515.localdomain podman[241534]: 2025-11-28 09:36:41.40747724 +0000 UTC m=+0.162992267 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:36:41 np0005538515.localdomain podman[241534]: 2025-11-28 09:36:41.487259007 +0000 UTC m=+0.242774094 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7c29bfa5a0679179b90046634e87037ab6ff6f22b5fa7106d9841b0f8caae33b-merged.mount: Deactivated successfully.
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:42 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:36:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10887 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2277982890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5E2FA0000000001030307) 
Nov 28 09:36:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:36:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:36:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:36:43 np0005538515.localdomain podman[241568]: 2025-11-28 09:36:43.886591054 +0000 UTC m=+0.087872870 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:36:43 np0005538515.localdomain podman[241568]: 2025-11-28 09:36:43.918241196 +0000 UTC m=+0.119523032 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:36:43 np0005538515.localdomain podman[241568]: unhealthy
Nov 28 09:36:43 np0005538515.localdomain podman[241567]: 2025-11-28 09:36:43.931871207 +0000 UTC m=+0.135028603 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:36:43 np0005538515.localdomain podman[241567]: 2025-11-28 09:36:43.998610033 +0000 UTC m=+0.201767399 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: tmp-crun.0psnv7.mount: Deactivated successfully.
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8821 DF PROTO=TCP SPT=48714 DPT=9101 SEQ=4107005301 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5E7D10000000001030307) 
Nov 28 09:36:44 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:36:44 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:36:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:45.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2465f602934c9b49eec4e0598b6266084474df0f2da0f1de92a72390c7a9be21-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:46 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:46 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2465f602934c9b49eec4e0598b6266084474df0f2da0f1de92a72390c7a9be21-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:46.069 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59321 DF PROTO=TCP SPT=47654 DPT=9100 SEQ=546634030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD5F0FA0000000001030307) 
Nov 28 09:36:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:47.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:48.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:48.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:36:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:48.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:36:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:48.092 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:36:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:48.093 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:36:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:36:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-af4441ca58e5a3dae70e850402577fe72fc0370c205d9690db9c04c01d30a59b-merged.mount: Deactivated successfully.
Nov 28 09:36:48 np0005538515.localdomain podman[241613]: 2025-11-28 09:36:48.487370776 +0000 UTC m=+0.077906068 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:36:48 np0005538515.localdomain podman[241613]: 2025-11-28 09:36:48.527023506 +0000 UTC m=+0.117558788 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:36:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-af4441ca58e5a3dae70e850402577fe72fc0370c205d9690db9c04c01d30a59b-merged.mount: Deactivated successfully.
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.092 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.093 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.093 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.093 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.094 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.555 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.732 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.734 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13117MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.734 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.734 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.819 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.821 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:36:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:49.849 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:36:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:50.339 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:36:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:50.346 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:36:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:50.369 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:36:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:50.373 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:36:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:50.374 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:36:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59322 DF PROTO=TCP SPT=47654 DPT=9100 SEQ=546634030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD600BB0000000001030307) 
Nov 28 09:36:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:50 np0005538515.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:50 np0005538515.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:50 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:36:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:36:50.816 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:36:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:36:50.817 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:36:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:36:50.817 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:36:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:51.371 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:51.371 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:36:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:36:51.372 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:36:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:52 np0005538515.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:52 np0005538515.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:36:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:52 np0005538515.localdomain podman[239012]: time="2025-11-28T09:36:52Z" level=error msg="Getting root fs size for \"929c9b5315b4fc33e01978423e59cb5b383ecd56e91c5f891b7c011283bec432\": getting diffsize of layer \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\" and its parent \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\": unmounting layer f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958: replacing mount point \"/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged\": device or resource busy"
Nov 28 09:36:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:36:53 np0005538515.localdomain sshd[228897]: Received disconnect from 192.168.122.30 port 38672:11: disconnected by user
Nov 28 09:36:53 np0005538515.localdomain sshd[228897]: Disconnected from user zuul 192.168.122.30 port 38672
Nov 28 09:36:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:53 np0005538515.localdomain sshd[228894]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:36:53 np0005538515.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Nov 28 09:36:53 np0005538515.localdomain systemd[1]: session-55.scope: Consumed 58.594s CPU time.
Nov 28 09:36:53 np0005538515.localdomain systemd-logind[763]: Session 55 logged out. Waiting for processes to exit.
Nov 28 09:36:53 np0005538515.localdomain systemd-logind[763]: Removed session 55.
Nov 28 09:36:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:36:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:36:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:36:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:36:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:36:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:36:56 np0005538515.localdomain podman[241679]: 2025-11-28 09:36:56.995267311 +0000 UTC m=+0.101737357 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:36:57 np0005538515.localdomain podman[241679]: 2025-11-28 09:36:57.004696255 +0000 UTC m=+0.111166291 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 09:36:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59094 DF PROTO=TCP SPT=53552 DPT=9105 SEQ=3078962166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD61C030000000001030307) 
Nov 28 09:36:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0818f18800ef844a6a48c8a7ece9ae523d5aa6f809095a5eb180d408e8d636c4-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0818f18800ef844a6a48c8a7ece9ae523d5aa6f809095a5eb180d408e8d636c4-merged.mount: Deactivated successfully.
Nov 28 09:36:57 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:36:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59095 DF PROTO=TCP SPT=53552 DPT=9105 SEQ=3078962166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD61FFB0000000001030307) 
Nov 28 09:36:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59323 DF PROTO=TCP SPT=47654 DPT=9100 SEQ=546634030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD620FA0000000001030307) 
Nov 28 09:36:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:36:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:36:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:36:59 np0005538515.localdomain sudo[241698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:36:59 np0005538515.localdomain sudo[241698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:36:59 np0005538515.localdomain sudo[241698]: pam_unix(sudo:session): session closed for user root
Nov 28 09:36:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:36:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:36:59 np0005538515.localdomain sudo[241716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:36:59 np0005538515.localdomain sudo[241716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:37:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:01 np0005538515.localdomain sudo[241716]: pam_unix(sudo:session): session closed for user root
Nov 28 09:37:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34849 DF PROTO=TCP SPT=55970 DPT=9882 SEQ=3799603440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD62D3A0000000001030307) 
Nov 28 09:37:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:02 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538515.localdomain sudo[241765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:37:04 np0005538515.localdomain sudo[241765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:37:04 np0005538515.localdomain sudo[241765]: pam_unix(sudo:session): session closed for user root
Nov 28 09:37:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:37:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59097 DF PROTO=TCP SPT=53552 DPT=9105 SEQ=3078962166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD637BA0000000001030307) 
Nov 28 09:37:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538515.localdomain podman[241783]: 2025-11-28 09:37:04.69281235 +0000 UTC m=+0.062064506 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 09:37:04 np0005538515.localdomain podman[241783]: 2025-11-28 09:37:04.703828125 +0000 UTC m=+0.073080331 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Nov 28 09:37:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Nov 28 09:37:04 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:37:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-af4441ca58e5a3dae70e850402577fe72fc0370c205d9690db9c04c01d30a59b-merged.mount: Deactivated successfully.
Nov 28 09:37:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:08 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:08 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:08 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:08 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30372 DF PROTO=TCP SPT=45784 DPT=9102 SEQ=1342579902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6493A0000000001030307) 
Nov 28 09:37:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:11 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:11 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:11 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-16fa74df54e8af55fadaf755a0b31aeec0d57923866d5b313d9489d8d758e3e8-merged.mount: Deactivated successfully.
Nov 28 09:37:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:37:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:37:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59098 DF PROTO=TCP SPT=53552 DPT=9105 SEQ=3078962166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD658FB0000000001030307) 
Nov 28 09:37:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:37:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:37:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:13 np0005538515.localdomain podman[241804]: 2025-11-28 09:37:13.725615201 +0000 UTC m=+0.066946934 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:37:13 np0005538515.localdomain podman[241803]: 2025-11-28 09:37:13.78193207 +0000 UTC m=+0.125965620 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 09:37:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:13 np0005538515.localdomain podman[241803]: 2025-11-28 09:37:13.811154305 +0000 UTC m=+0.155187835 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 09:37:13 np0005538515.localdomain podman[241803]: unhealthy
Nov 28 09:37:13 np0005538515.localdomain podman[241804]: 2025-11-28 09:37:13.863548846 +0000 UTC m=+0.204880649 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:37:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34851 DF PROTO=TCP SPT=55970 DPT=9882 SEQ=3799603440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD65CFB0000000001030307) 
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0ab58418a3b33798bab22812a6bf35faf1a05b29cb02b615b8bae9fef6fe9073-merged.mount: Deactivated successfully.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:37:14 np0005538515.localdomain systemd[1]: tmp-crun.e7bOxj.mount: Deactivated successfully.
Nov 28 09:37:14 np0005538515.localdomain podman[241841]: 2025-11-28 09:37:14.894229111 +0000 UTC m=+0.089253924 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:37:14 np0005538515.localdomain podman[241841]: 2025-11-28 09:37:14.907378516 +0000 UTC m=+0.102403309 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:37:14 np0005538515.localdomain podman[241841]: unhealthy
Nov 28 09:37:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671-merged.mount: Deactivated successfully.
Nov 28 09:37:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:37:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:37:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-6e24aa22dcc3c3aeaf326f993725e399e8f3215a32c5fb5c28a2698bed898907-merged.mount: Deactivated successfully.
Nov 28 09:37:15 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:37:15 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:37:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48729 DF PROTO=TCP SPT=41940 DPT=9100 SEQ=3455075542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6663A0000000001030307) 
Nov 28 09:37:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:17 np0005538515.localdomain podman[241840]: 2025-11-28 09:37:17.973621866 +0000 UTC m=+3.175972895 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:37:18 np0005538515.localdomain podman[241840]: 2025-11-28 09:37:18.028399986 +0000 UTC m=+3.230751005 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:37:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:37:18 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:37:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:37:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:37:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:37:20 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:20 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48730 DF PROTO=TCP SPT=41940 DPT=9100 SEQ=3455075542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD675FA0000000001030307) 
Nov 28 09:37:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:37:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:20 np0005538515.localdomain podman[241886]: 2025-11-28 09:37:20.895486433 +0000 UTC m=+0.099881437 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:37:20 np0005538515.localdomain podman[241886]: 2025-11-28 09:37:20.927925121 +0000 UTC m=+0.132320105 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:37:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:37:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-876187b8bc68a02fc79261d7a49dfade5cc37ca730d23b4f758fcf788c522d06-merged.mount: Deactivated successfully.
Nov 28 09:37:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:22 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:37:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:37:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:37:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:37:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:26 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:27 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:27 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:27 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:27 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:27 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6554 DF PROTO=TCP SPT=51392 DPT=9105 SEQ=1980500986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD691330000000001030307) 
Nov 28 09:37:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:37:28 np0005538515.localdomain podman[241907]: 2025-11-28 09:37:28.492790965 +0000 UTC m=+0.093529010 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:37:28 np0005538515.localdomain podman[241907]: 2025-11-28 09:37:28.506356002 +0000 UTC m=+0.107094047 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:37:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6555 DF PROTO=TCP SPT=51392 DPT=9105 SEQ=1980500986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6953B0000000001030307) 
Nov 28 09:37:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55352 DF PROTO=TCP SPT=57020 DPT=9882 SEQ=3460966633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6965D0000000001030307) 
Nov 28 09:37:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-558adb40dc3f0c457c124ec6699b165daa74a355f52d98e7436d696b86369c63-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-16fa74df54e8af55fadaf755a0b31aeec0d57923866d5b313d9489d8d758e3e8-merged.mount: Deactivated successfully.
Nov 28 09:37:29 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:37:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-16fa74df54e8af55fadaf755a0b31aeec0d57923866d5b313d9489d8d758e3e8-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:37:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10889 DF PROTO=TCP SPT=59496 DPT=9105 SEQ=2277982890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6A0FA0000000001030307) 
Nov 28 09:37:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:37:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:37:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:37:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:37:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6557 DF PROTO=TCP SPT=51392 DPT=9105 SEQ=1980500986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6ACFA0000000001030307) 
Nov 28 09:37:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-6e24aa22dcc3c3aeaf326f993725e399e8f3215a32c5fb5c28a2698bed898907-merged.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:37:35 np0005538515.localdomain podman[241926]: 2025-11-28 09:37:35.457756317 +0000 UTC m=+0.073878224 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, distribution-scope=public)
Nov 28 09:37:35 np0005538515.localdomain podman[241926]: 2025-11-28 09:37:35.468838865 +0000 UTC m=+0.084960762 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:37:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:35 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:37:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Nov 28 09:37:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-69bca6b1ae1a510e610471f91dc39084eac5a14908c47996b36473212637590d-merged.mount: Deactivated successfully.
Nov 28 09:37:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:37:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53635 DF PROTO=TCP SPT=54900 DPT=9102 SEQ=1277939374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6BE7B0000000001030307) 
Nov 28 09:37:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 28 09:37:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-876187b8bc68a02fc79261d7a49dfade5cc37ca730d23b4f758fcf788c522d06-merged.mount: Deactivated successfully.
Nov 28 09:37:40 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:37:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:37:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6558 DF PROTO=TCP SPT=51392 DPT=9105 SEQ=1980500986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6CCFA0000000001030307) 
Nov 28 09:37:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Nov 28 09:37:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47315 DF PROTO=TCP SPT=41096 DPT=9101 SEQ=3060849528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6D2310000000001030307) 
Nov 28 09:37:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:37:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:37:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:44 np0005538515.localdomain podman[241946]: 2025-11-28 09:37:44.749111796 +0000 UTC m=+0.096464053 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:37:44 np0005538515.localdomain podman[241946]: 2025-11-28 09:37:44.778595903 +0000 UTC m=+0.125948140 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.109 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.110 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.110 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:37:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:45.137 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:45 np0005538515.localdomain systemd[1]: tmp-crun.eJdsHo.mount: Deactivated successfully.
Nov 28 09:37:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:37:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:37:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8bb70caa6e9f6f4d170c3a5868421bfc24d38542c14f6a27a1edf3bcfdc45d32-merged.mount: Deactivated successfully.
Nov 28 09:37:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8bb70caa6e9f6f4d170c3a5868421bfc24d38542c14f6a27a1edf3bcfdc45d32-merged.mount: Deactivated successfully.
Nov 28 09:37:46 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:37:46 np0005538515.localdomain podman[241945]: 2025-11-28 09:37:46.472129808 +0000 UTC m=+1.826089902 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:37:46 np0005538515.localdomain podman[241945]: 2025-11-28 09:37:46.501700788 +0000 UTC m=+1.855660882 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 28 09:37:46 np0005538515.localdomain podman[241945]: unhealthy
Nov 28 09:37:46 np0005538515.localdomain podman[241971]: 2025-11-28 09:37:46.51829685 +0000 UTC m=+0.242958478 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:37:46 np0005538515.localdomain podman[241971]: 2025-11-28 09:37:46.52941731 +0000 UTC m=+0.254078908 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:37:46 np0005538515.localdomain podman[241971]: unhealthy
Nov 28 09:37:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53413 DF PROTO=TCP SPT=36928 DPT=9100 SEQ=3824350269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6DB7A0000000001030307) 
Nov 28 09:37:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Nov 28 09:37:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-558adb40dc3f0c457c124ec6699b165daa74a355f52d98e7436d696b86369c63-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-558adb40dc3f0c457c124ec6699b165daa74a355f52d98e7436d696b86369c63-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:48.161 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:48.161 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:37:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:48.161 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:37:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:48.370 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:37:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:48.371 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:37:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:37:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:37:49 np0005538515.localdomain podman[242000]: 2025-11-28 09:37:49.982683751 +0000 UTC m=+0.086558092 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:37:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:50.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:50.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:50.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:50 np0005538515.localdomain podman[242000]: 2025-11-28 09:37:50.075422246 +0000 UTC m=+0.179296607 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:37:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:37:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:37:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:37:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53414 DF PROTO=TCP SPT=36928 DPT=9100 SEQ=3824350269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD6EB3B0000000001030307) 
Nov 28 09:37:50 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:37:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:37:50.817 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:37:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:37:50.818 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:37:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:37:50.818 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:37:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:37:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.095 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.095 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.095 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.095 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.096 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.565 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.720 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.721 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13151MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.721 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.722 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.817 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.817 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.896 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.942 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.942 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.957 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.978 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:37:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:51.994 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:37:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309-merged.mount: Deactivated successfully.
Nov 28 09:37:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:52.495 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:37:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:52.500 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:37:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:52.515 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:37:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:52.516 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:37:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:37:52.517 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:37:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:37:53 np0005538515.localdomain podman[242069]: 2025-11-28 09:37:53.245708082 +0000 UTC m=+0.100828921 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:37:53 np0005538515.localdomain podman[242069]: 2025-11-28 09:37:53.275388875 +0000 UTC m=+0.130509724 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:37:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:37:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:37:53 np0005538515.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:53 np0005538515.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:53 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/console\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/console: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/faillock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/faillock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/motd.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/motd.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/sepermit\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/run/sepermit: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538515.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/dnf.rpm.log\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/dnf.rpm.log: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/hawkey.log\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/hawkey.log: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/dnf.librepo.log\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/dnf.librepo.log: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/dnf.log\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/log/dnf.log: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/history.sqlite\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/history.sqlite: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/history.sqlite-shm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/history.sqlite-shm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/history.sqlite-wal\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/history.sqlite-wal: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/appstream-831abc7e9d6a1a72\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/appstream-831abc7e9d6a1a72: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/appstream-831abc7e9d6a1a72/countme\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/appstream-831abc7e9d6a1a72/countme: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/baseos-044cae74d71fe9ea\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/baseos-044cae74d71fe9ea: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/baseos-044cae74d71fe9ea/countme\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/baseos-044cae74d71fe9ea/countme: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/epel-low-priority-4b20c555de8aed94\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/epel-low-priority-4b20c555de8aed94: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/epel-low-priority-4b20c555de8aed94/countme\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/epel-low-priority-4b20c555de8aed94/countme: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/extras-common-581a10b8a62294e3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/extras-common-581a10b8a62294e3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/extras-common-581a10b8a62294e3/countme\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/dnf/repos/extras-common-581a10b8a62294e3/countme: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm/rpmdb.sqlite\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm/rpmdb.sqlite: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm/rpmdb.sqlite-shm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm/rpmdb.sqlite-shm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm/rpmdb.sqlite-wal\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/lib/rpm/rpmdb.sqlite-wal: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/db\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/db: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/db/sudo\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/db/sudo: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/db/sudo/lectured\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/db/sudo/lectured: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/kolla\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/kolla: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/libvirt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/libvirt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/qemu\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/qemu: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/hugetlbfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/spool/mail/hugetlbfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/cache\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/cache: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/cache/ldconfig\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/cache/ldconfig: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/cache/ldconfig/aux-cache\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/var/cache/ldconfig/aux-cache: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/adjtime.rpmnew\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/adjtime.rpmnew: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/ld.so.cache\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/ld.so.cache: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudo.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudo.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/gshadow-\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/gshadow-: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subgid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subgid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subgid-\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subgid-: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/group-\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/group-: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/gshadow\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/gshadow: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudo-ldap.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudo-ldap.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/libuser.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/libuser.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/shadow-\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/shadow-: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subuid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subuid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/faillock.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/faillock.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pwquality.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pwquality.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/time.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/time.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/namespace.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/namespace.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/access.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/access.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.apps\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.apps: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.perms\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.perms: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.perms.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.perms.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/group.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/group.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/limits.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/limits.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/limits.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/limits.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/namespace.init\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/namespace.init: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/opasswd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/opasswd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pwquality.conf.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pwquality.conf.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/sepermit.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/sepermit.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/chroot.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/chroot.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.handlers\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/console.handlers: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/namespace.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/namespace.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pam_env.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pam_env.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pwhistory.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/security/pwhistory.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/profile.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/profile.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/profile.d/which2.csh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/profile.d/which2.csh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/profile.d/which2.sh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/profile.d/which2.sh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/config-util\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/config-util: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/su\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/su: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/chfn\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/chfn: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/chsh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/chsh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/fingerprint-auth\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/fingerprint-auth: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/login\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/login: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Getting root fs size for \"acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d\": getting diffsize of layer \"06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66\" and its parent \"cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa\": creating overlay mount to /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/YH5WONIZBGF5MKGBVLYJQMRTV6,upperdir=/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff,workdir=/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/work,nodev,metacopy=on\": no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/runuser\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/runuser: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/su-l\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/su-l: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/sudo-i\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/sudo-i: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/other\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/other: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/remote\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/remote: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/runuser-l\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/runuser-l: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/smartcard-auth\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/smartcard-auth: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/sudo\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/sudo: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/system-auth\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/system-auth: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/password-auth\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/password-auth: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/postlogin\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/pam.d/postlogin: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/passwd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/passwd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/nsswitch.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/nsswitch.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subuid-\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/subuid-: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudoers.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudoers.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/group\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/group: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/passwd-\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/passwd-: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudoers\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/sudoers: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf/protected.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf/protected.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf/protected.d/sudo.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf/protected.d/sudo.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf/dnf.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/dnf/dnf.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/shadow\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/etc/shadow: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/openstack\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/openstack: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/NEWS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/NEWS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/README\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/README: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/TODO\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/TODO: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/example.ini\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/example.ini: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/crudini/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/html\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/html: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/html/index.html\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/html/index.html: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/html/style.css\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/html/style.css: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/Changelog\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/Changelog: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/README\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/doc/python-iniparse/README: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libeconf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libeconf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libeconf/LICENSE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libeconf/LICENSE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libdb\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libdb: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libdb/LICENSE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libdb/LICENSE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libdb/lgpl-2.1.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libdb/lgpl-2.1.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/dumb-init\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/dumb-init: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/dumb-init/LICENSE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/dumb-init/LICENSE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/pam\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/pam: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/pam/Copyright\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/pam/Copyright: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/pam/gpl-2.0.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/pam/gpl-2.0.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libpwquality\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libpwquality: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libpwquality/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libpwquality/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/python-iniparse\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/python-iniparse: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/python-iniparse/LICENSE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/python-iniparse/LICENSE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/python-iniparse/LICENSE-PSF\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/python-iniparse/LICENSE-PSF: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/sudo\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/sudo: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/sudo/LICENSE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/sudo/LICENSE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libuser\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libuser: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libuser/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libuser/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/procps-ng\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/procps-ng: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/procps-ng/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/procps-ng/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/procps-ng/COPYING.LIB\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/procps-ng/COPYING.LIB: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.BSD-3-Clause\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.BSD-3-Clause: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.BSD-4-Clause-UC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.BSD-4-Clause-UC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.GPL-2.0-or-later\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.GPL-2.0-or-later: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.GPL-3.0-or-later\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.GPL-3.0-or-later: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.ISC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.ISC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.LGPL-2.1-or-later\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/util-linux/COPYING.LGPL-2.1-or-later: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/cracklib\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/cracklib: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/cracklib/COPYING.LIB\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/cracklib/COPYING.LIB: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/which\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/which: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/which/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/which/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/openssl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/openssl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/openssl/LICENSE.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/openssl/LICENSE.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libfdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libfdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libfdisk/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libfdisk/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libfdisk/COPYING.LGPL-2.1-or-later\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libfdisk/COPYING.LGPL-2.1-or-later: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libutempter\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libutempter: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libutempter/COPYING\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/licenses/libutempter/COPYING: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fincore\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fincore: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/nsenter\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/nsenter: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rename\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rename: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/uuidparse\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/uuidparse: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/wipefs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/wipefs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsmem\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsmem: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/pivot_root\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/pivot_root: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rfkill\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rfkill: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/wall\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/wall: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/namei\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/namei: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chsh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chsh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/hwclock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/hwclock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ipcmk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ipcmk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/readprofile\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/readprofile: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/uuidgen\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/uuidgen: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chmem\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chmem: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lslogins\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lslogins: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/swapoff\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/swapoff: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/eject\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/eject: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/zramctl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/zramctl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chrt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chrt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/colrm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/colrm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ipcs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ipcs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/logger\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/logger: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/renice\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/renice: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/col\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/col: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsck\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsck: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkfs.minix\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkfs.minix: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkswap\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkswap: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fstrim\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fstrim: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ldattach\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ldattach: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mesg\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mesg: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/runuser\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/runuser: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/prlimit\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/prlimit: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/colcrt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/colcrt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/losetup\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/losetup: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsirq\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsirq: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lslocks\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lslocks: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mcookie\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mcookie: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rev\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rev: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setarch\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setarch: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blkid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blkid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/cfdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/cfdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/swaplabel\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/swaplabel: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/swapon\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/swapon: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/taskset\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/taskset: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blkdiscard\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blkdiscard: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blockdev\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blockdev: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkfs.cramfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mkfs.cramfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/addpart\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/addpart: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chcpu\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chcpu: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ctrlaltdel\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ctrlaltdel: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsck.minix\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsck.minix: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsipc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsipc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/partx\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/partx: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsck.cramfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsck.cramfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/isosize\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/isosize: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/last\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/last: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lscpu\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lscpu: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rtcwake\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/rtcwake: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/getopt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/getopt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/hardlink\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/hardlink: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ionice\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ionice: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/resizepart\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/resizepart: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/sfdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/sfdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ul\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ul: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blkzone\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/blkzone: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/unshare\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/unshare: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/column\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/column: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/wdctl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/wdctl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ipcrm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/ipcrm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/irqtop\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/irqtop: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/look\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/look: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsns\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsns: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/scriptreplay\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/scriptreplay: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/write\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/write: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/more\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/more: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setsid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setsid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setpriv\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setpriv: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/su\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/su: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsfreeze\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fsfreeze: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mountpoint\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/mountpoint: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/findmnt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/findmnt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsblk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/lsblk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/script\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/script: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chfn\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/chfn: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/dmesg\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/dmesg: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/findfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/findfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fallocate\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fallocate: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/utmpdump\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/utmpdump: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/whereis\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/whereis: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/delpart\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/delpart: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fdformat\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/fdformat: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/flock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/flock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/scriptlive\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/scriptlive: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/cal\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/cal: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/hexdump\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/hexdump: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setterm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/bash-completion/completions/setterm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/pw_dict.pwi\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/pw_dict.pwi: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib-small.hwm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib-small.hwm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib-small.pwd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib-small.pwd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib-small.pwi\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib-small.pwi: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib.magic\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/cracklib.magic: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/pw_dict.hwm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/pw_dict.hwm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/pw_dict.pwd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/cracklib/pw_dict.pwd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/pam.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/pam.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/tcib\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/share/tcib: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pkill\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pkill: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/sudo\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/sudo: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lslocks\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lslocks: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/umount\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/umount: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/fincore\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/fincore: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/free\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/free: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/linux32\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/linux32: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chrt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chrt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lastb\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lastb: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/sudoreplay\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/sudoreplay: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/watch\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/watch: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/isosize\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/isosize: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mesg\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mesg: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/rename\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/rename: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/slabtop\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/slabtop: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uuidgen\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uuidgen: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/more\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/more: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setarch\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setarch: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ipcs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ipcs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/prlimit\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/prlimit: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pwscore\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pwscore: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/renice\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/renice: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/hexdump\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/hexdump: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/kill\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/kill: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/dumb-init\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/dumb-init: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mountpoint\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mountpoint: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/renew-dummy-cert\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/renew-dummy-cert: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uname26\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uname26: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/unshare\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/unshare: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/colrm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/colrm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/cvtsudoers\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/cvtsudoers: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lscpu\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lscpu: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/whereis\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/whereis: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/findmnt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/findmnt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setsid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setsid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/write\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/write: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pidof\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pidof: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mount\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mount: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/last\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/last: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/look\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/look: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pmap\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pmap: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chmem\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chmem: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/linux64\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/linux64: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/w\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/w: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/fallocate\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/fallocate: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/login\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/login: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/rev\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/rev: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/scriptlive\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/scriptlive: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsipc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsipc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/wdctl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/wdctl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lslogins\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lslogins: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/su\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/su: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lchfn\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lchfn: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ps\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ps: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/cal\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/cal: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ipcrm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ipcrm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pwdx\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pwdx: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/nsenter\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/nsenter: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/col\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/col: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsirq\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsirq: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/namei\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/namei: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setterm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setterm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ipcmk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ipcmk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/top\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/top: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/which\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/which: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsblk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsblk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/flock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/flock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/dmesg\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/dmesg: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pgrep\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pgrep: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chsh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chsh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mcookie\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/mcookie: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uptime\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uptime: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/wall\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/wall: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/x86_64\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/x86_64: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/skill\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/skill: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ionice\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ionice: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/snice\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/snice: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uuidparse\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/uuidparse: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/script\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/script: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/i386\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/i386: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chfn\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/chfn: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/crudini\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/crudini: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/hardlink\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/hardlink: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/taskset\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/taskset: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/colcrt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/colcrt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/choom\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/choom: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/make-dummy-cert\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/make-dummy-cert: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/scriptreplay\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/scriptreplay: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/eject\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/eject: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pidwait\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pidwait: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/openssl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/openssl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pwmake\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/pwmake: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/sudoedit\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/sudoedit: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/getopt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/getopt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/vmstat\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/vmstat: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/column\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/column: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setpriv\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/setpriv: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/tload\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/tload: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/logger\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/logger: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/utmpdump\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/utmpdump: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/irqtop\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/irqtop: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lchsh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lchsh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsmem\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsmem: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsns\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/lsns: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ul\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/bin/ul: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/utempter\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/utempter: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/utempter/utempter\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/utempter/utempter: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/group_file.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/group_file.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/libsudo_util.so.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/libsudo_util.so.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sudo_noexec.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sudo_noexec.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sudoers.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sudoers.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/audit_json.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/audit_json.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/libsudo_util.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/libsudo_util.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/libsudo_util.so.0.0.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/libsudo_util.so.0.0.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sample_approval.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sample_approval.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sesh\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/sesh: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/system_group.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/libexec/sudo/system_group.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system/pam_namespace.service\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system/pam_namespace.service: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system/fstrim.service\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system/fstrim.service: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system/fstrim.timer\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/systemd/system/fstrim.timer: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/tmpfiles.d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/tmpfiles.d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/tmpfiles.d/pam.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/tmpfiles.d/pam.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/tmpfiles.d/sudo.conf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/tmpfiles.d/sudo.conf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/14\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/14: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/14/4c9ca8a045d0349fbe4927391a86f2d7dcf761\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/14/4c9ca8a045d0349fbe4927391a86f2d7dcf761: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/14/5c37f5a9ccc89098131b09148296bd7ac13ab0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/14/5c37f5a9ccc89098131b09148296bd7ac13ab0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15/2a2ebe0c623d60ef6228e73ba3098b9cce0a7a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15/2a2ebe0c623d60ef6228e73ba3098b9cce0a7a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15/4f272ea09fcb4b40238965fcac16d167461898\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15/4f272ea09fcb4b40238965fcac16d167461898: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15/898840061e6c30a02193e64f2166c48bc99155\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/15/898840061e6c30a02193e64f2166c48bc99155: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3b/4c3d6759be9e23c97a28ea07abdd571e88cd34\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3b/4c3d6759be9e23c97a28ea07abdd571e88cd34: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3b/58ff94084c438c1d0fcdab593997acb4d6aa2b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3b/58ff94084c438c1d0fcdab593997acb4d6aa2b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/45\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/45: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/45/00dde1d2c8968ed19a7a39278e7a11292a2945\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/45/00dde1d2c8968ed19a7a39278e7a11292a2945: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/45/c5bf221d3ae19055276ea851dd02ba757e3d5a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/45/c5bf221d3ae19055276ea851dd02ba757e3d5a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/46\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/46: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/46/30c961c15313564d770cc23a1607e117715946\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/46/30c961c15313564d770cc23a1607e117715946: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e1/83b7ea31562bfa3436ea76c8d502a66eb92a39\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e1/83b7ea31562bfa3436ea76c8d502a66eb92a39: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e1/c5e9b6d42ceda4c3b61be395bc22ad7ee7beee\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e1/c5e9b6d42ceda4c3b61be395bc22ad7ee7beee: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/59\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/59: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/59/252e8a7a3c8f925d45b9605be1805793384e92\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/59/252e8a7a3c8f925d45b9605be1805793384e92: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f3/7e18717bc313da1a0a1cfd7b3882d3d0c447a7\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f3/7e18717bc313da1a0a1cfd7b3882d3d0c447a7: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ec\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ec: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ec/f2497fc632dd43ce6a0bd29d3fdd228a0519fc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ec/f2497fc632dd43ce6a0bd29d3fdd228a0519fc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6b/f0e500addd5efc006db98f04a486b2f43824d1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6b/f0e500addd5efc006db98f04a486b2f43824d1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6f/54752c462f7538c83ab991a371d61801a82afa\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6f/54752c462f7538c83ab991a371d61801a82afa: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6f/ffa242074a8347039b695cbbe6d11acfb35f52\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6f/ffa242074a8347039b695cbbe6d11acfb35f52: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7f/a537b7784d9a63eec4db9fce96cc92f133cadc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7f/a537b7784d9a63eec4db9fce96cc92f133cadc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f7\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f7: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f7/b4450f86fc03d5293c3430d1909aa63bbe483b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f7/b4450f86fc03d5293c3430d1909aa63bbe483b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f7/fc1e9e32f38cfa9430496500ca3c2e58b0b7f9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f7/fc1e9e32f38cfa9430496500ca3c2e58b0b7f9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0b/036b4fce47462d760fbe5ddd95593d8eb6bf1b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0b/036b4fce47462d760fbe5ddd95593d8eb6bf1b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5b/2d7c2788cdc64e0f6b1d59a7237506eca28172\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5b/2d7c2788cdc64e0f6b1d59a7237506eca28172: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5b/e190991cd262bab77f1ace91d79f0381924ff2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5b/e190991cd262bab77f1ace91d79f0381924ff2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/92\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/92: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/92/2530cc8e438ec6b5bbd56dbe8bed1f906b2325\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/92/2530cc8e438ec6b5bbd56dbe8bed1f906b2325: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/92/39d33f80859efa086ccc5c1e209d55bdd1bf55\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/92/39d33f80859efa086ccc5c1e209d55bdd1bf55: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f5\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f5: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f5/1f9758e71666b7f3a8a251af89cedafe3eb845\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f5/1f9758e71666b7f3a8a251af89cedafe3eb845: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6a/c47a004b00bd1891c73ecae48ddc3c459bf378\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6a/c47a004b00bd1891c73ecae48ddc3c459bf378: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6a/d19969af8bba5d211c413be29a6918f345109b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6a/d19969af8bba5d211c413be29a6918f345109b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a8/2e2b0f96ff1bc114067196146fa473104d2b5e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a8/2e2b0f96ff1bc114067196146fa473104d2b5e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b6\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b6: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b6/dccfab4944fb284428ce74bb6510ea77861e9c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b6/dccfab4944fb284428ce74bb6510ea77861e9c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b6/f4c929bb4296b957784cc7dab0f5ba63b7af1c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b6/f4c929bb4296b957784cc7dab0f5ba63b7af1c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9/7632fc725bf917f235d8fd8d9cec7ed849dd4c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9/7632fc725bf917f235d8fd8d9cec7ed849dd4c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9/93b95067c8c4e9fc0c3e46a107308029abd5a6\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9/93b95067c8c4e9fc0c3e46a107308029abd5a6: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9/caebb6a887eb763ded31ead03749712860ffa9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b9/caebb6a887eb763ded31ead03749712860ffa9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/19\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/19: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/19/e0aa1f3d6b4a3a3503574398261f32004d8386\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/19/e0aa1f3d6b4a3a3503574398261f32004d8386: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7/f5bc803701ac92cf44ef9c0fb8b9146f0e4795\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7/f5bc803701ac92cf44ef9c0fb8b9146f0e4795: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7/fadb8192eb83d1a51f02869000e47be3640a2f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7/fadb8192eb83d1a51f02869000e47be3640a2f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7/6c3ab9ad2eef282817ed5e8138ada94610f225\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b7/6c3ab9ad2eef282817ed5e8138ada94610f225: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0c/ab14fabb5747947340f306da22482235815799\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0c/ab14fabb5747947340f306da22482235815799: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7a/b3795a4e53531ac7011ddfb16959d8d8b549a4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7a/b3795a4e53531ac7011ddfb16959d8d8b549a4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c2/4b0d30a6fb7869da102c529afd6d316e0ea338\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c2/4b0d30a6fb7869da102c529afd6d316e0ea338: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/49\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/49: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/49/9cee2296786420301a271fc12acd2d64c40dc0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/49/9cee2296786420301a271fc12acd2d64c40dc0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a/4554c73944306a3dd62ed113634be016dbc513\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a/4554c73944306a3dd62ed113634be016dbc513: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a/4572434ba92404c12db316b384dfc38d89f59e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a/4572434ba92404c12db316b384dfc38d89f59e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a/8e8136bd20adda1f521c21ef405baddfcf2023\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5a/8e8136bd20adda1f521c21ef405baddfcf2023: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6d/10d6a6e84ae2279363c55ad3a63a4063d14e5c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6d/10d6a6e84ae2279363c55ad3a63a4063d14e5c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6d/70c28d158a258757c0ef0d27c79b21d6c2210a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6d/70c28d158a258757c0ef0d27c79b21d6c2210a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b5\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b5: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b5/a0e5b157378a621d7f37368e5f893372711b10\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b5/a0e5b157378a621d7f37368e5f893372711b10: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7c/95194adad54ca6792c90e26093918a91ba2424\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7c/95194adad54ca6792c90e26093918a91ba2424: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e9/f3b267dd13d8de7109589b9c02651a18a9535c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e9/f3b267dd13d8de7109589b9c02651a18a9535c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/10\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/10: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/10/ec08deb5b4967055a44fca5422b286ce8020e1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/10/ec08deb5b4967055a44fca5422b286ce8020e1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b8/9c488650845c00ffc32c7c079165acc04d623c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b8/9c488650845c00ffc32c7c079165acc04d623c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/58\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/58: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/58/f15363c0b49dd71b1ab867f4eba1a4aead4b68\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/58/f15363c0b49dd71b1ab867f4eba1a4aead4b68: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/58/f15363c0b49dd71b1ab867f4eba1a4aead4b68.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/58/f15363c0b49dd71b1ab867f4eba1a4aead4b68.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/99\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/99: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/99/3a4f481887e9e41d0bd488d66103e4b6a20563\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/99/3a4f481887e9e41d0bd488d66103e4b6a20563: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e3/b989f8fbfed34ea2f3561debf5bcc14d670e63\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e3/b989f8fbfed34ea2f3561debf5bcc14d670e63: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/48\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/48: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/48/702f2a14c21df27c4716697129fc7cb163161b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/48/702f2a14c21df27c4716697129fc7cb163161b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/48/a908673b2632815a160ffa3c00bb49d8e11867\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/48/a908673b2632815a160ffa3c00bb49d8e11867: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9b/4fe579fe4870793a52a315948b4ef2ba2f15f2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9b/4fe579fe4870793a52a315948b4ef2ba2f15f2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9b/fe211f59b3a00a0be260cbccc7e556b5ebe659\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9b/fe211f59b3a00a0be260cbccc7e556b5ebe659: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1d/554997639785e70765aca467f08d7cd76c48da\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1d/554997639785e70765aca467f08d7cd76c48da: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2f/8c705d070c393e967900e865fab6c390af9595\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2f/8c705d070c393e967900e865fab6c390af9595: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/07\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/07: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/07/e0ad65914c80af84f7ba0a3f04094e02c27eac\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/07/e0ad65914c80af84f7ba0a3f04094e02c27eac: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/be\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/be: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/be/46282a4ab9f82e481b73b2028c2e0d83f61c01\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/be/46282a4ab9f82e481b73b2028c2e0d83f61c01: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1e/871a3d460bdbf5b9e652b66d954d6b41586adb\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1e/871a3d460bdbf5b9e652b66d954d6b41586adb: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1e/12610a0a6e2cee03e5c5ef324a795706ca4af8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1e/12610a0a6e2cee03e5c5ef324a795706ca4af8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/24\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/24: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/24/62560acbc4db0ed93e1556faca90d4ee288dfa\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/24/62560acbc4db0ed93e1556faca90d4ee288dfa: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/24/b36014c5c6e0de63afdcbcc33de893839534c1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/24/b36014c5c6e0de63afdcbcc33de893839534c1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/37\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/37: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/37/bd1e0df315849b164694a68817d1e4a6a23393\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/37/bd1e0df315849b164694a68817d1e4a6a23393: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f9/1f8ca0db0a78ec1cbb5fbea29c2c840a4d604f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f9/1f8ca0db0a78ec1cbb5fbea29c2c840a4d604f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/77\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/77: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/77/eb94029352d9bad8d41379eec908a9f8ffe7a3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/77/eb94029352d9bad8d41379eec908a9f8ffe7a3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/77/08e397769139126ec910a9d05fc6b1ca26da0a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/77/08e397769139126ec910a9d05fc6b1ca26da0a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/88\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/88: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/88/62148e74a680ce25d4223a3875ccc5ec4ce17e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/88/62148e74a680ce25d4223a3875ccc5ec4ce17e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bd/15706c05deff94d631801f9704d7ffc5e0e076\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bd/15706c05deff94d631801f9704d7ffc5e0e076: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1b/e9ceb22be877d4e469efa50445f07c8c1f83f8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1b/e9ceb22be877d4e469efa50445f07c8c1f83f8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/98\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/98: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/98/89e3ca48c9b5f9ad98d1a2569d23b36ccb6ab8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/98/89e3ca48c9b5f9ad98d1a2569d23b36ccb6ab8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9f/c1490e62b94b1b46bffc368f1f51848743093e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9f/c1490e62b94b1b46bffc368f1f51848743093e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9f/f45c232391ca02ee245fd9e7eb80696dd0c122\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9f/f45c232391ca02ee245fd9e7eb80696dd0c122: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/38\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/38: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/38/8775e816e130a4641577668abf61635a1d3735\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/38/8775e816e130a4641577668abf61635a1d3735: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/86\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/86: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/86/8158bb58f3941c3932a76c19dea8401d0d881c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/86/8158bb58f3941c3932a76c19dea8401d0d881c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/86/ec315c0b77d7f9f34cfac0e53a7e8fecad2d57\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/86/ec315c0b77d7f9f34cfac0e53a7e8fecad2d57: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b3/95fd9884705049862fa42e4591fd68abd9b40a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b3/95fd9884705049862fa42e4591fd68abd9b40a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b3/d4a4a3e9b1870210d215da2bd2b7830b778381\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b3/d4a4a3e9b1870210d215da2bd2b7830b778381: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f0/bffb8d4078741cd77e8d78ffb7d3395c5bc845\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f0/bffb8d4078741cd77e8d78ffb7d3395c5bc845: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3a/65fe582c01b8683b5d078878811aaa8464138b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3a/65fe582c01b8683b5d078878811aaa8464138b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3c/b67309a29398e238aec66d4de9f0f23235cd91\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3c/b67309a29398e238aec66d4de9f0f23235cd91: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/73bb368712427a628e1ca99543f4add99e3cf0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/73bb368712427a628e1ca99543f4add99e3cf0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/d797bbdc5dbcf1ca7ee8d40c176ff1f3ffe0c8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/d797bbdc5dbcf1ca7ee8d40c176ff1f3ffe0c8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/f41df6594d8b065ccb75651070a67013eeba46\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/f41df6594d8b065ccb75651070a67013eeba46: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/0950e9365e0b8aa3bd7209ac6ea2c1696d124d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/0950e9365e0b8aa3bd7209ac6ea2c1696d124d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/3796d2364876e0b3af34a9ca234eb089565b30\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/3796d2364876e0b3af34a9ca234eb089565b30: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/4330739c6cfd738d9c374c510dc515f0b78701\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6c/4330739c6cfd738d9c374c510dc515f0b78701: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/73\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/73: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/73/f00ef7111ba2cc8a69497b41beba35ddf06559\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/73/f00ef7111ba2cc8a69497b41beba35ddf06559: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/73/f0ad13c2ab751bce3cdd92706f30e9c5ebafad\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/73/f0ad13c2ab751bce3cdd92706f30e9c5ebafad: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ad\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ad: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ad/673b42528afae3dc68f5d5219f043a79abca02\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ad/673b42528afae3dc68f5d5219f043a79abca02: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/af\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/af: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/af/894db221956e39b5978a4ce466652fd0972337\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/af/894db221956e39b5978a4ce466652fd0972337: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d9/519f4dc1369aa97ab76157a95baaf1ac4c530b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d9/519f4dc1369aa97ab76157a95baaf1ac4c530b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fc/7869b92773180b32b3204de10557c09b629e05\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fc/7869b92773180b32b3204de10557c09b629e05: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fc/67454d54493ef8831fb8b35eb0851879e1caae\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fc/67454d54493ef8831fb8b35eb0851879e1caae: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/26\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/26: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/26/b00e9ec1bf99e19e512dd3454e7cfb17150d11\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/26/b00e9ec1bf99e19e512dd3454e7cfb17150d11: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/64\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/64: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/64/b57799da4528e656def0ff3dfc50b219891291\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/64/b57799da4528e656def0ff3dfc50b219891291: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a2/b692343d2750000f671c4b7358b3dd3d3314e8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a2/b692343d2750000f671c4b7358b3dd3d3314e8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a2/23f23002f64e7fbdc460b927be82bf93da2995\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a2/23f23002f64e7fbdc460b927be82bf93da2995: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ba\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ba: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ba/e16be2abdc0a78548e83dfa5389859eba58b10\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ba/e16be2abdc0a78548e83dfa5389859eba58b10: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bf/e95f8fd60bcb4a480d848ba2720e6acdb35903\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bf/e95f8fd60bcb4a480d848ba2720e6acdb35903: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1a/3e845de17389efa0d2e4a4c24254ba724bd22a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1a/3e845de17389efa0d2e4a4c24254ba724bd22a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d4/d33e6813da9853d3d83ea82d99825da0512185\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d4/d33e6813da9853d3d83ea82d99825da0512185: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/de\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/de: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/de/e3b742fb69d55b23d5efe7a2a7d92d8a8c513c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/de/e3b742fb69d55b23d5efe7a2a7d92d8a8c513c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/61\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/61: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/61/661bfa2cab22b87d022bd131c2219e47cb9de9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/61/661bfa2cab22b87d022bd131c2219e47cb9de9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/97\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/97: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/97/342aae192903738d26584f7696fdafdd4c0fe8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/97/342aae192903738d26584f7696fdafdd4c0fe8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9a/24488ee7cfbcaba9e9d761deb8fab654f9ed5f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9a/24488ee7cfbcaba9e9d761deb8fab654f9ed5f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9a/5ed14cb9cec518130b09d0dbd0babb92715773\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9a/5ed14cb9cec518130b09d0dbd0babb92715773: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c/1dc2bf18b1193c89a2a0cad01da6821aa50675\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c/1dc2bf18b1193c89a2a0cad01da6821aa50675: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c/58fe282871b83f3f49a0a7ec4f51f0f2b58318\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c/58fe282871b83f3f49a0a7ec4f51f0f2b58318: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c/9093df2183bf3294ddc51302b17aa0a6e342a4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9c/9093df2183bf3294ddc51302b17aa0a6e342a4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0f/ff831b766a546cb5f767c4af9b4dc136502867\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0f/ff831b766a546cb5f767c4af9b4dc136502867: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/11\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/11: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/11/81a7a886960671578c62f6b4a3507a289416ee\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/11/81a7a886960671578c62f6b4a3507a289416ee: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/53\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/53: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/53/c43ba27bb0d22fc5833c1808446f07e5d9e086\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/53/c43ba27bb0d22fc5833c1808446f07e5d9e086: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/84\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/84: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/84/4d4d3bf1964d3746e640adf8a0e37ce014c4b5\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/84/4d4d3bf1964d3746e640adf8a0e37ce014c4b5: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8b/42721f76cd754eb8f32628f227983835dcb222\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8b/42721f76cd754eb8f32628f227983835dcb222: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7e/6995ce97fef81f5415ca55eb0b712ae0ee7523\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7e/6995ce97fef81f5415ca55eb0b712ae0ee7523: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7e/850425a437dbec491bbc6d689fa3979534bf85\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/7e/850425a437dbec491bbc6d689fa3979534bf85: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fb\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fb: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fb/da7a1ca0d4394e5bf2fbdbfdb3553515d5e8d8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fb/da7a1ca0d4394e5bf2fbdbfdb3553515d5e8d8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6e/1c3cab5d94f3cd5d4166b6404861200e992b5e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6e/1c3cab5d94f3cd5d4166b6404861200e992b5e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6e/d8ff701ed7a87a370bb4d38936cb4993d27b33\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/6e/d8ff701ed7a87a370bb4d38936cb4993d27b33: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/81\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/81: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/81/040de175ba8cd678646ebac44fa7c5d9705edb\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/81/040de175ba8cd678646ebac44fa7c5d9705edb: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/96\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/96: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/96/0023ceeab60566c12fff0c7b88e5bc92a05156\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/96/0023ceeab60566c12fff0c7b88e5bc92a05156: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c4/10bc1a4b973c150e7971413ec03708b41b9f2f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c4/10bc1a4b973c150e7971413ec03708b41b9f2f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ca\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ca: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ca/e982e5ef85af3a29629c68db468dbba507489b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ca/e982e5ef85af3a29629c68db468dbba507489b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3e/d36716e9b4198a54adb547acd1593b840afde4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/3e/d36716e9b4198a54adb547acd1593b840afde4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/66\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/66: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/66/aa0a18263c4a29d3b702e956546a16250a9258\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/66/aa0a18263c4a29d3b702e956546a16250a9258: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/90\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/90: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/90/08eb9b2ca67a83ad17fa9aaecebf19cddef3ad\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/90/08eb9b2ca67a83ad17fa9aaecebf19cddef3ad: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c0/671f3a73b49ece2f4706012981382cf00c6cc0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c0/671f3a73b49ece2f4706012981382cf00c6cc0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c0/a662334a9e43ca3cfde299f4e78a7ef3ce0bcc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c0/a662334a9e43ca3cfde299f4e78a7ef3ce0bcc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4b/bc5d4b63f30341e6035ae54a20ab7a18e8e0d8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4b/bc5d4b63f30341e6035ae54a20ab7a18e8e0d8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4b/3fb60aeb8470105990c515a34fc8abe0291182\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4b/3fb60aeb8470105990c515a34fc8abe0291182: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d2/6dedc3e6b0bcd5852e10617fbe64cac139e244\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d2/6dedc3e6b0bcd5852e10617fbe64cac139e244: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d2/c420317f7745d79d7c16d80573f057e8c982a1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d2/c420317f7745d79d7c16d80573f057e8c982a1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/12\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/12: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/12/ad33ab2736109dd8f6f14d911cdd8b0db46807\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/12/ad33ab2736109dd8f6f14d911cdd8b0db46807: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4c/0acfcee4bd45d6e9feb1d69242ca919dd3ff55\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4c/0acfcee4bd45d6e9feb1d69242ca919dd3ff55: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8c/37ae4650778b6e29873532f105f97cd9fbc21a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8c/37ae4650778b6e29873532f105f97cd9fbc21a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8c/9ba98bdda1d62ac9eeaaad05b73bda17f6ff7f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8c/9ba98bdda1d62ac9eeaaad05b73bda17f6ff7f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a3/cf7e95bd7d3184d470665fe46802582d0633c3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a3/cf7e95bd7d3184d470665fe46802582d0633c3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0e0d7841fe061d3bd6e365019fbc2c34ef5005\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0e0d7841fe061d3bd6e365019fbc2c34ef5005: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0fa7f53cab2b23f22eff1131d56bbeb327f67b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0fa7f53cab2b23f22eff1131d56bbeb327f67b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0fa7f53cab2b23f22eff1131d56bbeb327f67b.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0fa7f53cab2b23f22eff1131d56bbeb327f67b.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0fa7f53cab2b23f22eff1131d56bbeb327f67b.2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c1/0fa7f53cab2b23f22eff1131d56bbeb327f67b.2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c6\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c6: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c6/c28655085e12b3c2543458283353c715016cbc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c6/c28655085e12b3c2543458283353c715016cbc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32/95bc757ef096512cda161ed4e8678725e511cf\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32/95bc757ef096512cda161ed4e8678725e511cf: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32/dbce3b779f897ea39113f1350e7d0c6c24b4f9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32/dbce3b779f897ea39113f1350e7d0c6c24b4f9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32/fbcb2a53577122e08e6d43c0ed9e10d54859c1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/32/fbcb2a53577122e08e6d43c0ed9e10d54859c1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5f/8a9d207fc2fad2166c8fe36442ae994efeac96\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5f/8a9d207fc2fad2166c8fe36442ae994efeac96: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5f/f38b5d6ddb25f815b84bf7f1d8343a68f791d9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5f/f38b5d6ddb25f815b84bf7f1d8343a68f791d9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/68\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/68: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/68/a248b415244165868407bc67e9347098fc3ac4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/68/a248b415244165868407bc67e9347098fc3ac4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/57\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/57: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/57/744f0680a7727b40bfccc37785b46bdb96c396\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/57/744f0680a7727b40bfccc37785b46bdb96c396: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/57/a4bf13991f53a7ced9665f8bfeb0ce914cf4cd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/57/a4bf13991f53a7ced9665f8bfeb0ce914cf4cd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/65\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/65: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/65/187cf35e5a3ef5fb454dd10a2e893d1d499c0a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/65/187cf35e5a3ef5fb454dd10a2e893d1d499c0a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5c/4706309dfa01c0a55b0b039f2f5d9403696994\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5c/4706309dfa01c0a55b0b039f2f5d9403696994: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/80\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/80: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/80/edb5b351a77aa371fd7f7e497e0126fd9b8803\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/80/edb5b351a77aa371fd7f7e497e0126fd9b8803: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/80/f9e01b60cf69e6aa348faea41d6532d699a6b9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/80/f9e01b60cf69e6aa348faea41d6532d699a6b9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8f/84882d82f114554389afaec2a9ee86358c1e1a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/8f/84882d82f114554389afaec2a9ee86358c1e1a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d7\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d7: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d7/8b04f97f68129dc25e695a26304c781047120b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d7/8b04f97f68129dc25e695a26304c781047120b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/09\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/09: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/09/6301f1862f7555350f75cf7f2bc12da085e249\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/09/6301f1862f7555350f75cf7f2bc12da085e249: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/09/b1b2e7e88a555d6e90321894f695da219ae39b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/09/b1b2e7e88a555d6e90321894f695da219ae39b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/23\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/23: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/23/aaeb6d2bccffd3cac7053fb2bae9993b673917\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/23/aaeb6d2bccffd3cac7053fb2bae9993b673917: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e7\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e7: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e7/e3c451531a69b8f93b0b5f20f331be08f6cad9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e7/e3c451531a69b8f93b0b5f20f331be08f6cad9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/47\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/47: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/47/32472a0dbd0dd3c6943f6937aee28a4709ec19\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/47/32472a0dbd0dd3c6943f6937aee28a4709ec19: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/47/feeec6b3e5dcd790f9f4342823d009f215ad9b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/47/feeec6b3e5dcd790f9f4342823d009f215ad9b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d8/e2dc267b9f981b04ee4f32ffcf153b80372a97\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d8/e2dc267b9f981b04ee4f32ffcf153b80372a97: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fe\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fe: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fe/8101e106f28f4d79f1f361a6c6990658949ad6\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fe/8101e106f28f4d79f1f361a6c6990658949ad6: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fe/ff62c721e16e69c94d2e5f29694bb8f57b4a05\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/fe/ff62c721e16e69c94d2e5f29694bb8f57b4a05: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/69\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/69: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/69/e91fc180ba574f8c8da8dc0841c02685f9e158\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/69/e91fc180ba574f8c8da8dc0841c02685f9e158: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b1/977f3727b618ea409dea372288d59eba56e258\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b1/977f3727b618ea409dea372288d59eba56e258: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/56\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/56: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/56/2dcf2b3817b09bd6cb6911f37ade7a2ec90132\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/56/2dcf2b3817b09bd6cb6911f37ade7a2ec90132: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/95\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/95: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/95/6559bbeba586774f90a69a1aca070915985c82\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/95/6559bbeba586774f90a69a1aca070915985c82: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c9/1502b7e9364d18f330c18b67f19f11bb587251\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c9/1502b7e9364d18f330c18b67f19f11bb587251: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c9/c07f8c51195798688d2932fd5113d33b1258e8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/c9/c07f8c51195798688d2932fd5113d33b1258e8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d0/f907b7f5942ac20599454a4c98017bd14f274f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d0/f907b7f5942ac20599454a4c98017bd14f274f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d0/d0ead630b525942d6a5f341fe7eb192c40d038\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/d0/d0ead630b525942d6a5f341fe7eb192c40d038: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/35\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/35: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/35/addfd140638b8f14d0ff1cfb6a50818526db4a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/35/addfd140638b8f14d0ff1cfb6a50818526db4a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4e/aec58e151fb249ecad2e4b6ea0ad1279af660c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/4e/aec58e151fb249ecad2e4b6ea0ad1279af660c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bc/a2824b9a542c8c1a2d24166dacd6c41def099f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/bc/a2824b9a542c8c1a2d24166dacd6c41def099f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1c/2f9da1a336a9dbd11c70155b30eeeb9cf1b724\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1c/2f9da1a336a9dbd11c70155b30eeeb9cf1b724: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1f/75d5ec22384f7646e9fdce8dd744e48febf8fc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/1f/75d5ec22384f7646e9fdce8dd744e48febf8fc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a1/cea1c489c70202432c596418d52da652a61f4e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a1/cea1c489c70202432c596418d52da652a61f4e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a4/e91a47cc70b817f69cf680cdd9d26ade32efab\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a4/e91a47cc70b817f69cf680cdd9d26ade32efab: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b4/d1180eaba463aaf6ec16f4b191543da8d10ca2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/b4/d1180eaba463aaf6ec16f4b191543da8d10ca2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e8/8aef7816752c4fdced4eccae3c831d0e9378b3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e8/8aef7816752c4fdced4eccae3c831d0e9378b3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e8/a8599321574c8b2218cf2667e63a079ce64931\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e8/a8599321574c8b2218cf2667e63a079ce64931: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/02\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/02: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/02/0461369d40182bd8b73f5ff0ddf320458847be\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/02/0461369d40182bd8b73f5ff0ddf320458847be: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/02/c63bf4f4130719bd9eac8aa92124b1b9121e8f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/02/c63bf4f4130719bd9eac8aa92124b1b9121e8f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36/642c280dda6fc23b017bcf4d96b02d4aa5fa92\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36/642c280dda6fc23b017bcf4d96b02d4aa5fa92: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36/c49b0042bbea3205d9fb813fbc0c6fad94018f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36/c49b0042bbea3205d9fb813fbc0c6fad94018f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36/38367336c106aa5b6af44fa828d16d78360169\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/36/38367336c106aa5b6af44fa828d16d78360169: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a7\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a7: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a7/d2df67bcd9444ce952d07516131850e28835e6\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a7/d2df67bcd9444ce952d07516131850e28835e6: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/42\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/42: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/42/ec0566d7e0217b38d69a7cae503494b85b5808\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/42/ec0566d7e0217b38d69a7cae503494b85b5808: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0a/3c8c8992af0c26b2902bc0d70d1915c91bdcb3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0a/3c8c8992af0c26b2902bc0d70d1915c91bdcb3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0a/6c0f71bb77bd4f4505b7e636d26fdf4f536043\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/0a/6c0f71bb77bd4f4505b7e636d26fdf4f536043: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/82\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/82: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/82/7c6e053590fe437d616e6ba57b2a340302f016\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/82/7c6e053590fe437d616e6ba57b2a340302f016: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f2/2bc7fbcd4127bb47a721083d01adf2b21c4a72\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/f2/2bc7fbcd4127bb47a721083d01adf2b21c4a72: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ff\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ff: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ff/b19d10eed0b0c2a2fda29318808478bb6757f4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/ff/b19d10eed0b0c2a2fda29318808478bb6757f4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/22\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/22: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/22/040dced525bac7bde5dd2da9e9161d69b1fd76\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/22/040dced525bac7bde5dd2da9e9161d69b1fd76: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/40\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/40: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/40/27903672cb134dc5d2a1279d12ca116dfc9c83\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/40/27903672cb134dc5d2a1279d12ca116dfc9c83: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9e\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9e: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9e/72ca1dc45d7726c81d250781036c4864f02d51\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/9e/72ca1dc45d7726c81d250781036c4864f02d51: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/dc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/dc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/dc/fec00d6e13bce7bfb28a1833b98791754cff0a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/dc/fec00d6e13bce7bfb28a1833b98791754cff0a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/51\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/51: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/51/cb826df3ffe1760a506cbbeb5c90c941c5bba8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/51/cb826df3ffe1760a506cbbeb5c90c941c5bba8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5d/6147fb4d3529e231d2fc7bec8dca9f6a95246a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/5d/6147fb4d3529e231d2fc7bec8dca9f6a95246a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/cc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/cc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/cc/6c613cb2a391c3402bbc2b4ee7b9e386b3745f\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/cc/6c613cb2a391c3402bbc2b4ee7b9e386b3745f: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/30\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/30: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/30/27da82f167b2e96c8c188a6f775b57dd9ae9ee\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/30/27da82f167b2e96c8c188a6f775b57dd9ae9ee: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/21\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/21: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/21/1690f1a4a95e671b438eba9e6dba8aaca55536\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/21/1690f1a4a95e671b438eba9e6dba8aaca55536: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/21/3fdb2cbb13aab0c1e236e869657a89575733b1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/21/3fdb2cbb13aab0c1e236e869657a89575733b1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/27\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/27: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/27/307a54d7ef18d9286d50dac10aa6fa64653d4b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/27/307a54d7ef18d9286d50dac10aa6fa64653d4b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/27/4ae8fd908163b4a74be9b8aed1e4c25860b5eb\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/27/4ae8fd908163b4a74be9b8aed1e4c25860b5eb: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/70\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/70: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/70/33eec8708bc486817e6ae72f89a0e9e2ec7b9c\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/70/33eec8708bc486817e6ae72f89a0e9e2ec7b9c: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2a\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2a: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2a/317a879ffe04800b781007d71f3037737f275b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2a/317a879ffe04800b781007d71f3037737f275b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2a/5ec56f7af55bac66dbb24832633ae3e8b1443b\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/2a/5ec56f7af55bac66dbb24832633ae3e8b1443b: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/91\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/91: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/91/69e16e2da3ec26201bb11ef965947c2f0b7712\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/91/69e16e2da3ec26201bb11ef965947c2f0b7712: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/91/849d07bc5e74f22b550856dcd3adb1147b3349\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/91/849d07bc5e74f22b550856dcd3adb1147b3349: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/01\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/01: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/01/14d70635a6ba8e88d974d07815c518ecb28da9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/01/14d70635a6ba8e88d974d07815c518ecb28da9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a5\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a5: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a5/b5f7e802d47f332aaaaf50636f42aac9ed0e74\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/a5/b5f7e802d47f332aaaaf50636f42aac9ed0e74: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/db\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/db: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/db/1db4ad8871c87337ab8ca52a9a7aab6c27c9ec\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/db/1db4ad8871c87337ab8ca52a9a7aab6c27c9ec: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e2/6ff8bc813fa3da4ad75ced252e1b8d43d3a581\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e2/6ff8bc813fa3da4ad75ced252e1b8d43d3a581: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/50\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/50: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/50/aa2d047e7740e42486ea1f6f08d812b5ab59c5\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/50/aa2d047e7740e42486ea1f6f08d812b5ab59c5: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/50/ee4e16ae62e0615cfdab55b9a5ba0ecc02bea9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/50/ee4e16ae62e0615cfdab55b9a5ba0ecc02bea9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/df\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/df: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/df/f2189d4c2dbd71b897709832a57ebbd763f7a4\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/df/f2189d4c2dbd71b897709832a57ebbd763f7a4: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/13\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/13: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/13/de9022c0d7eb81bd7e57ddf877f4d773d93155\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/13/de9022c0d7eb81bd7e57ddf877f4d773d93155: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/13/932b9afac88ef1a87e38f5b0988257aea390f1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/13/932b9afac88ef1a87e38f5b0988257aea390f1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/29\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/29: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/29/ffdb42c588b438cceb6647f8000016711e07ec\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/29/ffdb42c588b438cceb6647f8000016711e07ec: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e6\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e6: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e6/fc03677d8f4ef7a68cdfe2273b801c1df19c6d\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/.build-id/e6/fc03677d8f4ef7a68cdfe2273b801c1df19c6d: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AG/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.iso885915/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_AU.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IN/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_GB/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_HK/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SG/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_BW/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_CA/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NG/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_NZ.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZA.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZM/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_DK/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IE@euro/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_SC.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.iso885915/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_US.utf8/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_ZW/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_IL/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_PAPER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_PAPER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_ADDRESS\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_ADDRESS: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_COLLATE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_COLLATE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_IDENTIFICATION\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_IDENTIFICATION: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MEASUREMENT\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MEASUREMENT: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MONETARY\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MONETARY: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_NAME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_NAME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_NUMERIC\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_NUMERIC: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_CTYPE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_CTYPE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MESSAGES/SYS_LC_MESSAGES\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_MESSAGES/SYS_LC_MESSAGES: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_TELEPHONE\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_TELEPHONE: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_TIME\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/locale/en_PH/LC_TIME: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/top_level.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/top_level.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/PKG-INFO\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/PKG-INFO: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/SOURCES.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/SOURCES.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/dependency_links.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/dependency_links.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/requires.txt\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse-0.4-py3.9.egg-info/requires.txt: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/config.py\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/config.py: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/configparser.py\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/configparser.py: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/ini.py\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/ini.py: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/utils.py\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/utils.py: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__init__.py\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__init__.py: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/__init__.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/__init__.cpython-39.opt-1.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/__init__.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/__init__.cpython-39.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/compat.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/compat.cpython-39.opt-1.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/compat.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/compat.cpython-39.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/ini.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/ini.cpython-39.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/config.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/config.cpython-39.opt-1.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/config.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/config.cpython-39.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/configparser.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/configparser.cpython-39.opt-1.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/configparser.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/configparser.cpython-39.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/ini.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/ini.cpython-39.opt-1.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/utils.cpython-39.opt-1.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/utils.cpython-39.opt-1.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/utils.cpython-39.pyc\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/__pycache__/utils.cpython-39.pyc: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/compat.py\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib/python3.9/site-packages/iniparse/compat.py: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpamc.so.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpamc.so.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libdb-5.3.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libdb-5.3.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libcrack.so.2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libcrack.so.2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libutempter.so.1.2.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libutempter.so.1.2.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libfdisk.so.1.1.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libfdisk.so.1.1.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser/libuser_files.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser/libuser_files.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser/libuser_ldap.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser/libuser_ldap.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser/libuser_shadow.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser/libuser_shadow.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser.so.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser.so.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libprocps.so.8.0.3\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libprocps.so.8.0.3: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/cracklib_dict.pwi\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/cracklib_dict.pwi: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/cracklib_dict.hwm\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/cracklib_dict.hwm: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libfdisk.so.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libfdisk.so.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libdb-5.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libdb-5.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/cracklib_dict.pwd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/cracklib_dict.pwd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libcrack.so.2.9.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libcrack.so.2.9.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libeconf.so.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libeconf.so.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libeconf.so.0.4.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libeconf.so.0.4.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam_misc.so.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam_misc.so.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser.so.1.5.2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libuser.so.1.5.2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam.so.0.85.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam.so.0.85.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_faillock.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_faillock.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_chroot.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_chroot.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_env.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_env.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_exec.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_exec.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_tty_audit.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_tty_audit.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_auth.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_auth.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_postgresok.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_postgresok.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_warn.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_warn.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_console.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_console.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_echo.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_echo.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_motd.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_motd.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_session.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_session.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_filter.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_filter.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_pwquality.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_pwquality.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_namespace.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_namespace.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_access.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_access.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_xauth.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_xauth.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_group.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_group.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_timestamp.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_timestamp.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_acct.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_acct.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_debug.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_debug.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_filter\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_filter: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_filter/upperLOWER\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_filter/upperLOWER: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_ftp.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_ftp.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_pwhistory.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_pwhistory.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_passwd.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix_passwd.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_localuser.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_localuser.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_loginuid.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_loginuid.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_mail.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_mail.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_succeed_if.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_succeed_if.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_time.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_time.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_setquota.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_setquota.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_stress.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_stress.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_keyinit.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_keyinit.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_limits.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_limits.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_selinux_permit.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_selinux_permit.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_lastlog.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_lastlog.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_mkhomedir.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_mkhomedir.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_nologin.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_nologin.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_permit.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_permit.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_userdb.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_userdb.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_deny.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_deny.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_faildelay.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_faildelay.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_rhosts.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_rhosts.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_securetty.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_securetty.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_shells.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_shells.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_unix.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_sepermit.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_sepermit.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_selinux.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_selinux.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_umask.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_umask.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_usertype.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_usertype.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_wheel.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_wheel.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_issue.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_issue.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_listfile.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_listfile.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_rootok.so\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/security/pam_rootok.so: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libutempter.so.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libutempter.so.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libprocps.so.8\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libprocps.so.8: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpwquality.so.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpwquality.so.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam.so.0\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam.so.0: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpamc.so.0.82.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpamc.so.0.82.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpwquality.so.1.0.2\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpwquality.so.1.0.2: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam_misc.so.0.82.1\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/lib64/libpam_misc.so.0.82.1: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_copy_cacerts\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_copy_cacerts: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_extend_start\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_extend_start: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_httpd_setup\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_httpd_setup: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_set_configs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_set_configs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_start\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/kolla_start: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/uid_gid_manage\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/local/bin/uid_gid_manage: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsck.cramfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsck.cramfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lusermod\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lusermod: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pwhistory_helper\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pwhistory_helper: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsfreeze\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsfreeze: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/luseradd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/luseradd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/luserdel\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/luserdel: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-format\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-format: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/runuser\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/runuser: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/sysctl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/sysctl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/chcpu\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/chcpu: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lnewusers\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lnewusers: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lgroupdel\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lgroupdel: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/sfdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/sfdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/sulogin\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/sulogin: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blkid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blkid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-check\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-check: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/delpart\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/delpart: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkfs.cramfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkfs.cramfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cfdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cfdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/clock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/clock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/wipefs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/wipefs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fdisk\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fdisk: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsck.minix\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsck.minix: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/hwclock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/hwclock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blkdiscard\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blkdiscard: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/ldattach\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/ldattach: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lpasswd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lpasswd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkdict\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkdict: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/unix_chkpwd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/unix_chkpwd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/addpart\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/addpart: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/ctrlaltdel\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/ctrlaltdel: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/findfs\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/findfs: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkhomedir_helper\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkhomedir_helper: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pam_console_apply\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pam_console_apply: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lgroupmod\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lgroupmod: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/packer\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/packer: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/swapon\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/swapon: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pam_namespace_helper\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pam_namespace_helper: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pivot_root\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pivot_root: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/readprofile\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/readprofile: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/rtcwake\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/rtcwake: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blockdev\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blockdev: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pidof\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pidof: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/agetty\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/agetty: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/swapoff\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/swapoff: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkfs.minix\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkfs.minix: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/zramctl\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/zramctl: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blkzone\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/blkzone: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/rfkill\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/rfkill: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/create-cracklib-dict\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/create-cracklib-dict: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lchage\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lchage: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/partx\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/partx: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsck\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fsck: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fstrim\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fstrim: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/swaplabel\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/swaplabel: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/switch_root\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/switch_root: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/nologin\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/nologin: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fdformat\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/fdformat: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-packer\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-packer: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/visudo\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/visudo: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/faillock\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/faillock: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkswap\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/mkswap: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/unix_update\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/unix_update: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lgroupadd\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lgroupadd: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/losetup\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/losetup: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/resizepart\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/resizepart: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-unpacker\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/cracklib-unpacker: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pam_timestamp_check\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/pam_timestamp_check: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain podman[239012]: time="2025-11-28T09:37:54Z" level=error msg="Can not stat \"/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lid\": lstat /var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/merged/usr/sbin/lid: no such file or directory"
Nov 28 09:37:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:37:54 np0005538515.localdomain rsyslogd[758]: imjournal from <localhost:podman>: begin to drop messages due to rate-limiting
Nov 28 09:37:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449-merged.mount: Deactivated successfully.
Nov 28 09:37:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79-merged.mount: Deactivated successfully.
Nov 28 09:37:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:37:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9648 DF PROTO=TCP SPT=44770 DPT=9105 SEQ=2579397095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD706630000000001030307) 
Nov 28 09:37:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:37:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:37:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9649 DF PROTO=TCP SPT=44770 DPT=9105 SEQ=2579397095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD70A7B0000000001030307) 
Nov 28 09:37:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53415 DF PROTO=TCP SPT=36928 DPT=9100 SEQ=3824350269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD70AFA0000000001030307) 
Nov 28 09:37:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:38:00 np0005538515.localdomain podman[242092]: 2025-11-28 09:38:00.309918212 +0000 UTC m=+0.096899187 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 09:38:00 np0005538515.localdomain podman[242092]: 2025-11-28 09:38:00.348130464 +0000 UTC m=+0.135111439 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:38:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:00 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:38:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59100 DF PROTO=TCP SPT=53552 DPT=9105 SEQ=3078962166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD716FA0000000001030307) 
Nov 28 09:38:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:03 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:03 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:04 np0005538515.localdomain sudo[242111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:38:04 np0005538515.localdomain sudo[242111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:38:04 np0005538515.localdomain sudo[242111]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:04 np0005538515.localdomain sudo[242129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:38:04 np0005538515.localdomain sudo[242129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:38:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9651 DF PROTO=TCP SPT=44770 DPT=9105 SEQ=2579397095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7223B0000000001030307) 
Nov 28 09:38:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:38:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8bb70caa6e9f6f4d170c3a5868421bfc24d38542c14f6a27a1edf3bcfdc45d32-merged.mount: Deactivated successfully.
Nov 28 09:38:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8bb70caa6e9f6f4d170c3a5868421bfc24d38542c14f6a27a1edf3bcfdc45d32-merged.mount: Deactivated successfully.
Nov 28 09:38:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:06 np0005538515.localdomain podman[242160]: 2025-11-28 09:38:06.659469038 +0000 UTC m=+0.890430381 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter)
Nov 28 09:38:06 np0005538515.localdomain podman[242160]: 2025-11-28 09:38:06.704430161 +0000 UTC m=+0.935391494 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:38:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21-merged.mount: Deactivated successfully.
Nov 28 09:38:06 np0005538515.localdomain sudo[242129]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:07 np0005538515.localdomain sudo[242200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:38:07 np0005538515.localdomain sudo[242200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:38:07 np0005538515.localdomain sudo[242200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:38:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:08 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:38:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25120 DF PROTO=TCP SPT=52470 DPT=9102 SEQ=3247818623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD733BA0000000001030307) 
Nov 28 09:38:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:09 np0005538515.localdomain sshd[242218]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:38:09 np0005538515.localdomain sshd[242218]: Accepted publickey for zuul from 192.168.122.30 port 59550 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:38:09 np0005538515.localdomain systemd-logind[763]: New session 56 of user zuul.
Nov 28 09:38:09 np0005538515.localdomain systemd[1]: Started Session 56 of User zuul.
Nov 28 09:38:09 np0005538515.localdomain sshd[242218]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:38:10 np0005538515.localdomain sudo[242312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhjyqngpauezhetoysgkftwwclxyrymh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322690.0146484-2802-51745968494279/AnsiballZ_podman_container_info.py
Nov 28 09:38:10 np0005538515.localdomain sudo[242312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:10 np0005538515.localdomain python3.9[242314]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 28 09:38:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048-merged.mount: Deactivated successfully.
Nov 28 09:38:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cb78a9787fbfdee8df647dff935d3e6e34a25076546a1ccbc8a68d8c48f6925c-merged.mount: Deactivated successfully.
Nov 28 09:38:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35-merged.mount: Deactivated successfully.
Nov 28 09:38:12 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:12 np0005538515.localdomain sudo[242312]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538515.localdomain sudo[242435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbhsvedwugdsgtrbobkvttafcbvwfvel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322692.752417-2813-267828089035498/AnsiballZ_podman_container_exec.py
Nov 28 09:38:13 np0005538515.localdomain sudo[242435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:13 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9652 DF PROTO=TCP SPT=44770 DPT=9105 SEQ=2579397095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD742FA0000000001030307) 
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538515.localdomain python3.9[242437]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:13 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: Started libpod-conmon-98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.scope.
Nov 28 09:38:13 np0005538515.localdomain podman[242438]: 2025-11-28 09:38:13.410279606 +0000 UTC m=+0.116915735 container exec 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 09:38:13 np0005538515.localdomain podman[242438]: 2025-11-28 09:38:13.443391637 +0000 UTC m=+0.150027726 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938-merged.mount: Deactivated successfully.
Nov 28 09:38:13 np0005538515.localdomain sudo[242435]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29055 DF PROTO=TCP SPT=55396 DPT=9882 SEQ=3996363091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD746FA0000000001030307) 
Nov 28 09:38:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:14 np0005538515.localdomain sudo[242573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtrvnkboqnswjnqjaxukotaegqzivnai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322694.374887-2821-230838213335855/AnsiballZ_podman_container_exec.py
Nov 28 09:38:14 np0005538515.localdomain sudo[242573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:14 np0005538515.localdomain python3.9[242575]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:38:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1343 DF PROTO=TCP SPT=50466 DPT=9100 SEQ=3837675688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD750BB0000000001030307) 
Nov 28 09:38:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:16 np0005538515.localdomain systemd[1]: libpod-conmon-98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.scope: Deactivated successfully.
Nov 28 09:38:16 np0005538515.localdomain systemd[1]: Started libpod-conmon-98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.scope.
Nov 28 09:38:16 np0005538515.localdomain podman[242576]: 2025-11-28 09:38:16.856934011 +0000 UTC m=+1.893095950 container exec 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 09:38:16 np0005538515.localdomain podman[242576]: 2025-11-28 09:38:16.886119898 +0000 UTC m=+1.922281847 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:38:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:38:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:38:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b-merged.mount: Deactivated successfully.
Nov 28 09:38:18 np0005538515.localdomain podman[242587]: 2025-11-28 09:38:18.736309198 +0000 UTC m=+2.236348990 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:38:18 np0005538515.localdomain podman[242587]: 2025-11-28 09:38:18.766026202 +0000 UTC m=+2.266066004 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:38:18 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:18 np0005538515.localdomain sudo[242573]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:18 np0005538515.localdomain podman[242615]: 2025-11-28 09:38:18.875207244 +0000 UTC m=+0.416716201 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:38:18 np0005538515.localdomain podman[242615]: 2025-11-28 09:38:18.910483933 +0000 UTC m=+0.451992870 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:38:18 np0005538515.localdomain podman[242615]: unhealthy
Nov 28 09:38:18 np0005538515.localdomain podman[242616]: 2025-11-28 09:38:18.917233375 +0000 UTC m=+0.467602800 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:38:19 np0005538515.localdomain podman[242616]: 2025-11-28 09:38:19.000539794 +0000 UTC m=+0.550909229 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:38:19 np0005538515.localdomain podman[242616]: unhealthy
Nov 28 09:38:19 np0005538515.localdomain sudo[242767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lghwrlwuoenaaxxrbumysoedwxdhkonk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322698.9971051-2829-272000264875603/AnsiballZ_file.py
Nov 28 09:38:19 np0005538515.localdomain sudo[242767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:19 np0005538515.localdomain python3.9[242769]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:38:19 np0005538515.localdomain sudo[242767]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:20 np0005538515.localdomain sudo[242877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvpiauotofbmnpcwhlwyphujmicqqjgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322699.8023496-2838-205749075555051/AnsiballZ_podman_container_info.py
Nov 28 09:38:20 np0005538515.localdomain sudo[242877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:20 np0005538515.localdomain python3.9[242879]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 28 09:38:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1344 DF PROTO=TCP SPT=50466 DPT=9100 SEQ=3837675688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7607A0000000001030307) 
Nov 28 09:38:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: libpod-conmon-98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.scope: Deactivated successfully.
Nov 28 09:38:21 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:21 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:38:21 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:38:21 np0005538515.localdomain podman[242893]: 2025-11-28 09:38:21.603181527 +0000 UTC m=+0.712398745 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:21 np0005538515.localdomain podman[242893]: 2025-11-28 09:38:21.713625379 +0000 UTC m=+0.822842577 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:38:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:38:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:23 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:38:23 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:23 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:23 np0005538515.localdomain podman[242917]: 2025-11-28 09:38:23.8784912 +0000 UTC m=+0.265855738 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:38:23 np0005538515.localdomain podman[242917]: 2025-11-28 09:38:23.885625155 +0000 UTC m=+0.272989693 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:38:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:38:25 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:25 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:25 np0005538515.localdomain sudo[242877]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:25 np0005538515.localdomain sudo[243048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osocylqwddccvdaofbxsrpqcqshthdoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322705.2617018-2846-30640720753343/AnsiballZ_podman_container_exec.py
Nov 28 09:38:25 np0005538515.localdomain sudo[243048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:25 np0005538515.localdomain python3.9[243050]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:25 np0005538515.localdomain systemd[1]: Started libpod-conmon-b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.scope.
Nov 28 09:38:25 np0005538515.localdomain podman[243051]: 2025-11-28 09:38:25.880111811 +0000 UTC m=+0.129694189 container exec b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:38:25 np0005538515.localdomain podman[243051]: 2025-11-28 09:38:25.918563449 +0000 UTC m=+0.168145807 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 28 09:38:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4d267351eb91c27e496fa400ef9055b36048428ec01962767ba6b671d1258ac4-merged.mount: Deactivated successfully.
Nov 28 09:38:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1616 DF PROTO=TCP SPT=52248 DPT=9105 SEQ=2192320128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD77B930000000001030307) 
Nov 28 09:38:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4d267351eb91c27e496fa400ef9055b36048428ec01962767ba6b671d1258ac4-merged.mount: Deactivated successfully.
Nov 28 09:38:27 np0005538515.localdomain sudo[243048]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:28 np0005538515.localdomain systemd[1]: libpod-conmon-b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.scope: Deactivated successfully.
Nov 28 09:38:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1617 DF PROTO=TCP SPT=52248 DPT=9105 SEQ=2192320128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD77FBB0000000001030307) 
Nov 28 09:38:28 np0005538515.localdomain sudo[243187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cggcyxzmdztshddhgciskrurgqbrlthx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322708.467787-2854-101034274825951/AnsiballZ_podman_container_exec.py
Nov 28 09:38:28 np0005538515.localdomain sudo[243187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59567 DF PROTO=TCP SPT=41688 DPT=9882 SEQ=2582391057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD780BC0000000001030307) 
Nov 28 09:38:28 np0005538515.localdomain python3.9[243189]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:29 np0005538515.localdomain systemd[1]: Started libpod-conmon-b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.scope.
Nov 28 09:38:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:29 np0005538515.localdomain podman[243190]: 2025-11-28 09:38:29.081978347 +0000 UTC m=+0.101117278 container exec b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:38:29 np0005538515.localdomain podman[243190]: 2025-11-28 09:38:29.113470531 +0000 UTC m=+0.132609502 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:38:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-897f35829b1f881949b1c333f7f4948d19933191339ff7279e3c8582c9dcbd21-merged.mount: Deactivated successfully.
Nov 28 09:38:30 np0005538515.localdomain sudo[243187]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:38:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:31 np0005538515.localdomain systemd[1]: libpod-conmon-b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.scope: Deactivated successfully.
Nov 28 09:38:31 np0005538515.localdomain podman[243220]: 2025-11-28 09:38:31.501379327 +0000 UTC m=+0.109832068 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:38:31 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6560 DF PROTO=TCP SPT=51392 DPT=9105 SEQ=1980500986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD78AFA0000000001030307) 
Nov 28 09:38:31 np0005538515.localdomain podman[243220]: 2025-11-28 09:38:31.544659986 +0000 UTC m=+0.153112727 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 09:38:31 np0005538515.localdomain sudo[243346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvzvchlwpqvljqmlqzchwribttsrpbkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322711.6769993-2862-86080943120917/AnsiballZ_file.py
Nov 28 09:38:31 np0005538515.localdomain sudo[243346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:32 np0005538515.localdomain python3.9[243348]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:38:32 np0005538515.localdomain sudo[243346]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:32 np0005538515.localdomain systemd[1]: tmp-crun.8OrNAS.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:32 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:38:32 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:32 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:32 np0005538515.localdomain sudo[243456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bywagsyvmihpxhvkgcdhlliojinitadm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322712.426353-2871-68006846230750/AnsiballZ_podman_container_info.py
Nov 28 09:38:32 np0005538515.localdomain sudo[243456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:32 np0005538515.localdomain python3.9[243458]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 28 09:38:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:38:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 4784 writes, 21K keys, 4784 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4784 writes, 637 syncs, 7.51 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:38:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1619 DF PROTO=TCP SPT=52248 DPT=9105 SEQ=2192320128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7977B0000000001030307) 
Nov 28 09:38:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Nov 28 09:38:35 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cb78a9787fbfdee8df647dff935d3e6e34a25076546a1ccbc8a68d8c48f6925c-merged.mount: Deactivated successfully.
Nov 28 09:38:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538515.localdomain sudo[243456]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:37 np0005538515.localdomain sudo[243579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdhdwtjarrruqbtxhjdtcoxmexulxusx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322717.3041387-2879-39809741198089/AnsiballZ_podman_container_exec.py
Nov 28 09:38:37 np0005538515.localdomain sudo[243579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:37 np0005538515.localdomain python3.9[243581]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Nov 28 09:38:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:38:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.2 total, 600.0 interval
                                                          Cumulative writes: 5781 writes, 25K keys, 5781 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5781 writes, 729 syncs, 7.93 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:38:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:38:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938-merged.mount: Deactivated successfully.
Nov 28 09:38:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:38 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:38 np0005538515.localdomain systemd[1]: Started libpod-conmon-cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.scope.
Nov 28 09:38:38 np0005538515.localdomain podman[243582]: 2025-11-28 09:38:38.954861268 +0000 UTC m=+1.164540160 container exec cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:38 np0005538515.localdomain podman[243582]: 2025-11-28 09:38:38.962481064 +0000 UTC m=+1.172159946 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 28 09:38:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53457 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=1240152433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7A8BB0000000001030307) 
Nov 28 09:38:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:41 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:41 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:41 np0005538515.localdomain sudo[243579]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:41 np0005538515.localdomain podman[243593]: 2025-11-28 09:38:41.430495378 +0000 UTC m=+2.538195366 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 28 09:38:41 np0005538515.localdomain podman[243593]: 2025-11-28 09:38:41.481589048 +0000 UTC m=+2.589289056 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 28 09:38:41 np0005538515.localdomain sudo[243733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmxywiiiyanggjpnjhflqicoglkdfxvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322721.597089-2887-257687083673827/AnsiballZ_podman_container_exec.py
Nov 28 09:38:41 np0005538515.localdomain sudo[243733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:42 np0005538515.localdomain python3.9[243735]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:42 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1620 DF PROTO=TCP SPT=52248 DPT=9105 SEQ=2192320128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7B6FA0000000001030307) 
Nov 28 09:38:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:43 np0005538515.localdomain systemd[1]: libpod-conmon-cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.scope: Deactivated successfully.
Nov 28 09:38:43 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:43 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:43 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:38:43 np0005538515.localdomain systemd[1]: Started libpod-conmon-cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.scope.
Nov 28 09:38:43 np0005538515.localdomain podman[243736]: 2025-11-28 09:38:43.652392499 +0000 UTC m=+1.482877426 container exec cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:38:43 np0005538515.localdomain podman[243736]: 2025-11-28 09:38:43.687603318 +0000 UTC m=+1.518088125 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:38:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61687 DF PROTO=TCP SPT=49344 DPT=9101 SEQ=566542841 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7BC900000000001030307) 
Nov 28 09:38:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:44 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:38:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a820d2d170f2b0bbf4a680f8c0da82218646a321cb82318df9e6b8161dc1d2c6-merged.mount: Deactivated successfully.
Nov 28 09:38:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a820d2d170f2b0bbf4a680f8c0da82218646a321cb82318df9e6b8161dc1d2c6-merged.mount: Deactivated successfully.
Nov 28 09:38:45 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:45 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:45 np0005538515.localdomain sudo[243733]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:46 np0005538515.localdomain sudo[243873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okdtntkrubzusxskecurnebaraqtvbuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322726.0843024-2895-219782178963229/AnsiballZ_file.py
Nov 28 09:38:46 np0005538515.localdomain sudo[243873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:46 np0005538515.localdomain python3.9[243875]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:38:46 np0005538515.localdomain sudo[243873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21688 DF PROTO=TCP SPT=40194 DPT=9100 SEQ=795147180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7C5BB0000000001030307) 
Nov 28 09:38:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:47 np0005538515.localdomain sudo[243983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktoldbvsqcylozefbwvsfwzumsavvwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322726.8082604-2904-139392998794561/AnsiballZ_podman_container_info.py
Nov 28 09:38:47 np0005538515.localdomain sudo[243983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:47 np0005538515.localdomain python3.9[243985]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 28 09:38:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:47.517 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:47 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:47.545 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:48 np0005538515.localdomain systemd[1]: libpod-conmon-cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.scope: Deactivated successfully.
Nov 28 09:38:48 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:48 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:50.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:50.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:50.075 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:38:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:50.075 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:38:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:50.106 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:38:50 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:50.107 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:38:50 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:50 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21689 DF PROTO=TCP SPT=40194 DPT=9100 SEQ=795147180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7D57A0000000001030307) 
Nov 28 09:38:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:38:50.818 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:38:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:38:50.819 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:38:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:38:50.819 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.075 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.092 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.092 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.092 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.092 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.093 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:38:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.533 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:38:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:38:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.711 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.712 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13007MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.712 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.712 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.772 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.774 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:38:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:51.794 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:38:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:52.256 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:38:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:52.263 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:38:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:52.279 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:38:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:52.281 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:38:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:52.282 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:38:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4d267351eb91c27e496fa400ef9055b36048428ec01962767ba6b671d1258ac4-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4d267351eb91c27e496fa400ef9055b36048428ec01962767ba6b671d1258ac4-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538515.localdomain sudo[243983]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:53 np0005538515.localdomain podman[244022]: 2025-11-28 09:38:53.264432662 +0000 UTC m=+1.711514467 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:38:53 np0005538515.localdomain podman[244022]: 2025-11-28 09:38:53.271389987 +0000 UTC m=+1.718471832 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:38:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:53.280 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:38:53.281 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:38:53 np0005538515.localdomain podman[244021]: 2025-11-28 09:38:53.353862718 +0000 UTC m=+1.800932162 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:38:53 np0005538515.localdomain podman[244021]: 2025-11-28 09:38:53.361288428 +0000 UTC m=+1.808357882 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Nov 28 09:38:53 np0005538515.localdomain podman[244021]: unhealthy
Nov 28 09:38:53 np0005538515.localdomain sudo[244200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iptsefblvniripyscydqfpdvjckugdpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322733.4844522-2912-48188539312643/AnsiballZ_podman_container_exec.py
Nov 28 09:38:53 np0005538515.localdomain sudo[244200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:38:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Nov 28 09:38:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538515.localdomain python3.9[244202]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Failed with result 'exit-code'.
Nov 28 09:38:54 np0005538515.localdomain podman[244203]: 2025-11-28 09:38:54.140192139 +0000 UTC m=+0.249849010 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: Started libpod-conmon-783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.scope.
Nov 28 09:38:54 np0005538515.localdomain podman[244203]: 2025-11-28 09:38:54.231376759 +0000 UTC m=+0.341033630 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 28 09:38:54 np0005538515.localdomain podman[244213]: 2025-11-28 09:38:54.233743942 +0000 UTC m=+0.169784933 container exec 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:38:54 np0005538515.localdomain podman[244023]: 2025-11-28 09:38:54.313482628 +0000 UTC m=+2.753802014 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:38:54 np0005538515.localdomain podman[244023]: 2025-11-28 09:38:54.350504024 +0000 UTC m=+2.790823470 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:38:54 np0005538515.localdomain podman[244023]: unhealthy
Nov 28 09:38:54 np0005538515.localdomain podman[244213]: 2025-11-28 09:38:54.367500929 +0000 UTC m=+0.303541910 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:38:54 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:38:54 np0005538515.localdomain sudo[244200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:54 np0005538515.localdomain sudo[244373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rahnmkcwqxmtutwjnjibmwhuiuikkxjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322734.6896124-2920-130714686427981/AnsiballZ_podman_container_exec.py
Nov 28 09:38:54 np0005538515.localdomain sudo[244373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:55 np0005538515.localdomain python3.9[244375]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:38:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:38:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3850d276a9594c52a78e85d7b58db016dc835caf89f3a263b0f9d37a3754a60d-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3850d276a9594c52a78e85d7b58db016dc835caf89f3a263b0f9d37a3754a60d-merged.mount: Deactivated successfully.
Nov 28 09:38:56 np0005538515.localdomain systemd[1]: libpod-conmon-783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.scope: Deactivated successfully.
Nov 28 09:38:57 np0005538515.localdomain systemd[1]: Started libpod-conmon-783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.scope.
Nov 28 09:38:57 np0005538515.localdomain podman[244376]: 2025-11-28 09:38:57.032405733 +0000 UTC m=+1.821155168 container exec 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:38:57 np0005538515.localdomain podman[244376]: 2025-11-28 09:38:57.064453333 +0000 UTC m=+1.853202808 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:38:57 np0005538515.localdomain podman[244388]: 2025-11-28 09:38:57.074190875 +0000 UTC m=+1.764773234 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:38:57 np0005538515.localdomain podman[244388]: 2025-11-28 09:38:57.083038219 +0000 UTC m=+1.773620558 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:38:57 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16163 DF PROTO=TCP SPT=58874 DPT=9105 SEQ=2757640128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7F0C30000000001030307) 
Nov 28 09:38:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:38:57 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:38:57 np0005538515.localdomain sudo[244373]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:58 np0005538515.localdomain sudo[244536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nupdqkjxjhmlsntitptzbzfqqsfkslqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322738.108831-2928-81416302628188/AnsiballZ_file.py
Nov 28 09:38:58 np0005538515.localdomain sudo[244536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:58 np0005538515.localdomain python3.9[244538]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:38:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16164 DF PROTO=TCP SPT=58874 DPT=9105 SEQ=2757640128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7F4BA0000000001030307) 
Nov 28 09:38:58 np0005538515.localdomain sudo[244536]: pam_unix(sudo:session): session closed for user root
Nov 28 09:38:58 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21690 DF PROTO=TCP SPT=40194 DPT=9100 SEQ=795147180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD7F4FA0000000001030307) 
Nov 28 09:38:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:38:59 np0005538515.localdomain sudo[244646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnhwkzrddwzibgopqmongrqjzdlzajth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322738.802961-2937-268799900622580/AnsiballZ_podman_container_info.py
Nov 28 09:38:59 np0005538515.localdomain sudo[244646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:38:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 28 09:38:59 np0005538515.localdomain systemd[1]: libpod-conmon-783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.scope: Deactivated successfully.
Nov 28 09:38:59 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:59 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:38:59 np0005538515.localdomain python3.9[244648]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 28 09:38:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:38:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:38:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:39:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:39:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:39:01 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:01 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:01 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9654 DF PROTO=TCP SPT=44770 DPT=9105 SEQ=2579397095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD800FA0000000001030307) 
Nov 28 09:39:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:39:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:39:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 28 09:39:03 np0005538515.localdomain sudo[244646]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:03 np0005538515.localdomain podman[244662]: 2025-11-28 09:39:03.629997751 +0000 UTC m=+0.723088596 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Nov 28 09:39:03 np0005538515.localdomain podman[244662]: 2025-11-28 09:39:03.672513085 +0000 UTC m=+0.765603950 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:39:04 np0005538515.localdomain sudo[244788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxvbsrzgbyyvjxogbkyvoeqrhhapbbpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322743.795652-2945-163518034414525/AnsiballZ_podman_container_exec.py
Nov 28 09:39:04 np0005538515.localdomain sudo[244788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:04 np0005538515.localdomain python3.9[244790]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:39:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:39:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5-merged.mount: Deactivated successfully.
Nov 28 09:39:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5-merged.mount: Deactivated successfully.
Nov 28 09:39:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16166 DF PROTO=TCP SPT=58874 DPT=9105 SEQ=2757640128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD80C7A0000000001030307) 
Nov 28 09:39:04 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:04 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:04 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:39:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.scope.
Nov 28 09:39:04 np0005538515.localdomain podman[244791]: 2025-11-28 09:39:04.877712371 +0000 UTC m=+0.603654311 container exec 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:39:04 np0005538515.localdomain podman[244791]: 2025-11-28 09:39:04.908244636 +0000 UTC m=+0.634186496 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:39:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:05 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:06 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:06 np0005538515.localdomain sudo[244788]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:06 np0005538515.localdomain sudo[244927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byamtgiasjgtmoohuyfjsrspfieegxcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322746.2957985-2953-252178421992512/AnsiballZ_podman_container_exec.py
Nov 28 09:39:06 np0005538515.localdomain sudo[244927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:06 np0005538515.localdomain python3.9[244929]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:39:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:07 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:07 np0005538515.localdomain systemd[1]: libpod-conmon-56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.scope: Deactivated successfully.
Nov 28 09:39:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:07 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:07 np0005538515.localdomain systemd[1]: Started libpod-conmon-56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.scope.
Nov 28 09:39:07 np0005538515.localdomain podman[244930]: 2025-11-28 09:39:07.417300259 +0000 UTC m=+0.641111750 container exec 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:39:07 np0005538515.localdomain podman[244930]: 2025-11-28 09:39:07.451452185 +0000 UTC m=+0.675263696 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:39:07 np0005538515.localdomain sudo[244958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:39:07 np0005538515.localdomain sudo[244958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:39:07 np0005538515.localdomain sudo[244958]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:08 np0005538515.localdomain sudo[244976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:39:08 np0005538515.localdomain sudo[244976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:39:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21112 DF PROTO=TCP SPT=49268 DPT=9102 SEQ=200872579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD81DFA0000000001030307) 
Nov 28 09:39:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a820d2d170f2b0bbf4a680f8c0da82218646a321cb82318df9e6b8161dc1d2c6-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a820d2d170f2b0bbf4a680f8c0da82218646a321cb82318df9e6b8161dc1d2c6-merged.mount: Deactivated successfully.
Nov 28 09:39:09 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:09 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:09 np0005538515.localdomain sudo[244927]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:09 np0005538515.localdomain auditd[719]: Audit daemon rotating log files
Nov 28 09:39:09 np0005538515.localdomain sudo[245115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxfpugycjwefnqaqlkoojwhinbzvbknv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322749.7282705-2961-190003891959309/AnsiballZ_file.py
Nov 28 09:39:10 np0005538515.localdomain sudo[245115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:10 np0005538515.localdomain python3.9[245117]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:10 np0005538515.localdomain sudo[245115]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:10 np0005538515.localdomain sudo[245225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsclpnfsgvrkzgkrfbismaoqasvsiufp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322750.4539156-2970-44941782412130/AnsiballZ_podman_container_info.py
Nov 28 09:39:10 np0005538515.localdomain sudo[245225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:10 np0005538515.localdomain python3.9[245227]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 28 09:39:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:12 np0005538515.localdomain systemd[1]: libpod-conmon-56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.scope: Deactivated successfully.
Nov 28 09:39:12 np0005538515.localdomain sudo[244976]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:12 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16167 DF PROTO=TCP SPT=58874 DPT=9105 SEQ=2757640128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD82CFB0000000001030307) 
Nov 28 09:39:13 np0005538515.localdomain sudo[245260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:39:13 np0005538515.localdomain sudo[245260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:39:13 np0005538515.localdomain sudo[245260]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:39:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-29e7cf6abe6a2bbfc58462ae307ac9362023c413708070730336bba274ac12e7-merged.mount: Deactivated successfully.
Nov 28 09:39:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-29e7cf6abe6a2bbfc58462ae307ac9362023c413708070730336bba274ac12e7-merged.mount: Deactivated successfully.
Nov 28 09:39:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:39:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:14 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55274 DF PROTO=TCP SPT=39722 DPT=9101 SEQ=273686329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD831C00000000001030307) 
Nov 28 09:39:14 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:14 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:14 np0005538515.localdomain podman[245278]: 2025-11-28 09:39:14.358751253 +0000 UTC m=+0.471614097 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7)
Nov 28 09:39:14 np0005538515.localdomain podman[245278]: 2025-11-28 09:39:14.400588587 +0000 UTC m=+0.513451451 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, version=9.6)
Nov 28 09:39:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:15 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:16 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1313 DF PROTO=TCP SPT=39298 DPT=9100 SEQ=639885026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD83AFA0000000001030307) 
Nov 28 09:39:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:16 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:39:17 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:17 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:17 np0005538515.localdomain sudo[245225]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:17 np0005538515.localdomain sudo[245402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otjlsutsnnqzreuwbfutikybqduogwwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322757.3256326-2978-143337226561091/AnsiballZ_podman_container_exec.py
Nov 28 09:39:17 np0005538515.localdomain sudo[245402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:17 np0005538515.localdomain python3.9[245404]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:39:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:17 np0005538515.localdomain systemd[1]: Started libpod-conmon-d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.scope.
Nov 28 09:39:17 np0005538515.localdomain podman[245405]: 2025-11-28 09:39:17.892051515 +0000 UTC m=+0.110480707 container exec d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:39:17 np0005538515.localdomain podman[245405]: 2025-11-28 09:39:17.924657964 +0000 UTC m=+0.143087126 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:39:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:18 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:19 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:19 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:19 np0005538515.localdomain sudo[245402]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:19 np0005538515.localdomain sudo[245542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgvzwqbmqxgmeyoqclxyslwlfdwdokbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322759.4957893-2986-166769852612844/AnsiballZ_podman_container_exec.py
Nov 28 09:39:19 np0005538515.localdomain sudo[245542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:20 np0005538515.localdomain python3.9[245544]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:39:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:20 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1314 DF PROTO=TCP SPT=39298 DPT=9100 SEQ=639885026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD84ABA0000000001030307) 
Nov 28 09:39:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 28 09:39:21 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3850d276a9594c52a78e85d7b58db016dc835caf89f3a263b0f9d37a3754a60d-merged.mount: Deactivated successfully.
Nov 28 09:39:21 np0005538515.localdomain systemd[1]: libpod-conmon-d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.scope: Deactivated successfully.
Nov 28 09:39:22 np0005538515.localdomain systemd[1]: Started libpod-conmon-d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.scope.
Nov 28 09:39:22 np0005538515.localdomain podman[245545]: 2025-11-28 09:39:22.026732318 +0000 UTC m=+2.007442540 container exec d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:39:22 np0005538515.localdomain podman[245545]: 2025-11-28 09:39:22.058778598 +0000 UTC m=+2.039488820 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:39:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:39:22 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:22 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:22 np0005538515.localdomain sudo[245542]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:23 np0005538515.localdomain sudo[245682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgdivzmscgqqajaukihqnbcbfnjbqutx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322763.1303353-2994-209967573247765/AnsiballZ_file.py
Nov 28 09:39:23 np0005538515.localdomain sudo[245682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:23 np0005538515.localdomain python3.9[245684]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:23 np0005538515.localdomain sudo[245682]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:39:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 28 09:39:23 np0005538515.localdomain systemd[1]: libpod-conmon-d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.scope: Deactivated successfully.
Nov 28 09:39:23 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:23 np0005538515.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 28 09:39:24 np0005538515.localdomain sudo[245792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejusrkavemcicjeqlmfzpgsryzioimxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322763.8859732-3003-192670637310945/AnsiballZ_podman_container_info.py
Nov 28 09:39:24 np0005538515.localdomain sudo[245792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:39:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:39:24 np0005538515.localdomain podman[245794]: 2025-11-28 09:39:24.283119967 +0000 UTC m=+0.098131357 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:39:24 np0005538515.localdomain podman[245796]: 2025-11-28 09:39:24.319903134 +0000 UTC m=+0.129901579 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:39:24 np0005538515.localdomain podman[245796]: 2025-11-28 09:39:24.353510914 +0000 UTC m=+0.163509319 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 28 09:39:24 np0005538515.localdomain podman[245794]: 2025-11-28 09:39:24.367497486 +0000 UTC m=+0.182508916 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm)
Nov 28 09:39:24 np0005538515.localdomain python3.9[245795]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 28 09:39:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:39:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:39:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-740c106cec230c8e88a51062b1e59b5e7fe0e9195732430d85360787d0335118-merged.mount: Deactivated successfully.
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-740c106cec230c8e88a51062b1e59b5e7fe0e9195732430d85360787d0335118-merged.mount: Deactivated successfully.
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:39:26 np0005538515.localdomain podman[239012]: time="2025-11-28T09:39:26Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Nov 28 09:39:26 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:34:22 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:39:26 np0005538515.localdomain podman[245839]: 2025-11-28 09:39:26.431277587 +0000 UTC m=+1.781797931 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:39:26 np0005538515.localdomain podman[245839]: 2025-11-28 09:39:26.455438084 +0000 UTC m=+1.805958488 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:39:26 np0005538515.localdomain podman[245839]: unhealthy
Nov 28 09:39:26 np0005538515.localdomain podman[245838]: 2025-11-28 09:39:26.591168142 +0000 UTC m=+1.943933636 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:39:26 np0005538515.localdomain podman[245838]: 2025-11-28 09:39:26.668335549 +0000 UTC m=+2.021101083 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Failed with result 'exit-code'.
Nov 28 09:39:26 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:39:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:27 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42447 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=3612664099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD865F30000000001030307) 
Nov 28 09:39:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 28 09:39:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5-merged.mount: Deactivated successfully.
Nov 28 09:39:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9ce8c6ec24615a8ac03ae2a4194714a4f44afbdc43ba4491ff44c91e34e068e5-merged.mount: Deactivated successfully.
Nov 28 09:39:27 np0005538515.localdomain sudo[245792]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:39:28 np0005538515.localdomain systemd[1]: tmp-crun.ta4Vop.mount: Deactivated successfully.
Nov 28 09:39:28 np0005538515.localdomain podman[245960]: 2025-11-28 09:39:28.224551101 +0000 UTC m=+0.103790341 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:39:28 np0005538515.localdomain podman[245960]: 2025-11-28 09:39:28.261482104 +0000 UTC m=+0.140721284 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:39:28 np0005538515.localdomain sudo[246019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqdvvhsyjtqedfvpnfvgaxrwbvwrglfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322767.9779074-3011-203987606690925/AnsiballZ_podman_container_exec.py
Nov 28 09:39:28 np0005538515.localdomain sudo[246019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:28 np0005538515.localdomain python3.9[246021]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:39:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42448 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=3612664099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD869FA0000000001030307) 
Nov 28 09:39:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:28 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1315 DF PROTO=TCP SPT=39298 DPT=9100 SEQ=639885026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD86AFA0000000001030307) 
Nov 28 09:39:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:39:28 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:39:29 np0005538515.localdomain systemd[1]: Started libpod-conmon-6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.scope.
Nov 28 09:39:29 np0005538515.localdomain podman[246022]: 2025-11-28 09:39:29.046134632 +0000 UTC m=+0.542656125 container exec 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:39:29 np0005538515.localdomain podman[246022]: 2025-11-28 09:39:29.081618859 +0000 UTC m=+0.578140372 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, 
io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:39:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:30 np0005538515.localdomain sudo[246019]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:30 np0005538515.localdomain sudo[246158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkryonjorkkzjtgolykjdywhtbpajwas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322770.4120843-3019-281116856518013/AnsiballZ_podman_container_exec.py
Nov 28 09:39:30 np0005538515.localdomain sudo[246158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:30 np0005538515.localdomain systemd[1]: libpod-conmon-6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.scope: Deactivated successfully.
Nov 28 09:39:30 np0005538515.localdomain python3.9[246160]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 09:39:31 np0005538515.localdomain systemd[1]: Started libpod-conmon-6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.scope.
Nov 28 09:39:31 np0005538515.localdomain podman[246161]: 2025-11-28 09:39:31.010959883 +0000 UTC m=+0.072642098 container exec 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:39:31 np0005538515.localdomain podman[246161]: 2025-11-28 09:39:31.014954977 +0000 UTC m=+0.076637162 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:39:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:31 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:31 np0005538515.localdomain sudo[246158]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:31 np0005538515.localdomain sudo[246299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uywpqbdaxxqrjymrblionsbyeyfwczqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322771.5306108-3027-217468882734746/AnsiballZ_file.py
Nov 28 09:39:31 np0005538515.localdomain sudo[246299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8835 DF PROTO=TCP SPT=35100 DPT=9882 SEQ=3246240453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8773A0000000001030307) 
Nov 28 09:39:32 np0005538515.localdomain python3.9[246301]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:32 np0005538515.localdomain sudo[246299]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Nov 28 09:39:32 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-29e7cf6abe6a2bbfc58462ae307ac9362023c413708070730336bba274ac12e7-merged.mount: Deactivated successfully.
Nov 28 09:39:32 np0005538515.localdomain systemd[1]: libpod-conmon-6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.scope: Deactivated successfully.
Nov 28 09:39:33 np0005538515.localdomain sudo[246409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbpcsdcdxckgnebyzgliuohboryrcpfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322773.1414928-3041-133378502959156/AnsiballZ_file.py
Nov 28 09:39:33 np0005538515.localdomain sudo[246409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:33 np0005538515.localdomain python3.9[246411]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:33 np0005538515.localdomain sudo[246409]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:34 np0005538515.localdomain sudo[246519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yejxffnlsnbrjxuqgivpgnjfebcalnmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322774.0695786-3071-62095258546325/AnsiballZ_stat.py
Nov 28 09:39:34 np0005538515.localdomain sudo[246519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:34 np0005538515.localdomain python3.9[246521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:34 np0005538515.localdomain sudo[246519]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42450 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=3612664099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD881BA0000000001030307) 
Nov 28 09:39:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:39:34 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:34 np0005538515.localdomain systemd[1]: tmp-crun.5FaFqu.mount: Deactivated successfully.
Nov 28 09:39:34 np0005538515.localdomain podman[246522]: 2025-11-28 09:39:34.977128364 +0000 UTC m=+0.083104542 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 28 09:39:34 np0005538515.localdomain podman[246522]: 2025-11-28 09:39:34.991378025 +0000 UTC m=+0.097354123 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 09:39:35 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:39:35 np0005538515.localdomain sudo[246626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjrrruqfhqpitchlnfimbzhwvxqscndp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322774.0695786-3071-62095258546325/AnsiballZ_copy.py
Nov 28 09:39:35 np0005538515.localdomain sudo[246626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:35 np0005538515.localdomain python3.9[246628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322774.0695786-3071-62095258546325/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:35 np0005538515.localdomain sudo[246626]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:36 np0005538515.localdomain sudo[246736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azimneheamdtqqmzspvgzwnjsoyiylys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322776.2479658-3118-158961361375278/AnsiballZ_file.py
Nov 28 09:39:36 np0005538515.localdomain sudo[246736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:36 np0005538515.localdomain python3.9[246738]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:36 np0005538515.localdomain sudo[246736]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 28 09:39:37 np0005538515.localdomain sudo[246846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evbmnqiqshffjwyzbvaspwfepacstquu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322777.0056329-3143-1326854953232/AnsiballZ_stat.py
Nov 28 09:39:37 np0005538515.localdomain sudo[246846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:37 np0005538515.localdomain python3.9[246848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:37 np0005538515.localdomain sudo[246846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:37 np0005538515.localdomain sudo[246903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqqdalopnoffsyewxrxfiayvzqgvicns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322777.0056329-3143-1326854953232/AnsiballZ_file.py
Nov 28 09:39:37 np0005538515.localdomain sudo[246903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:37 np0005538515.localdomain python3.9[246905]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:38 np0005538515.localdomain sudo[246903]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:38 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 28 09:39:38 np0005538515.localdomain sudo[247013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayoncpmrewkyozymzwoddcqbvbgiwtoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322778.3665621-3179-176950392128009/AnsiballZ_stat.py
Nov 28 09:39:38 np0005538515.localdomain sudo[247013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:38 np0005538515.localdomain python3.9[247015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:38 np0005538515.localdomain sudo[247013]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 28 09:39:39 np0005538515.localdomain sudo[247070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smvsvzgrosjhrcphkbymwfegrvqffbfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322778.3665621-3179-176950392128009/AnsiballZ_file.py
Nov 28 09:39:39 np0005538515.localdomain sudo[247070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34115 DF PROTO=TCP SPT=41640 DPT=9102 SEQ=2329822090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8933A0000000001030307) 
Nov 28 09:39:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 28 09:39:39 np0005538515.localdomain python3.9[247072]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.9d0rby5q recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:39 np0005538515.localdomain sudo[247070]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:39 np0005538515.localdomain sudo[247180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irufzwhcjnvlfarshbeeioqziqsyqbuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322779.6422184-3215-116563352426864/AnsiballZ_stat.py
Nov 28 09:39:39 np0005538515.localdomain sudo[247180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:40 np0005538515.localdomain python3.9[247182]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:40 np0005538515.localdomain sudo[247180]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:40 np0005538515.localdomain sudo[247237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usahiuzwnisknryhvqugrsdebxarrqjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322779.6422184-3215-116563352426864/AnsiballZ_file.py
Nov 28 09:39:40 np0005538515.localdomain sudo[247237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:40 np0005538515.localdomain python3.9[247239]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:40 np0005538515.localdomain sudo[247237]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:41 np0005538515.localdomain sudo[247347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltnyvllzsobmchdftbtrkepgmtjpyaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322780.9257033-3254-148876507504135/AnsiballZ_command.py
Nov 28 09:39:41 np0005538515.localdomain sudo[247347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:41 np0005538515.localdomain python3.9[247349]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:41 np0005538515.localdomain sudo[247347]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 28 09:39:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-740c106cec230c8e88a51062b1e59b5e7fe0e9195732430d85360787d0335118-merged.mount: Deactivated successfully.
Nov 28 09:39:42 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:34:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 140435 "" "Go-http-client/1.1"
Nov 28 09:39:42 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:39:42.038Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 09:39:42 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:39:42.038Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 09:39:42 np0005538515.localdomain podman_exporter[239219]: ts=2025-11-28T09:39:42.038Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Nov 28 09:39:42 np0005538515.localdomain sudo[247458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-meqwsvydibwqhrlepjayxomsebwiqmio ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322781.6870732-3278-265999003700235/AnsiballZ_edpm_nftables_from_files.py
Nov 28 09:39:42 np0005538515.localdomain sudo[247458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:42 np0005538515.localdomain python3[247460]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 09:39:42 np0005538515.localdomain sudo[247458]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:42 np0005538515.localdomain sudo[247568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhftfbwutiuxcqsbetgokbrfeowgsaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322782.643845-3302-52342475614987/AnsiballZ_stat.py
Nov 28 09:39:42 np0005538515.localdomain sudo[247568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:43 np0005538515.localdomain python3.9[247570]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:43 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42451 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=3612664099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8A2FA0000000001030307) 
Nov 28 09:39:43 np0005538515.localdomain sudo[247568]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:43 np0005538515.localdomain sudo[247625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpimfzfxxytqklalzhnjzsrbonztjgsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322782.643845-3302-52342475614987/AnsiballZ_file.py
Nov 28 09:39:43 np0005538515.localdomain sudo[247625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:43 np0005538515.localdomain python3.9[247627]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:43 np0005538515.localdomain sudo[247625]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:44 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27273 DF PROTO=TCP SPT=39318 DPT=9101 SEQ=3711898930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8A6F10000000001030307) 
Nov 28 09:39:44 np0005538515.localdomain sudo[247735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpcogetukhkchxoqnzjwzsyczewprrgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322784.5537672-3339-57678626471378/AnsiballZ_stat.py
Nov 28 09:39:44 np0005538515.localdomain sudo[247735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:45 np0005538515.localdomain python3.9[247737]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:45 np0005538515.localdomain sudo[247735]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:45 np0005538515.localdomain sudo[247792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otacwmxrrjakznwmrllzuirgbcoxtstz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322784.5537672-3339-57678626471378/AnsiballZ_file.py
Nov 28 09:39:45 np0005538515.localdomain sudo[247792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:45 np0005538515.localdomain python3.9[247794]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:45 np0005538515.localdomain sudo[247792]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:46 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60985 DF PROTO=TCP SPT=50342 DPT=9100 SEQ=1861691952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8B03B0000000001030307) 
Nov 28 09:39:46 np0005538515.localdomain sudo[247902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieqjwbiqfqbuxsiwvhqcyacievjwpgcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322785.8905966-3374-264173568758710/AnsiballZ_stat.py
Nov 28 09:39:46 np0005538515.localdomain sudo[247902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:46 np0005538515.localdomain python3.9[247904]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:46 np0005538515.localdomain sudo[247902]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:47 np0005538515.localdomain sudo[247959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rctjyxrstqlzblycuyesxtxvxdbsunqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322785.8905966-3374-264173568758710/AnsiballZ_file.py
Nov 28 09:39:47 np0005538515.localdomain sudo[247959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:39:47 np0005538515.localdomain systemd[1]: tmp-crun.f5qDEQ.mount: Deactivated successfully.
Nov 28 09:39:47 np0005538515.localdomain podman[247962]: 2025-11-28 09:39:47.332881738 +0000 UTC m=+0.097061513 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Nov 28 09:39:47 np0005538515.localdomain podman[247962]: 2025-11-28 09:39:47.373780023 +0000 UTC m=+0.137959758 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, vcs-type=git, version=9.6, name=ubi9-minimal)
Nov 28 09:39:47 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:39:47 np0005538515.localdomain python3.9[247961]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:47 np0005538515.localdomain sudo[247959]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:48.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:48 np0005538515.localdomain sudo[248087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssquumrpkhdufexoftsvzrcluaxurlam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322787.6465669-3410-142926391498312/AnsiballZ_stat.py
Nov 28 09:39:48 np0005538515.localdomain sudo[248087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:48 np0005538515.localdomain python3.9[248089]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:48 np0005538515.localdomain sudo[248087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:48 np0005538515.localdomain sudo[248144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evwxtuggsfoxddwxcmkrbyhcrohqiicx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322787.6465669-3410-142926391498312/AnsiballZ_file.py
Nov 28 09:39:48 np0005538515.localdomain sudo[248144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:48 np0005538515.localdomain python3.9[248146]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:48 np0005538515.localdomain sudo[248144]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:49 np0005538515.localdomain sudo[248254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rphlbeodcvvsnxergsaoskyqqpeczjmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322789.1298497-3446-238103872918656/AnsiballZ_stat.py
Nov 28 09:39:49 np0005538515.localdomain sudo[248254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:49 np0005538515.localdomain python3.9[248256]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:39:49 np0005538515.localdomain sudo[248254]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:50 np0005538515.localdomain sudo[248344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-schycwtjbblnyfoxyxeydynnmfcudcdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322789.1298497-3446-238103872918656/AnsiballZ_copy.py
Nov 28 09:39:50 np0005538515.localdomain sudo[248344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:50 np0005538515.localdomain python3.9[248346]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322789.1298497-3446-238103872918656/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:50 np0005538515.localdomain sudo[248344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:50 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60986 DF PROTO=TCP SPT=50342 DPT=9100 SEQ=1861691952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8BFFB0000000001030307) 
Nov 28 09:39:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:39:50.819 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:39:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:39:50.820 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:39:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:39:50.820 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:39:50 np0005538515.localdomain sudo[248454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyaoschcrwbabdyejihvlaytfynpwbja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322790.59155-3491-9155885943784/AnsiballZ_file.py
Nov 28 09:39:50 np0005538515.localdomain sudo[248454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.091 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.091 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.091 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.091 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.091 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:39:51 np0005538515.localdomain python3.9[248456]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:51 np0005538515.localdomain sudo[248454]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.488 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:39:51 np0005538515.localdomain sudo[248586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbldjlqumqzedkqexwrgksgzpiunvaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322791.295902-3515-183875868924144/AnsiballZ_command.py
Nov 28 09:39:51 np0005538515.localdomain sudo[248586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.676 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.678 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=13086MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.679 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.679 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.752 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.753 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:39:51 np0005538515.localdomain python3.9[248588]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:51.781 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:39:51 np0005538515.localdomain sudo[248586]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:52.244 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:39:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:52.250 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:39:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:52.263 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:39:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:52.265 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:39:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:52.265 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:39:52 np0005538515.localdomain sudo[248721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkalddlwhijrawmskxytmgumfdxvezfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322792.0260432-3539-209379988680873/AnsiballZ_blockinfile.py
Nov 28 09:39:52 np0005538515.localdomain sudo[248721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:52 np0005538515.localdomain python3.9[248723]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:52 np0005538515.localdomain sudo[248721]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.262 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.263 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.264 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.264 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.364 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.365 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.365 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.366 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.366 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:39:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:39:53.366 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:39:53 np0005538515.localdomain sudo[248831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bziitnmofpqqewugetsjwjazolhpahjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322793.028844-3566-15743896475767/AnsiballZ_command.py
Nov 28 09:39:53 np0005538515.localdomain sudo[248831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:53 np0005538515.localdomain python3.9[248833]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:53 np0005538515.localdomain sudo[248831]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:54 np0005538515.localdomain sudo[248942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mozlnixxhxznjbwvmqtfcncesgsboopi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322793.9630759-3590-51530446087708/AnsiballZ_stat.py
Nov 28 09:39:54 np0005538515.localdomain sudo[248942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:54 np0005538515.localdomain python3.9[248944]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:39:54 np0005538515.localdomain sudo[248942]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:54 np0005538515.localdomain sudo[249054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iidagiiqjekmyzljlcnrmezhzkrnyzdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322794.6816847-3615-66901945062240/AnsiballZ_command.py
Nov 28 09:39:54 np0005538515.localdomain sudo[249054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:55 np0005538515.localdomain python3.9[249056]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:39:55 np0005538515.localdomain sudo[249054]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:55 np0005538515.localdomain sudo[249167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmojghgturbmhgxqescbezqpalnraezo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322795.4132977-3638-267923029146709/AnsiballZ_file.py
Nov 28 09:39:55 np0005538515.localdomain sudo[249167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:39:55 np0005538515.localdomain python3.9[249169]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:39:55 np0005538515.localdomain sudo[249167]: pam_unix(sudo:session): session closed for user root
Nov 28 09:39:56 np0005538515.localdomain sshd[242218]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:39:56 np0005538515.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Nov 28 09:39:56 np0005538515.localdomain systemd[1]: session-56.scope: Consumed 29.561s CPU time.
Nov 28 09:39:56 np0005538515.localdomain systemd-logind[763]: Session 56 logged out. Waiting for processes to exit.
Nov 28 09:39:56 np0005538515.localdomain systemd-logind[763]: Removed session 56.
Nov 28 09:39:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:39:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:39:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:39:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:39:56 np0005538515.localdomain podman[249187]: 2025-11-28 09:39:56.983216886 +0000 UTC m=+0.084970819 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:39:57 np0005538515.localdomain podman[249187]: 2025-11-28 09:39:57.020997735 +0000 UTC m=+0.122751718 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:39:57 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:39:57 np0005538515.localdomain podman[249188]: 2025-11-28 09:39:57.041319023 +0000 UTC m=+0.143320094 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:39:57 np0005538515.localdomain podman[249188]: 2025-11-28 09:39:57.116343894 +0000 UTC m=+0.218344955 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:39:57 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:39:57 np0005538515.localdomain podman[249189]: 2025-11-28 09:39:57.136203648 +0000 UTC m=+0.231477361 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:39:57 np0005538515.localdomain podman[249189]: 2025-11-28 09:39:57.170453437 +0000 UTC m=+0.265727100 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:39:57 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:39:57 np0005538515.localdomain podman[249190]: 2025-11-28 09:39:57.236930123 +0000 UTC m=+0.329774611 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:39:57 np0005538515.localdomain podman[249190]: 2025-11-28 09:39:57.248381127 +0000 UTC m=+0.341225635 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:39:57 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:39:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:39:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:39:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:39:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:39:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:39:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:39:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:39:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:39:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:39:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142078 "" "Go-http-client/1.1"
Nov 28 09:39:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:39:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15858 "" "Go-http-client/1.1"
Nov 28 09:39:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:39:59 np0005538515.localdomain podman[249270]: 2025-11-28 09:39:59.975494793 +0000 UTC m=+0.082381249 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:39:59 np0005538515.localdomain podman[249270]: 2025-11-28 09:39:59.987435193 +0000 UTC m=+0.094321639 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:40:00 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:40:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:40:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52049 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8EC990000000001030307) 
Nov 28 09:40:02 np0005538515.localdomain sshd[249295]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:40:02 np0005538515.localdomain sshd[249295]: Accepted publickey for zuul from 192.168.122.30 port 46144 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:40:02 np0005538515.localdomain systemd-logind[763]: New session 57 of user zuul.
Nov 28 09:40:02 np0005538515.localdomain systemd[1]: Started Session 57 of User zuul.
Nov 28 09:40:02 np0005538515.localdomain sshd[249295]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:40:03 np0005538515.localdomain sudo[249406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmsqhsagomvsevswtxhkiazkbqlzjefw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322802.3703325-29-146385485013533/AnsiballZ_file.py
Nov 28 09:40:03 np0005538515.localdomain sudo[249406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52050 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8F0BA0000000001030307) 
Nov 28 09:40:03 np0005538515.localdomain python3.9[249408]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:03 np0005538515.localdomain sudo[249406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34117 DF PROTO=TCP SPT=41640 DPT=9102 SEQ=2329822090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8F2FB0000000001030307) 
Nov 28 09:40:03 np0005538515.localdomain sudo[249516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obnownikakbmzlbqofowbmvjeggmgolk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322803.4730048-29-17084999284637/AnsiballZ_file.py
Nov 28 09:40:03 np0005538515.localdomain sudo[249516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:03 np0005538515.localdomain python3.9[249518]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:03 np0005538515.localdomain sudo[249516]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:04 np0005538515.localdomain sudo[249626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugaxbyijqxjcppdwgunecczzargrghuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322804.0487697-29-178811432440464/AnsiballZ_file.py
Nov 28 09:40:04 np0005538515.localdomain sudo[249626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:04 np0005538515.localdomain python3.9[249628]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:04 np0005538515.localdomain sudo[249626]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52051 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8F8BB0000000001030307) 
Nov 28 09:40:05 np0005538515.localdomain python3.9[249736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:40:05 np0005538515.localdomain podman[249823]: 2025-11-28 09:40:05.986754956 +0000 UTC m=+0.085290509 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:40:06 np0005538515.localdomain podman[249823]: 2025-11-28 09:40:06.001463851 +0000 UTC m=+0.099999404 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:40:06 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:40:06 np0005538515.localdomain python3.9[249822]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322804.8721523-108-90618899717848/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21115 DF PROTO=TCP SPT=49268 DPT=9102 SEQ=200872579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD8FCFA0000000001030307) 
Nov 28 09:40:06 np0005538515.localdomain python3.9[249951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:07 np0005538515.localdomain python3.9[250037]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322806.288129-151-39710718413661/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:07 np0005538515.localdomain python3.9[250145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:08 np0005538515.localdomain python3.9[250231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322807.3995962-151-242039375155285/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:08 np0005538515.localdomain python3.9[250339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52052 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9087B0000000001030307) 
Nov 28 09:40:09 np0005538515.localdomain python3.9[250425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322808.4982908-151-218722788188074/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=c36350e91748d315f1b4c16328f465554d535cdf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:11 np0005538515.localdomain python3.9[250533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:11 np0005538515.localdomain python3.9[250619]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322810.4541323-326-157432830409343/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=ecea6b6701c9be6f1d83be82edd3c16fe40b7bb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:12 np0005538515.localdomain python3.9[250727]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:40:13 np0005538515.localdomain sudo[250747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:40:13 np0005538515.localdomain sudo[250747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:40:13 np0005538515.localdomain sudo[250747]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:13 np0005538515.localdomain sudo[250781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:40:13 np0005538515.localdomain sudo[250781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:40:13 np0005538515.localdomain sudo[250873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvxdkqlmwnslrjmymuunpbpfdswuzmkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322813.2545693-398-284286550324/AnsiballZ_file.py
Nov 28 09:40:13 np0005538515.localdomain sudo[250873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:13 np0005538515.localdomain python3.9[250875]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:13 np0005538515.localdomain sudo[250873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:13 np0005538515.localdomain sudo[250781]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538515.localdomain sudo[251015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acpadljvaqthzmecmogozavmidsaluwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322813.9485364-421-29838192151895/AnsiballZ_stat.py
Nov 28 09:40:14 np0005538515.localdomain sudo[251015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:14 np0005538515.localdomain python3.9[251017]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:14 np0005538515.localdomain sudo[251015]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538515.localdomain sudo[251072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxjvbbduuafnpecetsvkfokslwvhjnbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322813.9485364-421-29838192151895/AnsiballZ_file.py
Nov 28 09:40:14 np0005538515.localdomain sudo[251072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:14 np0005538515.localdomain sudo[251074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:40:14 np0005538515.localdomain sudo[251074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:40:14 np0005538515.localdomain sudo[251074]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:14 np0005538515.localdomain python3.9[251088]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:14 np0005538515.localdomain sudo[251072]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:15 np0005538515.localdomain sudo[251200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cklvhssiznigyfhjhdzuiidcihplyvgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322815.547262-421-269296362329183/AnsiballZ_stat.py
Nov 28 09:40:15 np0005538515.localdomain sudo[251200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:15 np0005538515.localdomain python3.9[251202]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:16 np0005538515.localdomain sudo[251200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:16 np0005538515.localdomain sudo[251257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsmeqgunodvltpnabrbivhptasofjmxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322815.547262-421-269296362329183/AnsiballZ_file.py
Nov 28 09:40:16 np0005538515.localdomain sudo[251257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:16 np0005538515.localdomain python3.9[251259]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:16 np0005538515.localdomain sudo[251257]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:17 np0005538515.localdomain sudo[251367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nacsleagcnzkjkuwlomualekzckzfcmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322816.804192-490-190324153314856/AnsiballZ_file.py
Nov 28 09:40:17 np0005538515.localdomain sudo[251367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:17 np0005538515.localdomain python3.9[251369]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:17 np0005538515.localdomain sudo[251367]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52053 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD928FA0000000001030307) 
Nov 28 09:40:17 np0005538515.localdomain sudo[251477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klcocbxpbbbjevefrftubzwalpolrsgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322817.4708123-514-177078453002800/AnsiballZ_stat.py
Nov 28 09:40:17 np0005538515.localdomain sudo[251477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:40:17 np0005538515.localdomain podman[251480]: 2025-11-28 09:40:17.870544632 +0000 UTC m=+0.086229527 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:40:17 np0005538515.localdomain podman[251480]: 2025-11-28 09:40:17.883461952 +0000 UTC m=+0.099146817 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:40:17 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:40:17 np0005538515.localdomain python3.9[251479]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:18 np0005538515.localdomain sudo[251477]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:18 np0005538515.localdomain sudo[251554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quyenblgoeppfqtzomuaazojjichujxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322817.4708123-514-177078453002800/AnsiballZ_file.py
Nov 28 09:40:18 np0005538515.localdomain sudo[251554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:18 np0005538515.localdomain python3.9[251556]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:18 np0005538515.localdomain sudo[251554]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:19 np0005538515.localdomain sudo[251664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eftwtroqnkfoxarwkpprezhgnaxzrqyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322818.7398772-550-206736335449938/AnsiballZ_stat.py
Nov 28 09:40:19 np0005538515.localdomain sudo[251664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:19 np0005538515.localdomain python3.9[251666]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:19 np0005538515.localdomain sudo[251664]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:19 np0005538515.localdomain sudo[251721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llayrezxekbexjhfccbosupsjdtznvbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322818.7398772-550-206736335449938/AnsiballZ_file.py
Nov 28 09:40:19 np0005538515.localdomain sudo[251721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:19 np0005538515.localdomain python3.9[251723]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:19 np0005538515.localdomain sudo[251721]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:20 np0005538515.localdomain sudo[251831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkteyxycisnpuckozxqnhnecvjzhdpfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322819.8878853-586-277996427403621/AnsiballZ_systemd.py
Nov 28 09:40:20 np0005538515.localdomain sudo[251831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:20 np0005538515.localdomain python3.9[251833]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:40:20 np0005538515.localdomain systemd-sysv-generator[251860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:20 np0005538515.localdomain systemd-rc-local-generator[251855]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:20 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:21 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:21 np0005538515.localdomain sudo[251831]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:21 np0005538515.localdomain sudo[251978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufvyndolmippvisncdmxybxmxlpfsuxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322821.421982-610-245688606663642/AnsiballZ_stat.py
Nov 28 09:40:21 np0005538515.localdomain sudo[251978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:21 np0005538515.localdomain python3.9[251980]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:21 np0005538515.localdomain sudo[251978]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:22 np0005538515.localdomain sudo[252035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caxrkhcrgvtqdqzsrptkkvhvjsmhstvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322821.421982-610-245688606663642/AnsiballZ_file.py
Nov 28 09:40:22 np0005538515.localdomain sudo[252035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:22 np0005538515.localdomain python3.9[252037]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:22 np0005538515.localdomain sudo[252035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:22 np0005538515.localdomain sudo[252145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kujduazdhyymysjmjbiaaunzfsuvztsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322822.6737113-646-127504318810505/AnsiballZ_stat.py
Nov 28 09:40:22 np0005538515.localdomain sudo[252145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:23 np0005538515.localdomain python3.9[252147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:23 np0005538515.localdomain sudo[252145]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:23 np0005538515.localdomain sudo[252202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfjquanqazhzawoqobrqfoqbkpctxwfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322822.6737113-646-127504318810505/AnsiballZ_file.py
Nov 28 09:40:23 np0005538515.localdomain sudo[252202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:23 np0005538515.localdomain python3.9[252204]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:23 np0005538515.localdomain sudo[252202]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:24 np0005538515.localdomain sudo[252312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjvntkzfcyczpoasboksigpehlzpzcmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322823.8187454-682-1781113029919/AnsiballZ_systemd.py
Nov 28 09:40:24 np0005538515.localdomain sudo[252312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:24 np0005538515.localdomain python3.9[252314]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:40:24 np0005538515.localdomain systemd-rc-local-generator[252338]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:24 np0005538515.localdomain systemd-sysv-generator[252346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:24 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:40:25 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:40:25 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:40:25 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:40:25 np0005538515.localdomain sudo[252312]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:26 np0005538515.localdomain sudo[252467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvceiszahaknnkobmfxkhseqcwjobqlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322826.5270119-712-80182666484796/AnsiballZ_file.py
Nov 28 09:40:26 np0005538515.localdomain sudo[252467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:27 np0005538515.localdomain python3.9[252469]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:40:27 np0005538515.localdomain sudo[252467]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: tmp-crun.Dn2yaJ.mount: Deactivated successfully.
Nov 28 09:40:27 np0005538515.localdomain podman[252470]: 2025-11-28 09:40:27.177806659 +0000 UTC m=+0.100589712 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:40:27 np0005538515.localdomain podman[252470]: 2025-11-28 09:40:27.192599256 +0000 UTC m=+0.115382409 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: tmp-crun.osyiO3.mount: Deactivated successfully.
Nov 28 09:40:27 np0005538515.localdomain podman[252506]: 2025-11-28 09:40:27.334829936 +0000 UTC m=+0.136576866 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_id=ovn_controller)
Nov 28 09:40:27 np0005538515.localdomain podman[252507]: 2025-11-28 09:40:27.290273818 +0000 UTC m=+0.085283010 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 09:40:27 np0005538515.localdomain podman[252557]: 2025-11-28 09:40:27.406610915 +0000 UTC m=+0.116017949 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:40:27 np0005538515.localdomain podman[252506]: 2025-11-28 09:40:27.411351882 +0000 UTC m=+0.213098762 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:40:27 np0005538515.localdomain podman[252507]: 2025-11-28 09:40:27.430389871 +0000 UTC m=+0.225399023 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:40:27 np0005538515.localdomain podman[252557]: 2025-11-28 09:40:27.443726254 +0000 UTC m=+0.153133288 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:40:27 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:40:27 np0005538515.localdomain sudo[252662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkgolydwebhicftqeywhayzylwvqyfgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322827.2602458-736-269014940561838/AnsiballZ_stat.py
Nov 28 09:40:27 np0005538515.localdomain sudo[252662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:40:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:40:27 np0005538515.localdomain python3.9[252664]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:40:27 np0005538515.localdomain sudo[252662]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:28 np0005538515.localdomain sudo[252754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cklcnmxhzcuyvrwntwkhgmxddxcdzxgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322827.2602458-736-269014940561838/AnsiballZ_copy.py
Nov 28 09:40:28 np0005538515.localdomain sudo[252754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:28 np0005538515.localdomain python3.9[252756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322827.2602458-736-269014940561838/.source.json _original_basename=.4ym49pcu follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:28 np0005538515.localdomain sudo[252754]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:28 np0005538515.localdomain sudo[252864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpqxyhzhaiemijusucsadnzzfxriwadc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322828.4606462-782-128824876222644/AnsiballZ_file.py
Nov 28 09:40:28 np0005538515.localdomain sudo[252864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:40:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:40:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:40:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142078 "" "Go-http-client/1.1"
Nov 28 09:40:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:40:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15864 "" "Go-http-client/1.1"
Nov 28 09:40:28 np0005538515.localdomain python3.9[252866]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:28 np0005538515.localdomain sudo[252864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:29 np0005538515.localdomain sudo[252974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxjzmsmvonilttdjxutagdmxkxmqjqit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322829.1872125-805-274291280379815/AnsiballZ_stat.py
Nov 28 09:40:29 np0005538515.localdomain sudo[252974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:29 np0005538515.localdomain sudo[252974]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:30 np0005538515.localdomain sudo[253062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnlvlhwbxfuesdgkslsynefbruevpxah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322829.1872125-805-274291280379815/AnsiballZ_copy.py
Nov 28 09:40:30 np0005538515.localdomain sudo[253062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:40:30 np0005538515.localdomain systemd[1]: tmp-crun.HW7KSS.mount: Deactivated successfully.
Nov 28 09:40:30 np0005538515.localdomain podman[253065]: 2025-11-28 09:40:30.196083682 +0000 UTC m=+0.097382644 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:40:30 np0005538515.localdomain podman[253065]: 2025-11-28 09:40:30.234961164 +0000 UTC m=+0.136260086 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:40:30 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:40:30 np0005538515.localdomain sudo[253062]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:31 np0005538515.localdomain sudo[253194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbnenzhmbldchdclaxesaqybvlxsexlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322830.7053084-856-185023077834412/AnsiballZ_container_config_data.py
Nov 28 09:40:31 np0005538515.localdomain sudo[253194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:31 np0005538515.localdomain python3.9[253196]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Nov 28 09:40:31 np0005538515.localdomain sudo[253194]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:32 np0005538515.localdomain sudo[253304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snwrfnrmwscjqzxvpbhveybcwbrpbsgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322831.5820842-883-252667343417443/AnsiballZ_container_config_hash.py
Nov 28 09:40:32 np0005538515.localdomain sudo[253304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52364 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD961C90000000001030307) 
Nov 28 09:40:32 np0005538515.localdomain python3.9[253306]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:40:32 np0005538515.localdomain sudo[253304]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:32 np0005538515.localdomain sudo[253414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybgneldmesqaopmcjamesuaxclffdsqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322832.5368435-910-188898012148031/AnsiballZ_podman_container_info.py
Nov 28 09:40:32 np0005538515.localdomain sudo[253414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52365 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD965BA0000000001030307) 
Nov 28 09:40:33 np0005538515.localdomain python3.9[253416]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:40:33 np0005538515.localdomain sudo[253414]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52054 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD968FA0000000001030307) 
Nov 28 09:40:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52366 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD96DBB0000000001030307) 
Nov 28 09:40:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34118 DF PROTO=TCP SPT=41640 DPT=9102 SEQ=2329822090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD970FB0000000001030307) 
Nov 28 09:40:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:40:36 np0005538515.localdomain podman[253498]: 2025-11-28 09:40:36.975385454 +0000 UTC m=+0.082892462 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:40:36 np0005538515.localdomain podman[253498]: 2025-11-28 09:40:36.99104166 +0000 UTC m=+0.098548688 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 09:40:37 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:40:38 np0005538515.localdomain sudo[253570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjyzzrzkpujscpqargblolqzdvpymrsu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322836.7747805-949-248872488528470/AnsiballZ_edpm_container_manage.py
Nov 28 09:40:38 np0005538515.localdomain sudo[253570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:38 np0005538515.localdomain python3[253572]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:40:38 np0005538515.localdomain podman[253609]: 
Nov 28 09:40:38 np0005538515.localdomain podman[253609]: 2025-11-28 09:40:38.808823973 +0000 UTC m=+0.084799823 container create 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=neutron_sriov_agent, managed_by=edpm_ansible, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:40:38 np0005538515.localdomain podman[253609]: 2025-11-28 09:40:38.767415128 +0000 UTC m=+0.043391048 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 09:40:38 np0005538515.localdomain python3[253572]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 09:40:38 np0005538515.localdomain sudo[253570]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52367 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD97D7A0000000001030307) 
Nov 28 09:40:39 np0005538515.localdomain sudo[253753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiqofvryvmzuzxkhluhzswtvwuckgoyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322839.1989024-974-124529086458789/AnsiballZ_stat.py
Nov 28 09:40:39 np0005538515.localdomain sudo[253753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:39 np0005538515.localdomain python3.9[253755]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:40:39 np0005538515.localdomain sudo[253753]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:40 np0005538515.localdomain sudo[253865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsfeprqvxfkbebizxofhsabjwhkayynu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.0034294-1000-265163989438957/AnsiballZ_file.py
Nov 28 09:40:40 np0005538515.localdomain sudo[253865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:40 np0005538515.localdomain python3.9[253867]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:40 np0005538515.localdomain sudo[253865]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:40 np0005538515.localdomain rsyslogd[758]: imjournal: 1699 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 28 09:40:40 np0005538515.localdomain sudo[253920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmbjuvspbfcpjiiowqojiniftmldcris ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.0034294-1000-265163989438957/AnsiballZ_stat.py
Nov 28 09:40:40 np0005538515.localdomain sudo[253920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:40 np0005538515.localdomain python3.9[253922]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:40:40 np0005538515.localdomain sudo[253920]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:41 np0005538515.localdomain sudo[254029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awidbfgvvrbvwrdqlfsonacgmykwtnjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.9876287-1000-174850182987501/AnsiballZ_copy.py
Nov 28 09:40:41 np0005538515.localdomain sudo[254029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:41 np0005538515.localdomain python3.9[254031]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322840.9876287-1000-174850182987501/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:40:41 np0005538515.localdomain sudo[254029]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:41 np0005538515.localdomain sudo[254084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkyqjzscirxtfdecwswhtobtwkwpsdrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.9876287-1000-174850182987501/AnsiballZ_systemd.py
Nov 28 09:40:41 np0005538515.localdomain sudo[254084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:42 np0005538515.localdomain python3.9[254086]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:40:42 np0005538515.localdomain systemd-rc-local-generator[254112]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:42 np0005538515.localdomain systemd-sysv-generator[254117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:42 np0005538515.localdomain sudo[254084]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:42 np0005538515.localdomain sudo[254176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxzzcqyczvdzoxsacufrdlfxcbaaswaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322840.9876287-1000-174850182987501/AnsiballZ_systemd.py
Nov 28 09:40:42 np0005538515.localdomain sudo[254176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:43 np0005538515.localdomain python3.9[254178]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:40:43 np0005538515.localdomain systemd-sysv-generator[254205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:40:43 np0005538515.localdomain systemd-rc-local-generator[254202]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: Starting neutron_sriov_agent container...
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:40:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a087f5889c3563491fc0fd2134c28d5a04378c317280587469fbfdbb4f54b43/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:43 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a087f5889c3563491fc0fd2134c28d5a04378c317280587469fbfdbb4f54b43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:43 np0005538515.localdomain podman[254219]: 2025-11-28 09:40:43.692419965 +0000 UTC m=+0.105442903 container init 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_sriov_agent)
Nov 28 09:40:43 np0005538515.localdomain podman[254219]: 2025-11-28 09:40:43.700739373 +0000 UTC m=+0.113762311 container start 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 09:40:43 np0005538515.localdomain podman[254219]: neutron_sriov_agent
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + sudo -E kolla_set_configs
Nov 28 09:40:43 np0005538515.localdomain systemd[1]: Started neutron_sriov_agent container.
Nov 28 09:40:43 np0005538515.localdomain sudo[254176]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Validating config file
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Copying service configuration files
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Writing out command to execute
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: ++ cat /run_command
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + ARGS=
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + sudo kolla_copy_cacerts
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + [[ ! -n '' ]]
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + . kolla_extend_start
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + umask 0022
Nov 28 09:40:43 np0005538515.localdomain neutron_sriov_agent[254234]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:44 np0005538515.localdomain sudo[254356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txrbzjlsrkpjtyxyyqrkbwtykymlhfdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322844.185531-1085-27968629003359/AnsiballZ_systemd.py
Nov 28 09:40:44 np0005538515.localdomain sudo[254356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:44 np0005538515.localdomain python3.9[254358]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:40:44 np0005538515.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Nov 28 09:40:44 np0005538515.localdomain systemd[1]: tmp-crun.AsCkFe.mount: Deactivated successfully.
Nov 28 09:40:44 np0005538515.localdomain systemd[1]: libpod-679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c.scope: Deactivated successfully.
Nov 28 09:40:44 np0005538515.localdomain systemd[1]: libpod-679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c.scope: Consumed 1.217s CPU time.
Nov 28 09:40:44 np0005538515.localdomain podman[254362]: 2025-11-28 09:40:44.93014404 +0000 UTC m=+0.098597081 container died 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent)
Nov 28 09:40:45 np0005538515.localdomain podman[254362]: 2025-11-28 09:40:45.023543148 +0000 UTC m=+0.191996119 container cleanup 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 09:40:45 np0005538515.localdomain podman[254362]: neutron_sriov_agent
Nov 28 09:40:45 np0005538515.localdomain podman[254376]: 2025-11-28 09:40:45.026549561 +0000 UTC m=+0.095977919 container cleanup 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=neutron_sriov_agent, org.label-schema.vendor=CentOS)
Nov 28 09:40:45 np0005538515.localdomain podman[254387]: 2025-11-28 09:40:45.107316237 +0000 UTC m=+0.055817443 container cleanup 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 09:40:45 np0005538515.localdomain podman[254387]: neutron_sriov_agent
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: Starting neutron_sriov_agent container...
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:40:45 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a087f5889c3563491fc0fd2134c28d5a04378c317280587469fbfdbb4f54b43/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:45 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a087f5889c3563491fc0fd2134c28d5a04378c317280587469fbfdbb4f54b43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:40:45 np0005538515.localdomain podman[254400]: 2025-11-28 09:40:45.253633697 +0000 UTC m=+0.117499907 container init 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=neutron_sriov_agent)
Nov 28 09:40:45 np0005538515.localdomain podman[254400]: 2025-11-28 09:40:45.261559673 +0000 UTC m=+0.125425883 container start 679926e9f9d19f972216ef7c650e8482dfdd25fbca9daa0ab447b1127efd4b9c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b3fa7b09cb2ae0d1c8c04f9c87a2894b2bcc39a986fca3511978ef5621fe5639'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent)
Nov 28 09:40:45 np0005538515.localdomain podman[254400]: neutron_sriov_agent
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + sudo -E kolla_set_configs
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: Started neutron_sriov_agent container.
Nov 28 09:40:45 np0005538515.localdomain sudo[254356]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Validating config file
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Copying service configuration files
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Writing out command to execute
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: ++ cat /run_command
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + ARGS=
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + sudo kolla_copy_cacerts
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + [[ ! -n '' ]]
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + . kolla_extend_start
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + umask 0022
Nov 28 09:40:45 np0005538515.localdomain neutron_sriov_agent[254415]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 28 09:40:45 np0005538515.localdomain sshd[249295]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Nov 28 09:40:45 np0005538515.localdomain systemd[1]: session-57.scope: Consumed 22.770s CPU time.
Nov 28 09:40:45 np0005538515.localdomain systemd-logind[763]: Session 57 logged out. Waiting for processes to exit.
Nov 28 09:40:45 np0005538515.localdomain systemd-logind[763]: Removed session 57.
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.873 2 INFO neutron.common.config [-] Logging enabled!
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.873 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.874 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.874 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.874 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.874 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.874 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005538515.localdomain'}
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.874 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f57fe1c5-075a-474f-89f1-63a8be551758 - - - - - -] RPC agent_id: nic-switch-agent.np0005538515.localdomain
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.879 2 INFO neutron.agent.agent_extensions_manager [None req-f57fe1c5-075a-474f-89f1-63a8be551758 - - - - - -] Loaded agent extensions: ['qos']
Nov 28 09:40:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:46.879 2 INFO neutron.agent.agent_extensions_manager [None req-f57fe1c5-075a-474f-89f1-63a8be551758 - - - - - -] Initializing agent extension 'qos'
Nov 28 09:40:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52368 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD99CFB0000000001030307) 
Nov 28 09:40:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:47.277 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f57fe1c5-075a-474f-89f1-63a8be551758 - - - - - -] Agent initialized successfully, now running... 
Nov 28 09:40:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:47.277 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f57fe1c5-075a-474f-89f1-63a8be551758 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Nov 28 09:40:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 09:40:47.278 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-f57fe1c5-075a-474f-89f1-63a8be551758 - - - - - -] Agent out of sync with plugin!
Nov 28 09:40:48 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:48.172 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:40:48 np0005538515.localdomain podman[254448]: 2025-11-28 09:40:48.980325331 +0000 UTC m=+0.091950602 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 28 09:40:48 np0005538515.localdomain podman[254448]: 2025-11-28 09:40:48.996479333 +0000 UTC m=+0.108104594 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal)
Nov 28 09:40:49 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:40:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:49.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:40:50.821 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:40:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:40:50.821 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:40:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:40:50.821 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.070 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.094 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.095 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.095 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.095 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.096 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:40:52 np0005538515.localdomain sshd[254488]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:40:52 np0005538515.localdomain sshd[254488]: Accepted publickey for zuul from 192.168.122.30 port 48350 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:40:52 np0005538515.localdomain systemd-logind[763]: New session 58 of user zuul.
Nov 28 09:40:52 np0005538515.localdomain systemd[1]: Started Session 58 of User zuul.
Nov 28 09:40:52 np0005538515.localdomain sshd[254488]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.524 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.751 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.754 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12962MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.754 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.755 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.833 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.834 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:40:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:52.863 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:40:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:53.336 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:40:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:53.342 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:40:53 np0005538515.localdomain python3.9[254621]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:40:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:53.359 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:40:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:53.361 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:40:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:53.361 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.362 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.363 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.363 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.382 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.382 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.383 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.383 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.384 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:54.384 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:40:54 np0005538515.localdomain sudo[254735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtgldsfkbvqxdjkfbxwegrubrugrmonl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322854.1981637-67-126949185070276/AnsiballZ_setup.py
Nov 28 09:40:54 np0005538515.localdomain sudo[254735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:54 np0005538515.localdomain python3.9[254737]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:40:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:40:55.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:40:55 np0005538515.localdomain sudo[254735]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:55 np0005538515.localdomain sudo[254798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugjycrcgegpjoxcmejgzgxdjqfluhjvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322854.1981637-67-126949185070276/AnsiballZ_dnf.py
Nov 28 09:40:55 np0005538515.localdomain sudo[254798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:40:55 np0005538515.localdomain python3.9[254800]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:40:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:40:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:40:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:40:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:40:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:40:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:40:57 np0005538515.localdomain podman[254803]: 2025-11-28 09:40:57.98676701 +0000 UTC m=+0.085914226 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:40:57 np0005538515.localdomain podman[254803]: 2025-11-28 09:40:57.995364036 +0000 UTC m=+0.094511302 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:40:58 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:40:58 np0005538515.localdomain podman[254806]: 2025-11-28 09:40:58.089324102 +0000 UTC m=+0.184961750 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:40:58 np0005538515.localdomain podman[254806]: 2025-11-28 09:40:58.097363512 +0000 UTC m=+0.193001120 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:40:58 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:40:58 np0005538515.localdomain podman[254804]: 2025-11-28 09:40:58.20009915 +0000 UTC m=+0.300354151 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:40:58 np0005538515.localdomain podman[254805]: 2025-11-28 09:40:58.227538541 +0000 UTC m=+0.328963718 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 09:40:58 np0005538515.localdomain podman[254804]: 2025-11-28 09:40:58.238724118 +0000 UTC m=+0.338979149 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 09:40:58 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:40:58 np0005538515.localdomain podman[254805]: 2025-11-28 09:40:58.258881854 +0000 UTC m=+0.360307011 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:40:58 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:40:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:40:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:40:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:40:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144036 "" "Go-http-client/1.1"
Nov 28 09:40:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:40:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16299 "" "Go-http-client/1.1"
Nov 28 09:40:58 np0005538515.localdomain sudo[254798]: pam_unix(sudo:session): session closed for user root
Nov 28 09:40:58 np0005538515.localdomain systemd[1]: tmp-crun.p2VTl6.mount: Deactivated successfully.
Nov 28 09:40:59 np0005538515.localdomain sudo[254993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twfyhqdbqdpxdydpxmhohofwdljyxtfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322859.1589644-103-258151080931593/AnsiballZ_systemd.py
Nov 28 09:40:59 np0005538515.localdomain sudo[254993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:00 np0005538515.localdomain python3.9[254995]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 09:41:00 np0005538515.localdomain sudo[254993]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:00 np0005538515.localdomain sudo[255106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpwrycqdtuugdawucggdujmcwqriodpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322860.3169591-130-73607515656604/AnsiballZ_file.py
Nov 28 09:41:00 np0005538515.localdomain sudo[255106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:41:00 np0005538515.localdomain systemd[1]: tmp-crun.EGcRhz.mount: Deactivated successfully.
Nov 28 09:41:00 np0005538515.localdomain podman[255109]: 2025-11-28 09:41:00.820494018 +0000 UTC m=+0.055544105 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:41:00 np0005538515.localdomain podman[255109]: 2025-11-28 09:41:00.854489762 +0000 UTC m=+0.089539909 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:41:00 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:41:00 np0005538515.localdomain python3.9[255108]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:00 np0005538515.localdomain sudo[255106]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:01 np0005538515.localdomain sudo[255239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozlimaoiljdklwqrxvyrfzoqdeeyzeyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322861.1122158-130-152341140385039/AnsiballZ_file.py
Nov 28 09:41:01 np0005538515.localdomain sudo[255239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:01 np0005538515.localdomain python3.9[255241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:01 np0005538515.localdomain sudo[255239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:02 np0005538515.localdomain sudo[255349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjcbknkpeuusbxifpuhmmxhbdozeyafn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322861.741563-130-7503949637727/AnsiballZ_file.py
Nov 28 09:41:02 np0005538515.localdomain sudo[255349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61060 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9D6F90000000001030307) 
Nov 28 09:41:02 np0005538515.localdomain python3.9[255351]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:02 np0005538515.localdomain sudo[255349]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:02 np0005538515.localdomain sudo[255459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzoblmixichdcxzqfkzvwutxzafqgudw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322862.3686957-130-267680069238547/AnsiballZ_file.py
Nov 28 09:41:02 np0005538515.localdomain sudo[255459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:02 np0005538515.localdomain python3.9[255461]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:02 np0005538515.localdomain sudo[255459]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61061 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9DAFB0000000001030307) 
Nov 28 09:41:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52369 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9DCFA0000000001030307) 
Nov 28 09:41:04 np0005538515.localdomain sudo[255569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uthsngppjqsqvcsjadkfugwalleaekcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322864.1055706-130-164617165555749/AnsiballZ_file.py
Nov 28 09:41:04 np0005538515.localdomain sudo[255569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:04 np0005538515.localdomain python3.9[255571]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:04 np0005538515.localdomain sudo[255569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:04 np0005538515.localdomain sudo[255679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpnfrquypooogkfpxowhueerfzjqlpgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322864.6873248-130-28516020918401/AnsiballZ_file.py
Nov 28 09:41:04 np0005538515.localdomain sudo[255679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:05 np0005538515.localdomain python3.9[255681]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61062 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9E2FA0000000001030307) 
Nov 28 09:41:05 np0005538515.localdomain sudo[255679]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:06 np0005538515.localdomain sudo[255789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nglnzpyezinxcutzgknnlealctmzkfal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322865.2532778-130-109855241853601/AnsiballZ_file.py
Nov 28 09:41:06 np0005538515.localdomain sudo[255789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52055 DF PROTO=TCP SPT=39936 DPT=9102 SEQ=2080535726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9E6FA0000000001030307) 
Nov 28 09:41:06 np0005538515.localdomain python3.9[255791]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:06 np0005538515.localdomain sudo[255789]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:06 np0005538515.localdomain sudo[255899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzhtkmofdmfcrbwhvjzkqazmnwltbwgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322866.5391157-280-238172815887883/AnsiballZ_stat.py
Nov 28 09:41:06 np0005538515.localdomain sudo[255899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:07 np0005538515.localdomain python3.9[255901]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:07 np0005538515.localdomain sudo[255899]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:07 np0005538515.localdomain sudo[255987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ajqbrczxczgoxmovhihyswckykmkajyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322866.5391157-280-238172815887883/AnsiballZ_copy.py
Nov 28 09:41:07 np0005538515.localdomain sudo[255987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:41:07 np0005538515.localdomain podman[255989]: 2025-11-28 09:41:07.81822863 +0000 UTC m=+0.086514826 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:41:07 np0005538515.localdomain podman[255989]: 2025-11-28 09:41:07.833522464 +0000 UTC m=+0.101808710 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:41:07 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:41:07 np0005538515.localdomain python3.9[255990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322866.5391157-280-238172815887883/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:07 np0005538515.localdomain sudo[255987]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:08 np0005538515.localdomain python3.9[256117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61063 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AD9F2BA0000000001030307) 
Nov 28 09:41:09 np0005538515.localdomain python3.9[256203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322868.1553445-325-127343057006529/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:09 np0005538515.localdomain python3.9[256311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:10 np0005538515.localdomain python3.9[256397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322869.3451304-325-89253515392770/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:10 np0005538515.localdomain python3.9[256505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:11 np0005538515.localdomain python3.9[256591]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322870.4732502-325-124257756122899/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=5873adc378353c4eccfdbaaec218413c8cf1c0ca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:12 np0005538515.localdomain python3.9[256699]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:13 np0005538515.localdomain python3.9[256785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322872.367248-500-155972275857620/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=ecea6b6701c9be6f1d83be82edd3c16fe40b7bb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:14 np0005538515.localdomain python3.9[256893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:14 np0005538515.localdomain python3.9[256979]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322873.6216002-544-3936382088340/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:14 np0005538515.localdomain sudo[257035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:41:14 np0005538515.localdomain sudo[257035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:41:14 np0005538515.localdomain sudo[257035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:14 np0005538515.localdomain sudo[257072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:41:14 np0005538515.localdomain sudo[257072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:41:15 np0005538515.localdomain python3.9[257123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:15 np0005538515.localdomain sudo[257072]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:15 np0005538515.localdomain python3.9[257229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322874.7189987-544-69477358306292/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:16 np0005538515.localdomain python3.9[257349]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61064 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA12FB0000000001030307) 
Nov 28 09:41:17 np0005538515.localdomain python3.9[257404]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:18 np0005538515.localdomain python3.9[257512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:18 np0005538515.localdomain sudo[257562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:41:18 np0005538515.localdomain sudo[257562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:41:18 np0005538515.localdomain sudo[257562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:18 np0005538515.localdomain python3.9[257616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322877.7349155-632-75949131049161/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:41:20 np0005538515.localdomain podman[257719]: 2025-11-28 09:41:20.066371344 +0000 UTC m=+0.163866335 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 28 09:41:20 np0005538515.localdomain podman[257719]: 2025-11-28 09:41:20.083755093 +0000 UTC m=+0.181250074 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:41:20 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:41:20 np0005538515.localdomain python3.9[257725]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:41:20 np0005538515.localdomain sudo[257854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akihyfuxkslpkkgexkjyhrsymagvogrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322880.4152992-736-232524875104206/AnsiballZ_file.py
Nov 28 09:41:20 np0005538515.localdomain sudo[257854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:20 np0005538515.localdomain python3.9[257856]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:20 np0005538515.localdomain sudo[257854]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:21 np0005538515.localdomain sudo[257964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofnndspvlinninszrfirqezktdkcopkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322881.1176102-760-88170330801550/AnsiballZ_stat.py
Nov 28 09:41:21 np0005538515.localdomain sudo[257964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:21 np0005538515.localdomain python3.9[257966]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:21 np0005538515.localdomain sudo[257964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:21 np0005538515.localdomain sudo[258021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yothgqfdpgbenauzctfqgypuchrgwueg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322881.1176102-760-88170330801550/AnsiballZ_file.py
Nov 28 09:41:21 np0005538515.localdomain sudo[258021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:22 np0005538515.localdomain python3.9[258023]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:22 np0005538515.localdomain sudo[258021]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:22 np0005538515.localdomain sudo[258131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roybbkopaydhvqftjkyihftjoobvtbcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322882.235046-760-206944374152475/AnsiballZ_stat.py
Nov 28 09:41:22 np0005538515.localdomain sudo[258131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:22 np0005538515.localdomain python3.9[258133]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:22 np0005538515.localdomain sudo[258131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:23 np0005538515.localdomain sudo[258188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zblqradxbswpjvhryqjxplwardenluge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322882.235046-760-206944374152475/AnsiballZ_file.py
Nov 28 09:41:23 np0005538515.localdomain sudo[258188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:23 np0005538515.localdomain python3.9[258190]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:23 np0005538515.localdomain sudo[258188]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:23 np0005538515.localdomain sudo[258298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxqyhdsuaknqbtigstwvcohzeziwizyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322883.4840286-829-277299501919241/AnsiballZ_file.py
Nov 28 09:41:23 np0005538515.localdomain sudo[258298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:23 np0005538515.localdomain python3.9[258300]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:23 np0005538515.localdomain sudo[258298]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:24 np0005538515.localdomain sudo[258408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhehlhdtjyukmbcbwdyieteujjqnyagx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322884.2030442-854-247318036073427/AnsiballZ_stat.py
Nov 28 09:41:24 np0005538515.localdomain sudo[258408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:24 np0005538515.localdomain python3.9[258410]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:24 np0005538515.localdomain sudo[258408]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:24 np0005538515.localdomain sudo[258465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lptzklbibberfpctpswglshjzzejsycu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322884.2030442-854-247318036073427/AnsiballZ_file.py
Nov 28 09:41:24 np0005538515.localdomain sudo[258465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:25 np0005538515.localdomain python3.9[258467]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:25 np0005538515.localdomain sudo[258465]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:25 np0005538515.localdomain sudo[258575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imrwyuwonaiyhyhscjkpsnziisucmavo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322885.38834-890-235454888419112/AnsiballZ_stat.py
Nov 28 09:41:25 np0005538515.localdomain sudo[258575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:25 np0005538515.localdomain python3.9[258577]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:25 np0005538515.localdomain sudo[258575]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:26 np0005538515.localdomain sudo[258632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkiesccdedbkrzlcjbtbjkermdtoiina ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322885.38834-890-235454888419112/AnsiballZ_file.py
Nov 28 09:41:26 np0005538515.localdomain sudo[258632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:26 np0005538515.localdomain python3.9[258634]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:26 np0005538515.localdomain sudo[258632]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:26 np0005538515.localdomain sudo[258742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcovhrhnbfdjczjyvpcamlpmentajbey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322886.5389972-926-160633163494443/AnsiballZ_systemd.py
Nov 28 09:41:26 np0005538515.localdomain sudo[258742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:27 np0005538515.localdomain python3.9[258744]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:41:27 np0005538515.localdomain systemd-rc-local-generator[258766]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:27 np0005538515.localdomain systemd-sysv-generator[258775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:27 np0005538515.localdomain sudo[258742]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:41:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:41:28 np0005538515.localdomain sudo[258890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcohwsrycbccadoftrznkhlarnnamwxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322887.7658596-949-235948898581019/AnsiballZ_stat.py
Nov 28 09:41:28 np0005538515.localdomain sudo[258890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:41:28 np0005538515.localdomain podman[258892]: 2025-11-28 09:41:28.132572197 +0000 UTC m=+0.083415609 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:41:28 np0005538515.localdomain podman[258892]: 2025-11-28 09:41:28.147402358 +0000 UTC m=+0.098245780 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:41:28 np0005538515.localdomain podman[258911]: 2025-11-28 09:41:28.2303388 +0000 UTC m=+0.080420905 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:41:28 np0005538515.localdomain python3.9[258893]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:28 np0005538515.localdomain podman[258911]: 2025-11-28 09:41:28.244464269 +0000 UTC m=+0.094546364 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:41:28 np0005538515.localdomain sudo[258890]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:41:28 np0005538515.localdomain podman[258935]: 2025-11-28 09:41:28.356093303 +0000 UTC m=+0.068587609 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:41:28 np0005538515.localdomain podman[258935]: 2025-11-28 09:41:28.391337157 +0000 UTC m=+0.103831513 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: tmp-crun.Xc32tK.mount: Deactivated successfully.
Nov 28 09:41:28 np0005538515.localdomain podman[258972]: 2025-11-28 09:41:28.452276928 +0000 UTC m=+0.075247706 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:41:28 np0005538515.localdomain podman[258972]: 2025-11-28 09:41:28.462441672 +0000 UTC m=+0.085412451 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:41:28 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:41:28 np0005538515.localdomain sudo[259029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhhtuqyvzcqsmfnurfttrmearqgxyfxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322887.7658596-949-235948898581019/AnsiballZ_file.py
Nov 28 09:41:28 np0005538515.localdomain sudo[259029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:28 np0005538515.localdomain python3.9[259031]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:28 np0005538515.localdomain sudo[259029]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:41:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:41:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:41:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144036 "" "Go-http-client/1.1"
Nov 28 09:41:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:41:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16300 "" "Go-http-client/1.1"
Nov 28 09:41:29 np0005538515.localdomain sudo[259139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuhoddjqdddgzeiuwuvmyusletbfppvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322888.9325705-986-45502825677007/AnsiballZ_stat.py
Nov 28 09:41:29 np0005538515.localdomain sudo[259139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:29 np0005538515.localdomain python3.9[259141]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:29 np0005538515.localdomain sudo[259139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:29 np0005538515.localdomain sudo[259196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngwgvyrvmjjyztqyioziotforvqbrcxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322888.9325705-986-45502825677007/AnsiballZ_file.py
Nov 28 09:41:29 np0005538515.localdomain sudo[259196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:29 np0005538515.localdomain python3.9[259198]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:29 np0005538515.localdomain sudo[259196]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:30 np0005538515.localdomain sudo[259306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goncelnxwqlwdvbajxfothhvngbckeor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322890.14085-1021-1380663311483/AnsiballZ_systemd.py
Nov 28 09:41:30 np0005538515.localdomain sudo[259306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:30 np0005538515.localdomain python3.9[259308]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:41:30 np0005538515.localdomain systemd-sysv-generator[259334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:30 np0005538515.localdomain systemd-rc-local-generator[259330]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:30 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:41:31 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:41:31 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:41:31 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:41:31 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:41:31 np0005538515.localdomain podman[259346]: 2025-11-28 09:41:31.180734448 +0000 UTC m=+0.088224279 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:41:31 np0005538515.localdomain podman[259346]: 2025-11-28 09:41:31.187759085 +0000 UTC m=+0.095248946 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:41:31 np0005538515.localdomain sudo[259306]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:31 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:41:32 np0005538515.localdomain sudo[259483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxytsoxksozxkttromxbsqiuksmzxhyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322891.718631-1051-173560087585916/AnsiballZ_file.py
Nov 28 09:41:32 np0005538515.localdomain sudo[259483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31079 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA4C290000000001030307) 
Nov 28 09:41:32 np0005538515.localdomain python3.9[259485]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:41:32 np0005538515.localdomain sudo[259483]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:32 np0005538515.localdomain sudo[259593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtdbpxwojyqgrlhibkzpnpmgixlluldk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322892.4640145-1075-157189402429008/AnsiballZ_stat.py
Nov 28 09:41:32 np0005538515.localdomain sudo[259593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:32 np0005538515.localdomain python3.9[259595]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:41:32 np0005538515.localdomain sudo[259593]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31080 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA503B0000000001030307) 
Nov 28 09:41:33 np0005538515.localdomain sudo[259681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvqjeqhcqqhhqoazfypqvwolnogxccur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322892.4640145-1075-157189402429008/AnsiballZ_copy.py
Nov 28 09:41:33 np0005538515.localdomain sudo[259681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:33 np0005538515.localdomain python3.9[259683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322892.4640145-1075-157189402429008/.source.json _original_basename=.j4cfuoqs follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:33 np0005538515.localdomain sudo[259681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61065 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA52FA0000000001030307) 
Nov 28 09:41:34 np0005538515.localdomain sudo[259791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phjcovvcegditpgojqxgsdjgfspfnmwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322893.721713-1121-202179985901898/AnsiballZ_file.py
Nov 28 09:41:34 np0005538515.localdomain sudo[259791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:34 np0005538515.localdomain python3.9[259793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:34 np0005538515.localdomain sudo[259791]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31081 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA583A0000000001030307) 
Nov 28 09:41:35 np0005538515.localdomain sudo[259901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztleddugcbjcjwtaxozhzydwgucgcdtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322894.906514-1144-4165403244927/AnsiballZ_stat.py
Nov 28 09:41:35 np0005538515.localdomain sudo[259901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:35 np0005538515.localdomain sudo[259901]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:35 np0005538515.localdomain sudo[259989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcsmrlbujzrrenekfnplkfddkifqebjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322894.906514-1144-4165403244927/AnsiballZ_copy.py
Nov 28 09:41:35 np0005538515.localdomain sudo[259989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52370 DF PROTO=TCP SPT=54008 DPT=9102 SEQ=2498611152 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA5AFA0000000001030307) 
Nov 28 09:41:35 np0005538515.localdomain sudo[259989]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:36 np0005538515.localdomain sudo[260099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beuvojbpxtyflhtocvybvseajijysdvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322896.297837-1195-250415340477832/AnsiballZ_container_config_data.py
Nov 28 09:41:36 np0005538515.localdomain sudo[260099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:36 np0005538515.localdomain python3.9[260101]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Nov 28 09:41:36 np0005538515.localdomain sudo[260099]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:37 np0005538515.localdomain sudo[260209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bunpszhinxypklfysdhlajixgqrpjvto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322897.3248172-1222-42775142171625/AnsiballZ_container_config_hash.py
Nov 28 09:41:37 np0005538515.localdomain sudo[260209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:41:37 np0005538515.localdomain python3.9[260211]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:41:37 np0005538515.localdomain sudo[260209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:37 np0005538515.localdomain podman[260212]: 2025-11-28 09:41:37.983525679 +0000 UTC m=+0.069500338 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:41:37 np0005538515.localdomain podman[260212]: 2025-11-28 09:41:37.992639752 +0000 UTC m=+0.078614401 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:41:38 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:41:38 np0005538515.localdomain sudo[260338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtyexgpsyqtuhyncfdaavkziluvlpwdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322898.290179-1249-147094822068123/AnsiballZ_podman_container_info.py
Nov 28 09:41:38 np0005538515.localdomain sudo[260338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:38 np0005538515.localdomain python3.9[260340]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:41:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31082 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA67FA0000000001030307) 
Nov 28 09:41:39 np0005538515.localdomain sudo[260338]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:44 np0005538515.localdomain sudo[260475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuxjacxhaiopjtcfyjpkgvxrbcghtrcy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764322903.6797142-1288-47835725387218/AnsiballZ_edpm_container_manage.py
Nov 28 09:41:44 np0005538515.localdomain sudo[260475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:44 np0005538515.localdomain python3[260477]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:41:44 np0005538515.localdomain podman[260514]: 
Nov 28 09:41:44 np0005538515.localdomain podman[260514]: 2025-11-28 09:41:44.698707062 +0000 UTC m=+0.076077381 container create 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:41:44 np0005538515.localdomain podman[260514]: 2025-11-28 09:41:44.657749101 +0000 UTC m=+0.035119440 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:41:44 np0005538515.localdomain python3[260477]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:41:44 np0005538515.localdomain sudo[260475]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:45 np0005538515.localdomain sudo[260659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlsumpxqovenyujfmpbqczdzbfyitlmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322905.0822494-1312-28494295147422/AnsiballZ_stat.py
Nov 28 09:41:45 np0005538515.localdomain sudo[260659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:45 np0005538515.localdomain python3.9[260661]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:41:45 np0005538515.localdomain sudo[260659]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:46 np0005538515.localdomain sudo[260771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twaiudzxndnvztwhrodovtkhiwrkqael ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322905.955421-1339-266374682517938/AnsiballZ_file.py
Nov 28 09:41:46 np0005538515.localdomain sudo[260771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:46 np0005538515.localdomain python3.9[260773]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:46 np0005538515.localdomain sudo[260771]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:46 np0005538515.localdomain sudo[260826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwvteqptjoswynjwjljbmabylsguecbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322905.955421-1339-266374682517938/AnsiballZ_stat.py
Nov 28 09:41:46 np0005538515.localdomain sudo[260826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:46 np0005538515.localdomain python3.9[260828]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:41:46 np0005538515.localdomain sudo[260826]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:47 np0005538515.localdomain sudo[260935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diwnfbpcouxeewhssvwzsrjqlrwueyvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322906.9300923-1339-173838920908179/AnsiballZ_copy.py
Nov 28 09:41:47 np0005538515.localdomain sudo[260935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:47 np0005538515.localdomain python3.9[260937]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322906.9300923-1339-173838920908179/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:41:47 np0005538515.localdomain sudo[260935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31083 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADA88FA0000000001030307) 
Nov 28 09:41:47 np0005538515.localdomain sudo[260990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkqyulufxgrezupfxrremjwseosefrkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322906.9300923-1339-173838920908179/AnsiballZ_systemd.py
Nov 28 09:41:47 np0005538515.localdomain sudo[260990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:48 np0005538515.localdomain python3.9[260992]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:41:48 np0005538515.localdomain systemd-rc-local-generator[261015]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:48 np0005538515.localdomain systemd-sysv-generator[261020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:48 np0005538515.localdomain sudo[260990]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:48 np0005538515.localdomain sudo[261081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwpeuapeyxpnqfeujvvfajvojbcbxubl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322906.9300923-1339-173838920908179/AnsiballZ_systemd.py
Nov 28 09:41:48 np0005538515.localdomain sudo[261081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:49 np0005538515.localdomain python3.9[261083]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:41:49 np0005538515.localdomain systemd-rc-local-generator[261109]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:41:49 np0005538515.localdomain systemd-sysv-generator[261113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:41:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92b02dc597e68fe7ad726f5e333d8ab9d38ed42f81284c1cac4d8e7be783762c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:49 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92b02dc597e68fe7ad726f5e333d8ab9d38ed42f81284c1cac4d8e7be783762c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:49 np0005538515.localdomain podman[261124]: 2025-11-28 09:41:49.55705225 +0000 UTC m=+0.110927903 container init 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Nov 28 09:41:49 np0005538515.localdomain podman[261124]: 2025-11-28 09:41:49.566298067 +0000 UTC m=+0.120173710 container start 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true)
Nov 28 09:41:49 np0005538515.localdomain podman[261124]: neutron_dhcp_agent
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + sudo -E kolla_set_configs
Nov 28 09:41:49 np0005538515.localdomain systemd[1]: Started neutron_dhcp_agent container.
Nov 28 09:41:49 np0005538515.localdomain sudo[261081]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Validating config file
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Copying service configuration files
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Writing out command to execute
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: ++ cat /run_command
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + CMD=/usr/bin/neutron-dhcp-agent
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + ARGS=
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + sudo kolla_copy_cacerts
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + [[ ! -n '' ]]
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + . kolla_extend_start
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: Running command: '/usr/bin/neutron-dhcp-agent'
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + umask 0022
Nov 28 09:41:49 np0005538515.localdomain neutron_dhcp_agent[261136]: + exec /usr/bin/neutron-dhcp-agent
Nov 28 09:41:50 np0005538515.localdomain sudo[261258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rokgwuasozxfbhbqwdopezcwuypyjgiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322909.941348-1423-236496226159180/AnsiballZ_systemd.py
Nov 28 09:41:50 np0005538515.localdomain sudo[261258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:41:50 np0005538515.localdomain podman[261261]: 2025-11-28 09:41:50.373459472 +0000 UTC m=+0.087983831 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:41:50 np0005538515.localdomain podman[261261]: 2025-11-28 09:41:50.386405333 +0000 UTC m=+0.100929642 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:41:50 np0005538515.localdomain python3.9[261260]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: tmp-crun.YM59mU.mount: Deactivated successfully.
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: libpod-0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294.scope: Deactivated successfully.
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: libpod-0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294.scope: Consumed 1.114s CPU time.
Nov 28 09:41:50 np0005538515.localdomain podman[261284]: 2025-11-28 09:41:50.692935805 +0000 UTC m=+0.079692274 container died 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:41:50 np0005538515.localdomain podman[261284]: 2025-11-28 09:41:50.788590433 +0000 UTC m=+0.175346842 container cleanup 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:41:50 np0005538515.localdomain podman[261284]: neutron_dhcp_agent
Nov 28 09:41:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:50.821 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:41:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:50.821 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:41:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:50.822 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:41:50 np0005538515.localdomain podman[261325]: error opening file `/run/crun/0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294/status`: No such file or directory
Nov 28 09:41:50 np0005538515.localdomain podman[261313]: 2025-11-28 09:41:50.888253375 +0000 UTC m=+0.072696937 container cleanup 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:41:50 np0005538515.localdomain podman[261313]: neutron_dhcp_agent
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Nov 28 09:41:50 np0005538515.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Nov 28 09:41:51 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:41:51 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92b02dc597e68fe7ad726f5e333d8ab9d38ed42f81284c1cac4d8e7be783762c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:51 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92b02dc597e68fe7ad726f5e333d8ab9d38ed42f81284c1cac4d8e7be783762c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:41:51 np0005538515.localdomain podman[261327]: 2025-11-28 09:41:51.037736984 +0000 UTC m=+0.113071760 container init 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:41:51 np0005538515.localdomain podman[261327]: 2025-11-28 09:41:51.044014778 +0000 UTC m=+0.119349554 container start 0e5a51137dc54e3c7d5de7eadd56c57c951f38ee121a300835ca4e5324dc3294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '97b631b11487264e1d06ace7cd32b528ca21fc2a7a7b166e6cb3ae7d17fd8dd3'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:41:51 np0005538515.localdomain podman[261327]: neutron_dhcp_agent
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + sudo -E kolla_set_configs
Nov 28 09:41:51 np0005538515.localdomain systemd[1]: Started neutron_dhcp_agent container.
Nov 28 09:41:51 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:51.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:51 np0005538515.localdomain sudo[261258]: pam_unix(sudo:session): session closed for user root
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Validating config file
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Copying service configuration files
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Writing out command to execute
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: ++ cat /run_command
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + CMD=/usr/bin/neutron-dhcp-agent
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + ARGS=
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + sudo kolla_copy_cacerts
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + [[ ! -n '' ]]
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + . kolla_extend_start
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: Running command: '/usr/bin/neutron-dhcp-agent'
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + umask 0022
Nov 28 09:41:51 np0005538515.localdomain neutron_dhcp_agent[261342]: + exec /usr/bin/neutron-dhcp-agent
Nov 28 09:41:51 np0005538515.localdomain sshd[254488]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:41:51 np0005538515.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Nov 28 09:41:51 np0005538515.localdomain systemd[1]: session-58.scope: Consumed 34.033s CPU time.
Nov 28 09:41:51 np0005538515.localdomain systemd-logind[763]: Session 58 logged out. Waiting for processes to exit.
Nov 28 09:41:51 np0005538515.localdomain systemd-logind[763]: Removed session 58.
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.070 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.072 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.089 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.090 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.090 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.090 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.090 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:41:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:52.275 261346 INFO neutron.common.config [-] Logging enabled!
Nov 28 09:41:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:52.275 261346 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.593 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:41:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:52.700 261346 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.786 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.788 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12923MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.788 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.789 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.870 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.871 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:41:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:52.893 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:41:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:53.349 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:41:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:53.355 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:41:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:53.375 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:41:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:53.376 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:41:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:53.377 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:41:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:53.716 261346 INFO neutron.agent.dhcp.agent [None req-89cff752-fd5b-4681-91c5-e1b153e3385a - - - - - -] All active networks have been fetched through RPC.
Nov 28 09:41:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:53.717 261346 INFO neutron.agent.dhcp.agent [-] Starting network 887157f9-a765-40c0-8be5-1fba3ddea8f8 dhcp configuration
Nov 28 09:41:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:53.772 261346 INFO neutron.agent.dhcp.agent [-] Starting network 40d5da59-6201-424a-8380-80ecc3d67c7e dhcp configuration
Nov 28 09:41:54 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:54.302 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:41:54 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:54.303 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 09:41:54 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:54.307 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.378 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.378 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.379 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.392 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.393 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.393 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.394 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:54.394 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:41:54 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:54.784 261346 INFO oslo.privsep.daemon [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp7j8wsnzi/privsep.sock']
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.399 261346 INFO oslo.privsep.daemon [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.288 261423 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.293 261423 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.296 261423 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.296 261423 INFO oslo.privsep.daemon [-] privsep daemon running as pid 261423
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.403 261346 WARNING oslo_privsep.priv_context [None req-342297e5-b70c-4019-b8b9-e2d6661e0978 - - - - - -] privsep daemon already running
Nov 28 09:41:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:55.910 261346 INFO oslo.privsep.daemon [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpkitqw0jv/privsep.sock']
Nov 28 09:41:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:56.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:41:56.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:41:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:56.519 261346 INFO oslo.privsep.daemon [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:41:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:56.411 261433 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:41:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:56.416 261433 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:41:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:56.419 261433 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 28 09:41:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:56.419 261433 INFO oslo.privsep.daemon [-] privsep daemon running as pid 261433
Nov 28 09:41:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:56.525 261346 WARNING oslo_privsep.priv_context [None req-342297e5-b70c-4019-b8b9-e2d6661e0978 - - - - - -] privsep daemon already running
Nov 28 09:41:57 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:57.414 261346 INFO oslo.privsep.daemon [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpx53cxqh7/privsep.sock']
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:41:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:41:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:41:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:58.052 261346 INFO oslo.privsep.daemon [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Spawned new privsep daemon via rootwrap
Nov 28 09:41:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:57.946 261449 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:41:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:57.951 261449 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:41:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:57.954 261449 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 09:41:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:57.955 261449 INFO oslo.privsep.daemon [-] privsep daemon running as pid 261449
Nov 28 09:41:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:58.055 261346 WARNING oslo_privsep.priv_context [None req-342297e5-b70c-4019-b8b9-e2d6661e0978 - - - - - -] privsep daemon already running
Nov 28 09:41:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:41:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:41:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:41:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:41:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:41:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:41:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:41:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146341 "" "Go-http-client/1.1"
Nov 28 09:41:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:41:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16742 "" "Go-http-client/1.1"
Nov 28 09:41:58 np0005538515.localdomain podman[261455]: 2025-11-28 09:41:58.992907431 +0000 UTC m=+0.103742960 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, tcib_managed=true)
Nov 28 09:41:59 np0005538515.localdomain podman[261455]: 2025-11-28 09:41:59.001636923 +0000 UTC m=+0.112472402 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 28 09:41:59 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:41:59 np0005538515.localdomain systemd[1]: tmp-crun.1RSn0X.mount: Deactivated successfully.
Nov 28 09:41:59 np0005538515.localdomain podman[261456]: 2025-11-28 09:41:59.090574362 +0000 UTC m=+0.195831838 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 09:41:59 np0005538515.localdomain podman[261457]: 2025-11-28 09:41:59.061266813 +0000 UTC m=+0.170044387 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:41:59 np0005538515.localdomain podman[261458]: 2025-11-28 09:41:59.15687506 +0000 UTC m=+0.261495715 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:41:59 np0005538515.localdomain podman[261456]: 2025-11-28 09:41:59.160375728 +0000 UTC m=+0.265633144 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:41:59 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:41:59 np0005538515.localdomain podman[261457]: 2025-11-28 09:41:59.178410908 +0000 UTC m=+0.287188442 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:41:59 np0005538515.localdomain podman[261458]: 2025-11-28 09:41:59.192450244 +0000 UTC m=+0.297070919 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:41:59 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:41:59 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:41:59 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:59.443 261346 INFO neutron.agent.linux.ip_lib [None req-342297e5-b70c-4019-b8b9-e2d6661e0978 - - - - - -] Device tap8af1236c-20 cannot be used as it has no MAC address
Nov 28 09:41:59 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:41:59.447 261346 INFO neutron.agent.linux.ip_lib [None req-17aca3bb-5100-48e6-8b92-ae59d394f285 - - - - - -] Device tapb51f2386-0b cannot be used as it has no MAC address
Nov 28 09:41:59 np0005538515.localdomain kernel: device tap8af1236c-20 entered promiscuous mode
Nov 28 09:41:59 np0005538515.localdomain NetworkManager[5965]: <info>  [1764322919.5242] manager: (tap8af1236c-20): new Generic device (/org/freedesktop/NetworkManager/Devices/13)
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00025|binding|INFO|Claiming lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf for this chassis.
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00026|binding|INFO|8af1236c-205e-4af9-a882-ccde7f9d3ecf: Claiming unknown
Nov 28 09:41:59 np0005538515.localdomain systemd-udevd[261554]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 09:41:59 np0005538515.localdomain kernel: device tapb51f2386-0b entered promiscuous mode
Nov 28 09:41:59 np0005538515.localdomain NetworkManager[5965]: <info>  [1764322919.5351] manager: (tapb51f2386-0b): new Generic device (/org/freedesktop/NetworkManager/Devices/14)
Nov 28 09:41:59 np0005538515.localdomain systemd-udevd[261557]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00027|ovn_bfd|INFO|Enabled BFD on interface ovn-c3237d-0
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00028|ovn_bfd|INFO|Enabled BFD on interface ovn-11aa47-0
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00029|ovn_bfd|INFO|Enabled BFD on interface ovn-07900d-0
Nov 28 09:41:59 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:59.544 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-887157f9-a765-40c0-8be5-1fba3ddea8f8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-887157f9-a765-40c0-8be5-1fba3ddea8f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5520a81-bbe1-4feb-9859-6165eafc855d, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8af1236c-205e-4af9-a882-ccde7f9d3ecf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:41:59 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:59.549 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8af1236c-205e-4af9-a882-ccde7f9d3ecf in datapath 887157f9-a765-40c0-8be5-1fba3ddea8f8 bound to our chassis
Nov 28 09:41:59 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:59.553 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port 443f831a-83a9-4df5-adbb-6fdf4d706460 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 09:41:59 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:59.553 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 887157f9-a765-40c0-8be5-1fba3ddea8f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 09:41:59 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:59.555 158530 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp32linmc5/privsep.sock']
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00030|binding|INFO|Claiming lport b51f2386-0b9d-42f5-9ce1-e7fa1b564192 for this chassis.
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00031|binding|INFO|b51f2386-0b9d-42f5-9ce1-e7fa1b564192: Claiming unknown
Nov 28 09:41:59 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:41:59.609 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.3/24', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=b51f2386-0b9d-42f5-9ce1-e7fa1b564192) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00032|binding|INFO|Setting lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf ovn-installed in OVS
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00033|binding|INFO|Setting lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf up in Southbound
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00034|binding|INFO|Setting lport b51f2386-0b9d-42f5-9ce1-e7fa1b564192 ovn-installed in OVS
Nov 28 09:41:59 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:41:59Z|00035|binding|INFO|Setting lport b51f2386-0b9d-42f5-9ce1-e7fa1b564192 up in Southbound
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.195 158530 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.196 158530 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp32linmc5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.089 261619 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.094 261619 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.099 261619 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.100 261619 INFO oslo.privsep.daemon [-] privsep daemon running as pid 261619
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.199 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[6c148fed-fc0b-43cf-9395-0948098b0bde]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.599 261619 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.599 261619 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.599 261619 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:42:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.697 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[51ad5e54-84e8-4518-b785-21f5be70137e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.699 158530 INFO neutron.agent.ovn.metadata.agent [-] Port b51f2386-0b9d-42f5-9ce1-e7fa1b564192 in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e unbound from our chassis
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.701 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port d15a465b-1a05-4da0-8002-47b641f332f3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.701 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 40d5da59-6201-424a-8380-80ecc3d67c7e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 09:42:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:00.702 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb3f4d9-6725-45be-99e1-b02910aebb6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 09:42:00 np0005538515.localdomain podman[261669]: 
Nov 28 09:42:00 np0005538515.localdomain podman[261669]: 2025-11-28 09:42:00.773522722 +0000 UTC m=+0.081668555 container create dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:42:00 np0005538515.localdomain podman[261683]: 
Nov 28 09:42:00 np0005538515.localdomain podman[261683]: 2025-11-28 09:42:00.805771793 +0000 UTC m=+0.078437765 container create 89b3dc0bb55924bc1fd5f8ac3bf996eec3cf2dde4e5a3645e78d8d092d18d9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-40d5da59-6201-424a-8380-80ecc3d67c7e, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 09:42:00 np0005538515.localdomain systemd[1]: Started libpod-conmon-dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439.scope.
Nov 28 09:42:00 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:42:00 np0005538515.localdomain podman[261669]: 2025-11-28 09:42:00.733622534 +0000 UTC m=+0.041768397 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:42:00 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7cff75508476df411b334ef64aedbb65646b8067d2b7c094a8dcb894216f571/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:42:00 np0005538515.localdomain systemd[1]: Started libpod-conmon-89b3dc0bb55924bc1fd5f8ac3bf996eec3cf2dde4e5a3645e78d8d092d18d9b4.scope.
Nov 28 09:42:00 np0005538515.localdomain podman[261669]: 2025-11-28 09:42:00.842955876 +0000 UTC m=+0.151101709 container init dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:42:00 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:42:00 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47cb17eb211f19f0dbd29cd4ba5c46fd5d30452237ee450b25f44eaeac446706/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 09:42:00 np0005538515.localdomain podman[261669]: 2025-11-28 09:42:00.852611936 +0000 UTC m=+0.160757769 container start dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261709]: started, version 2.85 cachesize 150
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261709]: DNS service limited to local subnets
Nov 28 09:42:00 np0005538515.localdomain podman[261683]: 2025-11-28 09:42:00.858123367 +0000 UTC m=+0.130789339 container init 89b3dc0bb55924bc1fd5f8ac3bf996eec3cf2dde4e5a3645e78d8d092d18d9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-40d5da59-6201-424a-8380-80ecc3d67c7e, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261709]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261709]: warning: no upstream servers configured
Nov 28 09:42:00 np0005538515.localdomain dnsmasq-dhcp[261709]: DHCP, static leases only on 192.168.122.0, lease time 1d
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 09:42:00 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 09:42:00 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 09:42:00 np0005538515.localdomain podman[261683]: 2025-11-28 09:42:00.763644736 +0000 UTC m=+0.036310728 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 09:42:00 np0005538515.localdomain podman[261683]: 2025-11-28 09:42:00.869296933 +0000 UTC m=+0.141962915 container start 89b3dc0bb55924bc1fd5f8ac3bf996eec3cf2dde4e5a3645e78d8d092d18d9b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-40d5da59-6201-424a-8380-80ecc3d67c7e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261711]: started, version 2.85 cachesize 150
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261711]: DNS service limited to local subnets
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261711]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261711]: warning: no upstream servers configured
Nov 28 09:42:00 np0005538515.localdomain dnsmasq-dhcp[261711]: DHCP, static leases only on 192.168.0.0, lease time 1d
Nov 28 09:42:00 np0005538515.localdomain dnsmasq[261711]: read /var/lib/neutron/dhcp/40d5da59-6201-424a-8380-80ecc3d67c7e/addn_hosts - 2 addresses
Nov 28 09:42:00 np0005538515.localdomain dnsmasq-dhcp[261711]: read /var/lib/neutron/dhcp/40d5da59-6201-424a-8380-80ecc3d67c7e/host
Nov 28 09:42:00 np0005538515.localdomain dnsmasq-dhcp[261711]: read /var/lib/neutron/dhcp/40d5da59-6201-424a-8380-80ecc3d67c7e/opts
Nov 28 09:42:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:42:00.910 261346 INFO neutron.agent.dhcp.agent [None req-2543693c-d7f7-4a61-8938-f9f318f3eb76 - - - - - -] Finished network 887157f9-a765-40c0-8be5-1fba3ddea8f8 dhcp configuration
Nov 28 09:42:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:42:00.913 261346 INFO neutron.agent.dhcp.agent [None req-77827c0b-294d-473d-976f-8b23782efbcf - - - - - -] Finished network 40d5da59-6201-424a-8380-80ecc3d67c7e dhcp configuration
Nov 28 09:42:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:42:00.913 261346 INFO neutron.agent.dhcp.agent [None req-89cff752-fd5b-4681-91c5-e1b153e3385a - - - - - -] Synchronizing state complete
Nov 28 09:42:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:42:00.975 261346 INFO neutron.agent.dhcp.agent [None req-89cff752-fd5b-4681-91c5-e1b153e3385a - - - - - -] DHCP agent started
Nov 28 09:42:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:42:01 np0005538515.localdomain podman[261712]: 2025-11-28 09:42:01.973195366 +0000 UTC m=+0.079241390 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:42:02 np0005538515.localdomain podman[261712]: 2025-11-28 09:42:02.010458133 +0000 UTC m=+0.116504177 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:42:02 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:42:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35404 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADAC1590000000001030307) 
Nov 28 09:42:02 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 09:42:02.447 261346 INFO neutron.agent.dhcp.agent [None req-9c3ad580-6cc4-4b63-a120-686d2232cbd8 - - - - - -] DHCP configuration for ports {'50fa6f67-abd9-48d7-aedb-8ca08cff0a66', 'a05cd915-7bc5-46e4-9c1e-3efad949112b', '3ff57c88-06c6-4894-984a-80ce116d1456', '09612b07-5142-4b0f-9dab-74bf4403f69f', 'c11672ac-31d9-4e35-992c-9c2cc8fbd9ff', '4a0a3326-6d12-4d57-91f4-2bd267c644b1'} is completed
Nov 28 09:42:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35405 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADAC57B0000000001030307) 
Nov 28 09:42:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31084 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADAC8FA0000000001030307) 
Nov 28 09:42:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35406 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADACD7B0000000001030307) 
Nov 28 09:42:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61066 DF PROTO=TCP SPT=38018 DPT=9102 SEQ=3204140831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADAD0FA0000000001030307) 
Nov 28 09:42:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:42:09 np0005538515.localdomain podman[261736]: 2025-11-28 09:42:09.12835799 +0000 UTC m=+0.234417913 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 09:42:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35407 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADADD3B0000000001030307) 
Nov 28 09:42:09 np0005538515.localdomain podman[261736]: 2025-11-28 09:42:09.28560044 +0000 UTC m=+0.391660413 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 28 09:42:09 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:42:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35408 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADAFCFA0000000001030307) 
Nov 28 09:42:18 np0005538515.localdomain sudo[261756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:42:18 np0005538515.localdomain sudo[261756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:18 np0005538515.localdomain sudo[261756]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:18 np0005538515.localdomain sudo[261774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:42:18 np0005538515.localdomain sudo[261774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:19 np0005538515.localdomain sudo[261774]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:19 np0005538515.localdomain sudo[261814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:42:19 np0005538515.localdomain sudo[261814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:19 np0005538515.localdomain sudo[261814]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:19 np0005538515.localdomain sudo[261832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:42:19 np0005538515.localdomain sudo[261832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:20 np0005538515.localdomain sudo[261832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:20 np0005538515.localdomain sudo[261882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:42:20 np0005538515.localdomain sudo[261882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:42:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:42:20 np0005538515.localdomain sudo[261882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:20 np0005538515.localdomain podman[261900]: 2025-11-28 09:42:20.842168297 +0000 UTC m=+0.063278075 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal)
Nov 28 09:42:20 np0005538515.localdomain podman[261900]: 2025-11-28 09:42:20.853794678 +0000 UTC m=+0.074904446 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:42:20 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:42:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:42:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:42:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:42:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:42:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:42:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:42:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17691 "" "Go-http-client/1.1"
Nov 28 09:42:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:42:29Z|00036|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 28 09:42:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:42:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:42:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:42:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:42:29 np0005538515.localdomain podman[261921]: 2025-11-28 09:42:29.988624089 +0000 UTC m=+0.095080532 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm)
Nov 28 09:42:30 np0005538515.localdomain podman[261921]: 2025-11-28 09:42:30.001427965 +0000 UTC m=+0.107884368 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:42:30 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:42:30 np0005538515.localdomain podman[261922]: 2025-11-28 09:42:30.051110597 +0000 UTC m=+0.150898663 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 09:42:30 np0005538515.localdomain podman[261922]: 2025-11-28 09:42:30.089734706 +0000 UTC m=+0.189522742 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:42:30 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:42:30 np0005538515.localdomain podman[261923]: 2025-11-28 09:42:30.105396232 +0000 UTC m=+0.201222955 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:42:30 np0005538515.localdomain podman[261929]: 2025-11-28 09:42:30.144396561 +0000 UTC m=+0.239471821 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:42:30 np0005538515.localdomain podman[261929]: 2025-11-28 09:42:30.15205961 +0000 UTC m=+0.247134840 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:42:30 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:42:30 np0005538515.localdomain podman[261923]: 2025-11-28 09:42:30.235012164 +0000 UTC m=+0.330838867 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:42:30 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:42:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31227 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB36890000000001030307) 
Nov 28 09:42:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:42:32 np0005538515.localdomain podman[262002]: 2025-11-28 09:42:32.968000515 +0000 UTC m=+0.077217437 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:42:32 np0005538515.localdomain podman[262002]: 2025-11-28 09:42:32.975509548 +0000 UTC m=+0.084726420 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:42:32 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:42:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31228 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB3A7A0000000001030307) 
Nov 28 09:42:33 np0005538515.localdomain sshd[262026]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:42:33 np0005538515.localdomain sshd[262026]: Accepted publickey for zuul from 192.168.122.30 port 51370 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:42:33 np0005538515.localdomain systemd-logind[763]: New session 59 of user zuul.
Nov 28 09:42:33 np0005538515.localdomain systemd[1]: Started Session 59 of User zuul.
Nov 28 09:42:33 np0005538515.localdomain sshd[262026]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:42:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35409 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB3CFA0000000001030307) 
Nov 28 09:42:34 np0005538515.localdomain python3.9[262137]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:42:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31229 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB427A0000000001030307) 
Nov 28 09:42:36 np0005538515.localdomain python3.9[262249]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:42:36 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31085 DF PROTO=TCP SPT=36740 DPT=9102 SEQ=3245871172 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB46FA0000000001030307) 
Nov 28 09:42:36 np0005538515.localdomain network[262266]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:42:36 np0005538515.localdomain network[262267]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:42:36 np0005538515.localdomain network[262268]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:42:37 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:42:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31230 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB523A0000000001030307) 
Nov 28 09:42:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:42:39 np0005538515.localdomain sudo[262511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enturhzfxjchutouezudjkxiabspwxet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322959.652918-103-192916563027431/AnsiballZ_setup.py
Nov 28 09:42:39 np0005538515.localdomain sudo[262511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:39 np0005538515.localdomain systemd[1]: tmp-crun.hNyOOJ.mount: Deactivated successfully.
Nov 28 09:42:39 np0005538515.localdomain podman[262481]: 2025-11-28 09:42:39.987367947 +0000 UTC m=+0.090790578 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Nov 28 09:42:40 np0005538515.localdomain podman[262481]: 2025-11-28 09:42:40.027460621 +0000 UTC m=+0.130883222 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:42:40 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:42:40 np0005538515.localdomain python3.9[262519]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 09:42:40 np0005538515.localdomain sudo[262511]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:40 np0005538515.localdomain sudo[262583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjtmjelcechqluqvlkymqnzfxkhpemoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322959.652918-103-192916563027431/AnsiballZ_dnf.py
Nov 28 09:42:40 np0005538515.localdomain sudo[262583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:41 np0005538515.localdomain python3.9[262585]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:42:44 np0005538515.localdomain sudo[262583]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:44 np0005538515.localdomain sudo[262695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hatsufxqopwueudctqehgrhjwprvtvkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322964.5250711-140-154841872980954/AnsiballZ_stat.py
Nov 28 09:42:44 np0005538515.localdomain sudo[262695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:45.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:45.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:42:45 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:45.114 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:42:45 np0005538515.localdomain python3.9[262697]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:42:45 np0005538515.localdomain sudo[262695]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:45 np0005538515.localdomain sudo[262805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbjtbjhohioetxmidadmdpknvqltpvni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322965.5532942-170-234478058347408/AnsiballZ_command.py
Nov 28 09:42:45 np0005538515.localdomain sudo[262805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:46 np0005538515.localdomain python3.9[262807]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:42:46 np0005538515.localdomain sudo[262805]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:46 np0005538515.localdomain sudo[262916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmqtddmytgqedljnpccchnuktpkpqjxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322966.595179-200-180377006192355/AnsiballZ_stat.py
Nov 28 09:42:46 np0005538515.localdomain sudo[262916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:47 np0005538515.localdomain python3.9[262918]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:42:47 np0005538515.localdomain sudo[262916]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31231 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADB72FA0000000001030307) 
Nov 28 09:42:47 np0005538515.localdomain sudo[263028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-booozjjznzarfzfbscfxdachxzqhyvzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322967.519712-232-38467216193134/AnsiballZ_lineinfile.py
Nov 28 09:42:47 np0005538515.localdomain sudo[263028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:48 np0005538515.localdomain python3.9[263030]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:42:48 np0005538515.localdomain sudo[263028]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:49 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:49.109 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:49 np0005538515.localdomain sudo[263138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypdrlniotjyjsmgdqvwijmrabqaepfos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322968.9311519-260-236005514662652/AnsiballZ_systemd_service.py
Nov 28 09:42:49 np0005538515.localdomain sudo[263138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:50 np0005538515.localdomain python3.9[263140]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:42:50 np0005538515.localdomain sudo[263138]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:50.822 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:50.823 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:42:50.824 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:42:51 np0005538515.localdomain podman[263198]: 2025-11-28 09:42:51.009815578 +0000 UTC m=+0.097770354 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Nov 28 09:42:51 np0005538515.localdomain podman[263198]: 2025-11-28 09:42:51.050604439 +0000 UTC m=+0.138559215 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:42:51 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:42:51 np0005538515.localdomain sudo[263270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjughvadvwdocjfrauqeonywokoprhsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322970.8298733-284-6589393242840/AnsiballZ_systemd_service.py
Nov 28 09:42:51 np0005538515.localdomain sudo[263270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:51 np0005538515.localdomain python3.9[263272]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:42:51 np0005538515.localdomain sudo[263270]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.105 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.106 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.106 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.107 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.107 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.545 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.721 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.722 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12548MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.722 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.722 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.887 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.887 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:42:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:52.976 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.043 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.044 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.057 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.075 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NODE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.092 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:42:53 np0005538515.localdomain sudo[263424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlkrcyzclzarjwsmsiybxfbejfjhjcai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322973.0687947-317-188625484510609/AnsiballZ_service_facts.py
Nov 28 09:42:53 np0005538515.localdomain sudo[263424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.532 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.539 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.561 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.563 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:42:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:53.563 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:42:53 np0005538515.localdomain python3.9[263426]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:42:53 np0005538515.localdomain network[263445]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:42:53 np0005538515.localdomain network[263446]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:42:53 np0005538515.localdomain network[263447]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:42:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:54.560 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:54.560 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:54.561 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:54 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.075 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.076 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.076 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.145 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.145 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.146 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:42:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:55.146 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:56.087 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:56 np0005538515.localdomain sudo[263424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:57 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:57.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:42:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:42:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:42:58 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:58.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:58 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:58.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:42:58 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:42:58.075 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:42:58 np0005538515.localdomain sudo[263679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbmvivhplafkbfqrlvjhuvhpnjvuctwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322978.0182004-347-267304055383037/AnsiballZ_file.py
Nov 28 09:42:58 np0005538515.localdomain sudo[263679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:58 np0005538515.localdomain python3.9[263681]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:42:58 np0005538515.localdomain sudo[263679]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:42:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:42:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:42:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:42:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:42:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17699 "" "Go-http-client/1.1"
Nov 28 09:42:59 np0005538515.localdomain sudo[263789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxsmekrymafckpebvcfzatizymzjleen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322978.888537-372-200115968367777/AnsiballZ_modprobe.py
Nov 28 09:42:59 np0005538515.localdomain sudo[263789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:42:59 np0005538515.localdomain python3.9[263791]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 09:42:59 np0005538515.localdomain sudo[263789]: pam_unix(sudo:session): session closed for user root
Nov 28 09:42:59 np0005538515.localdomain sudo[263899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlnoxyqwsueaogxwkhjqxgogtgfkrcfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322979.716058-395-184640961032133/AnsiballZ_stat.py
Nov 28 09:42:59 np0005538515.localdomain sudo[263899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:00 np0005538515.localdomain python3.9[263901]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:00 np0005538515.localdomain sudo[263899]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:00 np0005538515.localdomain sudo[263956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soqfsfmrggimnxfdmpsayvdrcjmrnisb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322979.716058-395-184640961032133/AnsiballZ_file.py
Nov 28 09:43:00 np0005538515.localdomain sudo[263956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:43:00 np0005538515.localdomain podman[263959]: 2025-11-28 09:43:00.666852733 +0000 UTC m=+0.092762919 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:43:00 np0005538515.localdomain podman[263959]: 2025-11-28 09:43:00.677737369 +0000 UTC m=+0.103647605 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:43:00 np0005538515.localdomain podman[263960]: 2025-11-28 09:43:00.728438667 +0000 UTC m=+0.150027920 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller)
Nov 28 09:43:00 np0005538515.localdomain python3.9[263958]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: tmp-crun.qQfLa5.mount: Deactivated successfully.
Nov 28 09:43:00 np0005538515.localdomain sudo[263956]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:00 np0005538515.localdomain podman[263960]: 2025-11-28 09:43:00.772690955 +0000 UTC m=+0.194280238 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:43:00 np0005538515.localdomain podman[263961]: 2025-11-28 09:43:00.827818628 +0000 UTC m=+0.248454681 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:43:00 np0005538515.localdomain podman[263961]: 2025-11-28 09:43:00.861449998 +0000 UTC m=+0.282086091 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:43:00 np0005538515.localdomain podman[263967]: 2025-11-28 09:43:00.776452901 +0000 UTC m=+0.188853669 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:43:00 np0005538515.localdomain podman[263967]: 2025-11-28 09:43:00.909300357 +0000 UTC m=+0.321701075 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:43:00 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:43:01 np0005538515.localdomain sudo[264151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyolkqxzyykagzzxautlsspefwleogyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322981.5152526-434-270062781897544/AnsiballZ_lineinfile.py
Nov 28 09:43:01 np0005538515.localdomain sudo[264151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:01 np0005538515.localdomain python3.9[264153]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:01 np0005538515.localdomain sudo[264151]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25821 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBABB90000000001030307) 
Nov 28 09:43:02 np0005538515.localdomain sudo[264261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wruoicplqdinqajulqhelmkbwryexorr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322982.3286908-460-92771915095297/AnsiballZ_file.py
Nov 28 09:43:02 np0005538515.localdomain sudo[264261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25822 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBAFBA0000000001030307) 
Nov 28 09:43:03 np0005538515.localdomain python3.9[264263]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:03 np0005538515.localdomain sudo[264261]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:43:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31232 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBB2FA0000000001030307) 
Nov 28 09:43:03 np0005538515.localdomain systemd[1]: tmp-crun.zUeMs8.mount: Deactivated successfully.
Nov 28 09:43:03 np0005538515.localdomain podman[264319]: 2025-11-28 09:43:03.984853341 +0000 UTC m=+0.094040748 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:43:03 np0005538515.localdomain podman[264319]: 2025-11-28 09:43:03.999400661 +0000 UTC m=+0.108588028 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:43:04 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:43:04 np0005538515.localdomain sudo[264394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uytddiwkmjbhwuyttbozoqijozyjxfbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322983.8084164-487-101165137503370/AnsiballZ_stat.py
Nov 28 09:43:04 np0005538515.localdomain sudo[264394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:04 np0005538515.localdomain python3.9[264396]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:04 np0005538515.localdomain sudo[264394]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:04 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:04.565 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:04 np0005538515.localdomain sudo[264506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txmfjdjegcrdxiezntigkgdyyzvgntmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322984.6390185-515-16795725120664/AnsiballZ_stat.py
Nov 28 09:43:04 np0005538515.localdomain sudo[264506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:05 np0005538515.localdomain python3.9[264508]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:05 np0005538515.localdomain sudo[264506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25823 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBB7BA0000000001030307) 
Nov 28 09:43:05 np0005538515.localdomain sudo[264618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjqbjnqwtgfluzacdisazzwojydeknlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322985.3565907-542-214091534145617/AnsiballZ_command.py
Nov 28 09:43:05 np0005538515.localdomain sudo[264618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:05 np0005538515.localdomain python3.9[264620]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:43:05 np0005538515.localdomain sudo[264618]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35410 DF PROTO=TCP SPT=34354 DPT=9102 SEQ=908223820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBBAFB0000000001030307) 
Nov 28 09:43:06 np0005538515.localdomain sudo[264729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdinmncmjreuagcvlyljckznmtscndsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322986.308646-572-9361825098478/AnsiballZ_replace.py
Nov 28 09:43:06 np0005538515.localdomain sudo[264729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:06 np0005538515.localdomain python3.9[264731]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:06 np0005538515.localdomain sudo[264729]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:06 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Nov 28 09:43:06 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:43:06 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:43:06 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:43:07 np0005538515.localdomain sudo[264840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyjtblcmkrgyacdcxvhggmlduzltkcxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322987.3519802-599-168894869501625/AnsiballZ_lineinfile.py
Nov 28 09:43:07 np0005538515.localdomain sudo[264840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:07 np0005538515.localdomain python3.9[264842]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:07 np0005538515.localdomain sudo[264840]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:08 np0005538515.localdomain sudo[264950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcalzcgpxngouiredcswrciwwwjuzqro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322987.9084275-599-21523587195982/AnsiballZ_lineinfile.py
Nov 28 09:43:08 np0005538515.localdomain sudo[264950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:08 np0005538515.localdomain python3.9[264952]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:08 np0005538515.localdomain sudo[264950]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:08 np0005538515.localdomain sudo[265060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxvxxerqsgyxceuujwiiwdlsbcnnlgrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322988.538254-599-122321697206474/AnsiballZ_lineinfile.py
Nov 28 09:43:08 np0005538515.localdomain sudo[265060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:09 np0005538515.localdomain python3.9[265062]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:09 np0005538515.localdomain sudo[265060]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25824 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBC77A0000000001030307) 
Nov 28 09:43:09 np0005538515.localdomain sudo[265170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbhqhycdtycxbdjvgoohrzztxfbeifqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322989.1795044-599-82554343065760/AnsiballZ_lineinfile.py
Nov 28 09:43:09 np0005538515.localdomain sudo[265170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:09 np0005538515.localdomain python3.9[265172]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:09 np0005538515.localdomain sudo[265170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:10 np0005538515.localdomain sudo[265280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pspxfqrjfxhvxfgggcrtssqmrwcuunlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322989.9290822-686-257514580609457/AnsiballZ_stat.py
Nov 28 09:43:10 np0005538515.localdomain sudo[265280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:43:10 np0005538515.localdomain systemd[1]: tmp-crun.24VdHx.mount: Deactivated successfully.
Nov 28 09:43:10 np0005538515.localdomain podman[265283]: 2025-11-28 09:43:10.325766676 +0000 UTC m=+0.087213287 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 28 09:43:10 np0005538515.localdomain podman[265283]: 2025-11-28 09:43:10.337097966 +0000 UTC m=+0.098544607 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:43:10 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:43:10 np0005538515.localdomain python3.9[265282]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:10 np0005538515.localdomain sudo[265280]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:11 np0005538515.localdomain sudo[265411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmjceqjxdghouysyqabfmyiggbggqdhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322990.9439206-716-48243682858975/AnsiballZ_file.py
Nov 28 09:43:11 np0005538515.localdomain sudo[265411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:11 np0005538515.localdomain python3.9[265413]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:11 np0005538515.localdomain sudo[265411]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:12 np0005538515.localdomain sudo[265521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdhgvfqxsvydugzyajydvjuhwstcouzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322991.8783312-739-122064668596251/AnsiballZ_stat.py
Nov 28 09:43:12 np0005538515.localdomain sudo[265521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:12 np0005538515.localdomain python3.9[265523]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:12 np0005538515.localdomain sudo[265521]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:13 np0005538515.localdomain sudo[265578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjhgiueyifphjjemechauwhealcfyewe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322991.8783312-739-122064668596251/AnsiballZ_file.py
Nov 28 09:43:13 np0005538515.localdomain sudo[265578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:13 np0005538515.localdomain python3.9[265580]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:13 np0005538515.localdomain sudo[265578]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:13 np0005538515.localdomain sudo[265688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueoarkspzeeaypyhkxhuyycqctvpimlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322993.6546657-739-75987278529486/AnsiballZ_stat.py
Nov 28 09:43:13 np0005538515.localdomain sudo[265688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:14 np0005538515.localdomain python3.9[265690]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:14 np0005538515.localdomain sudo[265688]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:14 np0005538515.localdomain sudo[265745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eljqmwzskebzjdbhdofrvmjdanmdzxxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322993.6546657-739-75987278529486/AnsiballZ_file.py
Nov 28 09:43:14 np0005538515.localdomain sudo[265745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:14 np0005538515.localdomain python3.9[265747]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:14 np0005538515.localdomain sudo[265745]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:15 np0005538515.localdomain sudo[265855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muiqrqtcncpojlrebirfqxoasawpoiyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322995.4043028-809-194515892338684/AnsiballZ_file.py
Nov 28 09:43:15 np0005538515.localdomain sudo[265855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:15 np0005538515.localdomain python3.9[265857]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:15 np0005538515.localdomain sudo[265855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:16 np0005538515.localdomain sudo[265965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hepktvvxdounaxrjdaammddyxakcvqli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322996.0400805-833-129727282028267/AnsiballZ_stat.py
Nov 28 09:43:16 np0005538515.localdomain sudo[265965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:16 np0005538515.localdomain python3.9[265967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:16 np0005538515.localdomain sudo[265965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:16 np0005538515.localdomain sudo[266022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwyacrgqdtpthocojsxhpxcgugvxnvvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322996.0400805-833-129727282028267/AnsiballZ_file.py
Nov 28 09:43:16 np0005538515.localdomain sudo[266022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:17 np0005538515.localdomain python3.9[266024]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:17 np0005538515.localdomain sudo[266022]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25825 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADBE6FA0000000001030307) 
Nov 28 09:43:17 np0005538515.localdomain sudo[266132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qorehwuizfnxksgipgsisxhwnbmcysnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322997.294134-869-83976064163322/AnsiballZ_stat.py
Nov 28 09:43:17 np0005538515.localdomain sudo[266132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:17 np0005538515.localdomain python3.9[266134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:17 np0005538515.localdomain sudo[266132]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:17 np0005538515.localdomain sudo[266189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-batugrutiwtxmyaxqcxydlslnffrojhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322997.294134-869-83976064163322/AnsiballZ_file.py
Nov 28 09:43:17 np0005538515.localdomain sudo[266189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:18 np0005538515.localdomain python3.9[266191]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:18 np0005538515.localdomain sudo[266189]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:18 np0005538515.localdomain sudo[266299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eecfznhcjakafrctykzrrmyxebweveru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322998.3730073-905-233625449364983/AnsiballZ_systemd.py
Nov 28 09:43:18 np0005538515.localdomain sudo[266299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:19 np0005538515.localdomain python3.9[266301]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:43:19 np0005538515.localdomain systemd-rc-local-generator[266328]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:43:19 np0005538515.localdomain systemd-sysv-generator[266332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:19 np0005538515.localdomain sudo[266299]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538515.localdomain sudo[266447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kytavmddexhkgcbhbytkeszvneozvyee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322999.9204106-928-69525560710773/AnsiballZ_stat.py
Nov 28 09:43:20 np0005538515.localdomain sudo[266447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:20 np0005538515.localdomain python3.9[266449]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:20 np0005538515.localdomain sudo[266447]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538515.localdomain sudo[266504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weursnvlnslknwouyvuladjgafkbmeus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764322999.9204106-928-69525560710773/AnsiballZ_file.py
Nov 28 09:43:20 np0005538515.localdomain sudo[266504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:20 np0005538515.localdomain sudo[266507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:43:20 np0005538515.localdomain sudo[266507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:43:20 np0005538515.localdomain sudo[266507]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:20 np0005538515.localdomain python3.9[266506]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:20 np0005538515.localdomain sudo[266525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:43:20 np0005538515.localdomain sudo[266525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:43:20 np0005538515.localdomain sudo[266504]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:21 np0005538515.localdomain sudo[266669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kunvucyuaiffkdpinijjcfjohhqojjrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323001.1922104-964-44736121147455/AnsiballZ_stat.py
Nov 28 09:43:21 np0005538515.localdomain sudo[266669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:43:21 np0005538515.localdomain sudo[266525]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:21 np0005538515.localdomain podman[266672]: 2025-11-28 09:43:21.589324898 +0000 UTC m=+0.072085149 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Nov 28 09:43:21 np0005538515.localdomain podman[266672]: 2025-11-28 09:43:21.635877347 +0000 UTC m=+0.118637528 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:43:21 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:43:21 np0005538515.localdomain python3.9[266671]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:21 np0005538515.localdomain sudo[266669]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:21 np0005538515.localdomain sudo[266758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifyplfjmnmhykraikdftpujvdetkkrkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323001.1922104-964-44736121147455/AnsiballZ_file.py
Nov 28 09:43:21 np0005538515.localdomain sudo[266758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:22 np0005538515.localdomain python3.9[266760]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:22 np0005538515.localdomain sudo[266758]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:22 np0005538515.localdomain sudo[266778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:43:22 np0005538515.localdomain sudo[266778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:43:22 np0005538515.localdomain sudo[266778]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:22 np0005538515.localdomain sudo[266886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lykyvmzpfmarrktrgrlnjnmpwobtwjuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323002.383716-1001-85554643783873/AnsiballZ_systemd.py
Nov 28 09:43:22 np0005538515.localdomain sudo[266886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:23 np0005538515.localdomain python3.9[266888]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:43:23 np0005538515.localdomain systemd-rc-local-generator[266911]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:43:23 np0005538515.localdomain systemd-sysv-generator[266914]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: Starting Create netns directory...
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 09:43:23 np0005538515.localdomain systemd[1]: Finished Create netns directory.
Nov 28 09:43:23 np0005538515.localdomain sudo[266886]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:24 np0005538515.localdomain sudo[267037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbbdwdhamkpdlaeybeyjrwljjjopfigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323004.064742-1031-277172870379341/AnsiballZ_file.py
Nov 28 09:43:24 np0005538515.localdomain sudo[267037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:24 np0005538515.localdomain python3.9[267039]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:24 np0005538515.localdomain sudo[267037]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:25 np0005538515.localdomain sudo[267147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olbjvfibricslgpqpbykccmxrbjkqgux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323004.755007-1054-192711034019381/AnsiballZ_stat.py
Nov 28 09:43:25 np0005538515.localdomain sudo[267147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:25 np0005538515.localdomain python3.9[267149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:25 np0005538515.localdomain sudo[267147]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:25 np0005538515.localdomain sudo[267204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huoalukwuekqyjzdswpjylnwvxbojdjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323004.755007-1054-192711034019381/AnsiballZ_file.py
Nov 28 09:43:25 np0005538515.localdomain sudo[267204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:25 np0005538515.localdomain python3.9[267206]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:25 np0005538515.localdomain sudo[267204]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:26 np0005538515.localdomain sudo[267314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulobwosbwieseyhjebxrhwwjopxirklg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323006.2072103-1097-138510521958579/AnsiballZ_file.py
Nov 28 09:43:26 np0005538515.localdomain sudo[267314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:26 np0005538515.localdomain python3.9[267316]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:43:26 np0005538515.localdomain sudo[267314]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:27 np0005538515.localdomain sudo[267424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfajkqkcdefkpnhajgrobkbuhpgnmuiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323006.9457521-1120-141796922822462/AnsiballZ_stat.py
Nov 28 09:43:27 np0005538515.localdomain sudo[267424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:27 np0005538515.localdomain python3.9[267426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:27 np0005538515.localdomain sudo[267424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:43:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:43:27 np0005538515.localdomain sudo[267481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-infvhaorphxyspsziarxiewxmbonajjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323006.9457521-1120-141796922822462/AnsiballZ_file.py
Nov 28 09:43:27 np0005538515.localdomain sudo[267481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:27 np0005538515.localdomain python3.9[267483]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.samh7goi recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:27 np0005538515.localdomain sudo[267481]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:28 np0005538515.localdomain sudo[267591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyfcqjnnckejkxsaciqbqujigrecoxkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323008.3415577-1157-194344447648762/AnsiballZ_file.py
Nov 28 09:43:28 np0005538515.localdomain sudo[267591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:28 np0005538515.localdomain python3.9[267593]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:28 np0005538515.localdomain sudo[267591]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:43:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:43:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:43:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:43:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:43:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17701 "" "Go-http-client/1.1"
Nov 28 09:43:29 np0005538515.localdomain sudo[267701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdmdrrdpvolsesmocnuksvjjfxhhirta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323009.0592463-1180-122540970549764/AnsiballZ_stat.py
Nov 28 09:43:29 np0005538515.localdomain sudo[267701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:29 np0005538515.localdomain sudo[267701]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:29 np0005538515.localdomain sudo[267758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgsrndoxhpdrfxamftlsexjfmbmqmvjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323009.0592463-1180-122540970549764/AnsiballZ_file.py
Nov 28 09:43:29 np0005538515.localdomain sudo[267758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:29 np0005538515.localdomain sudo[267758]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:30 np0005538515.localdomain sudo[267868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkrtvvwgeiittajuhgemkhnhtdpwernl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323010.4698553-1223-250285294729312/AnsiballZ_container_config_data.py
Nov 28 09:43:30 np0005538515.localdomain sudo[267868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:43:30 np0005538515.localdomain podman[267871]: 2025-11-28 09:43:30.982860587 +0000 UTC m=+0.092741228 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 09:43:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:43:31 np0005538515.localdomain podman[267872]: 2025-11-28 09:43:31.038632552 +0000 UTC m=+0.148427380 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:43:31 np0005538515.localdomain podman[267871]: 2025-11-28 09:43:31.04604958 +0000 UTC m=+0.155930231 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:43:31 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:43:31 np0005538515.localdomain python3.9[267870]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 09:43:31 np0005538515.localdomain sudo[267868]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:31 np0005538515.localdomain podman[267909]: 2025-11-28 09:43:31.125683863 +0000 UTC m=+0.126509493 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:43:31 np0005538515.localdomain podman[267909]: 2025-11-28 09:43:31.133711651 +0000 UTC m=+0.134537261 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:43:31 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:43:31 np0005538515.localdomain podman[267873]: 2025-11-28 09:43:31.205705596 +0000 UTC m=+0.310022265 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 09:43:31 np0005538515.localdomain podman[267873]: 2025-11-28 09:43:31.213529518 +0000 UTC m=+0.317846197 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:43:31 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:43:31 np0005538515.localdomain podman[267872]: 2025-11-28 09:43:31.257861419 +0000 UTC m=+0.367656187 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:43:31 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:43:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33657 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC20E90000000001030307) 
Nov 28 09:43:32 np0005538515.localdomain sudo[268060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cufthzufoetrwmidzkfrfdujmlchvwjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323012.1535537-1249-239058207762477/AnsiballZ_container_config_hash.py
Nov 28 09:43:32 np0005538515.localdomain sudo[268060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:32 np0005538515.localdomain python3.9[268062]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:43:32 np0005538515.localdomain sudo[268060]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33658 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC24FB0000000001030307) 
Nov 28 09:43:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25826 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC26FA0000000001030307) 
Nov 28 09:43:34 np0005538515.localdomain sudo[268170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxnrajrfvuyyvanjedqitlssgmgasczn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323013.157157-1277-260589199396762/AnsiballZ_podman_container_info.py
Nov 28 09:43:34 np0005538515.localdomain sudo[268170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:43:34 np0005538515.localdomain podman[268173]: 2025-11-28 09:43:34.446009054 +0000 UTC m=+0.078169168 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:43:34 np0005538515.localdomain podman[268173]: 2025-11-28 09:43:34.45267011 +0000 UTC m=+0.084830204 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:43:34 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:43:34 np0005538515.localdomain python3.9[268172]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 09:43:34 np0005538515.localdomain sudo[268170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33659 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC2CFB0000000001030307) 
Nov 28 09:43:36 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31233 DF PROTO=TCP SPT=40914 DPT=9102 SEQ=318385806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC30FA0000000001030307) 
Nov 28 09:43:38 np0005538515.localdomain sudo[268329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhjzwyoccjavuwliewwbpvapucglgnap ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764323018.5641522-1317-36754344666475/AnsiballZ_edpm_container_manage.py
Nov 28 09:43:38 np0005538515.localdomain sudo[268329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33660 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC3CBA0000000001030307) 
Nov 28 09:43:39 np0005538515.localdomain python3[268331]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:43:39 np0005538515.localdomain python3[268331]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f",
                                                                    "Digest": "sha256:6296d2d95faaeb90443ee98443b39aa81b5152414f9542335d72711bb15fefdd",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6296d2d95faaeb90443ee98443b39aa81b5152414f9542335d72711bb15fefdd"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:12:42.268223466Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 249482220,
                                                                    "VirtualSize": 249482220,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:135e1f5eea0bd6ac73fc43c122f58d5ed97cb8a56365c4a958c72d470055986b"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:12:01.186918094Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:12:42.000584504Z",
                                                                              "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:12:43.229019379Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 09:43:39 np0005538515.localdomain sudo[268329]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:40 np0005538515.localdomain sudo[268501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvbujqyhnctllmxgmvycbnvbvmxdyhah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323019.8888693-1340-180687972637753/AnsiballZ_stat.py
Nov 28 09:43:40 np0005538515.localdomain sudo[268501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:40 np0005538515.localdomain python3.9[268503]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:40 np0005538515.localdomain sudo[268501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:43:40 np0005538515.localdomain podman[268584]: 2025-11-28 09:43:40.974494827 +0000 UTC m=+0.073811343 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:43:40 np0005538515.localdomain sudo[268625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtnlorsthvojlpoaaxpsdxmximdgsusi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323020.745732-1367-76407086107525/AnsiballZ_file.py
Nov 28 09:43:40 np0005538515.localdomain sudo[268625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:40 np0005538515.localdomain podman[268584]: 2025-11-28 09:43:40.991469982 +0000 UTC m=+0.090786518 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:43:41 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:43:41 np0005538515.localdomain python3.9[268634]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:41 np0005538515.localdomain sudo[268625]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:41 np0005538515.localdomain sudo[268687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsnnctqisbivirtjmrculvdprzulbixl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323020.745732-1367-76407086107525/AnsiballZ_stat.py
Nov 28 09:43:41 np0005538515.localdomain sudo[268687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:41 np0005538515.localdomain python3.9[268689]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:41 np0005538515.localdomain sudo[268687]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:42 np0005538515.localdomain sudo[268796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nygqwgoqkweltaovijdgkhnrowioapdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323021.6852741-1367-158546594071798/AnsiballZ_copy.py
Nov 28 09:43:42 np0005538515.localdomain sudo[268796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:42 np0005538515.localdomain python3.9[268798]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764323021.6852741-1367-158546594071798/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:42 np0005538515.localdomain sudo[268796]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:43 np0005538515.localdomain sudo[268851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzmmdlrkpkmerqtfcnapjrgwjznixxhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323021.6852741-1367-158546594071798/AnsiballZ_systemd.py
Nov 28 09:43:43 np0005538515.localdomain sudo[268851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:43 np0005538515.localdomain python3.9[268853]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:43:43 np0005538515.localdomain sudo[268851]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:44 np0005538515.localdomain python3.9[268963]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:43:45 np0005538515.localdomain sudo[269071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhzciwhcnxevragpnhcmemxmongslejz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323025.5865815-1469-81620284824666/AnsiballZ_file.py
Nov 28 09:43:45 np0005538515.localdomain sudo[269071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:46 np0005538515.localdomain python3.9[269073]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:46 np0005538515.localdomain sudo[269071]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:47 np0005538515.localdomain sudo[269181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwcdnklpjustzbejfneynxwjlbtuesdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323026.8497145-1505-92668169638583/AnsiballZ_file.py
Nov 28 09:43:47 np0005538515.localdomain sudo[269181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:47 np0005538515.localdomain python3.9[269183]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 09:43:47 np0005538515.localdomain sudo[269181]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33661 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC5CFA0000000001030307) 
Nov 28 09:43:47 np0005538515.localdomain sudo[269291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccatoqvlflrwdanyuinsssxorbbofbcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323027.5208185-1529-65338315845371/AnsiballZ_modprobe.py
Nov 28 09:43:47 np0005538515.localdomain sudo[269291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:47 np0005538515.localdomain python3.9[269293]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 09:43:48 np0005538515.localdomain sudo[269291]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:48 np0005538515.localdomain sudo[269401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlwoassylecwjekriljtamkkkkrxnppo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323028.2397683-1553-70930738958771/AnsiballZ_stat.py
Nov 28 09:43:48 np0005538515.localdomain sudo[269401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:48 np0005538515.localdomain python3.9[269403]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:43:48 np0005538515.localdomain sudo[269401]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:48 np0005538515.localdomain sudo[269458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugtakkxclhghcrxevzddivtlurgwhxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323028.2397683-1553-70930738958771/AnsiballZ_file.py
Nov 28 09:43:48 np0005538515.localdomain sudo[269458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:49 np0005538515.localdomain python3.9[269460]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:49 np0005538515.localdomain sudo[269458]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:49 np0005538515.localdomain sudo[269568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-watkprcehxocjixetsxcabgojqhpzvuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323029.5258021-1592-169879740178033/AnsiballZ_lineinfile.py
Nov 28 09:43:49 np0005538515.localdomain sudo[269568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:49 np0005538515.localdomain python3.9[269570]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:49 np0005538515.localdomain sudo[269568]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:50 np0005538515.localdomain sudo[269678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qahswcfhbfslipsjgqujdksxkwczjblv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323030.3287988-1618-106262377458653/AnsiballZ_dnf.py
Nov 28 09:43:50 np0005538515.localdomain sudo[269678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:43:50.823 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:43:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:43:50.824 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:43:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:43:50.824 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:43:50 np0005538515.localdomain python3.9[269680]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 09:43:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:43:51 np0005538515.localdomain systemd[1]: tmp-crun.ykMeot.mount: Deactivated successfully.
Nov 28 09:43:51 np0005538515.localdomain podman[269683]: 2025-11-28 09:43:51.984893955 +0000 UTC m=+0.088567040 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Nov 28 09:43:51 np0005538515.localdomain podman[269683]: 2025-11-28 09:43:51.997540546 +0000 UTC m=+0.101213610 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:43:52 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.162 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.162 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.163 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.163 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.163 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.608 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.803 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.805 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12432MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.806 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.806 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.874 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.874 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:43:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:52.900 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:43:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:53.352 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:43:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:53.358 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:43:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:53.462 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:43:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:53.464 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:43:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:53.465 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:43:54 np0005538515.localdomain sudo[269678]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:54.461 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:54.462 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:54.462 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:54 np0005538515.localdomain python3.9[269853]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 09:43:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:55.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:55.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:43:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:55.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:43:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:55.091 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:43:55 np0005538515.localdomain sudo[269965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxkjpztfsiowtpqdcqhywpjkroimxxup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323035.377597-1672-239629759082045/AnsiballZ_file.py
Nov 28 09:43:55 np0005538515.localdomain sudo[269965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:55 np0005538515.localdomain python3.9[269967]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:43:55 np0005538515.localdomain sudo[269965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:56.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:56 np0005538515.localdomain sudo[270075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqbgagaxdsbfvwzutxqjgzyldxuvmkaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323036.4397154-1704-275298450306154/AnsiballZ_systemd_service.py
Nov 28 09:43:56 np0005538515.localdomain sudo[270075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:43:57 np0005538515.localdomain python3.9[270077]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:43:57 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:57.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:43:57 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:57.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:43:57 np0005538515.localdomain systemd-rc-local-generator[270101]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:43:57 np0005538515.localdomain systemd-sysv-generator[270106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:43:57 np0005538515.localdomain sudo[270075]: pam_unix(sudo:session): session closed for user root
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:43:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:43:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:43:58 np0005538515.localdomain python3.9[270221]: ansible-ansible.builtin.service_facts Invoked
Nov 28 09:43:58 np0005538515.localdomain network[270238]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 09:43:58 np0005538515.localdomain network[270239]: 'network-scripts' will be removed from distribution in near future.
Nov 28 09:43:58 np0005538515.localdomain network[270240]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 09:43:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:43:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:43:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:43:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:43:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:43:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17699 "" "Go-http-client/1.1"
Nov 28 09:43:59 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:43:59.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:00 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:00.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.616 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:44:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:44:01 np0005538515.localdomain podman[270272]: 2025-11-28 09:44:01.248170507 +0000 UTC m=+0.088891269 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:44:01 np0005538515.localdomain podman[270272]: 2025-11-28 09:44:01.264476161 +0000 UTC m=+0.105196913 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125)
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:44:01 np0005538515.localdomain podman[270317]: 2025-11-28 09:44:01.379770055 +0000 UTC m=+0.075017199 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:44:01 np0005538515.localdomain podman[270306]: 2025-11-28 09:44:01.338682226 +0000 UTC m=+0.076332182 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:44:01 np0005538515.localdomain podman[270273]: 2025-11-28 09:44:01.360497519 +0000 UTC m=+0.198315011 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:44:01 np0005538515.localdomain podman[270306]: 2025-11-28 09:44:01.421364561 +0000 UTC m=+0.159014527 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:44:01 np0005538515.localdomain podman[270273]: 2025-11-28 09:44:01.495967458 +0000 UTC m=+0.333784880 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:44:01 np0005538515.localdomain podman[270317]: 2025-11-28 09:44:01.547768589 +0000 UTC m=+0.243015723 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:44:01 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:44:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28298 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC96190000000001030307) 
Nov 28 09:44:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28299 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC9A3A0000000001030307) 
Nov 28 09:44:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33662 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADC9CFA0000000001030307) 
Nov 28 09:44:03 np0005538515.localdomain sudo[270556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsqeiqitfuhwdlzfupetzowdpleovybk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323043.5531242-1761-136379896322471/AnsiballZ_systemd_service.py
Nov 28 09:44:03 np0005538515.localdomain sudo[270556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:04 np0005538515.localdomain python3.9[270558]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:04 np0005538515.localdomain sudo[270556]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:04 np0005538515.localdomain sudo[270667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxrtbzwffzosnoowmoewjpxnvhayjohu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323044.3070405-1761-145316832192927/AnsiballZ_systemd_service.py
Nov 28 09:44:04 np0005538515.localdomain sudo[270667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:44:04 np0005538515.localdomain podman[270670]: 2025-11-28 09:44:04.731207539 +0000 UTC m=+0.081279804 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:44:04 np0005538515.localdomain podman[270670]: 2025-11-28 09:44:04.74355161 +0000 UTC m=+0.093623805 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:44:04 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:44:04 np0005538515.localdomain python3.9[270669]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28300 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADCA23A0000000001030307) 
Nov 28 09:44:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25827 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=1919062088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADCA4FA0000000001030307) 
Nov 28 09:44:05 np0005538515.localdomain sudo[270667]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:06 np0005538515.localdomain sudo[270801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njrjmecnivbpravbymgozxxbtarujfae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323046.1128857-1761-234662309488695/AnsiballZ_systemd_service.py
Nov 28 09:44:06 np0005538515.localdomain sudo[270801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:06 np0005538515.localdomain python3.9[270803]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:06 np0005538515.localdomain sudo[270801]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:07 np0005538515.localdomain sudo[270912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iypnpovmrskgfbeisahpiinggoushsmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323046.8430717-1761-75287565219894/AnsiballZ_systemd_service.py
Nov 28 09:44:07 np0005538515.localdomain sudo[270912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:07 np0005538515.localdomain python3.9[270914]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:08 np0005538515.localdomain sudo[270912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:08 np0005538515.localdomain sudo[271023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaufbgsodkyorstddzgftqexbebvcick ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323048.5663629-1761-273045086348895/AnsiballZ_systemd_service.py
Nov 28 09:44:08 np0005538515.localdomain sudo[271023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28301 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADCB1FA0000000001030307) 
Nov 28 09:44:09 np0005538515.localdomain python3.9[271025]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:09 np0005538515.localdomain sudo[271023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:09 np0005538515.localdomain sudo[271134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqiaveydbohumfhkjgnylfjmpldfqdkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323049.3702173-1761-107138604331580/AnsiballZ_systemd_service.py
Nov 28 09:44:09 np0005538515.localdomain sudo[271134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:09 np0005538515.localdomain python3.9[271136]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:09 np0005538515.localdomain sudo[271134]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:10 np0005538515.localdomain sudo[271245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mannqhorstgsxcaakcidbfcaysfhcwhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323050.0883114-1761-204266204952500/AnsiballZ_systemd_service.py
Nov 28 09:44:10 np0005538515.localdomain sudo[271245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:10 np0005538515.localdomain python3.9[271247]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:11 np0005538515.localdomain sudo[271245]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:44:11 np0005538515.localdomain podman[271283]: 2025-11-28 09:44:11.981154517 +0000 UTC m=+0.083009538 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 09:44:11 np0005538515.localdomain podman[271283]: 2025-11-28 09:44:11.994830638 +0000 UTC m=+0.096685649 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:44:12 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:44:12 np0005538515.localdomain sudo[271374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukgbkvrooyugpcqucyyksgmvqthrqdfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323051.8590415-1761-216849549906598/AnsiballZ_systemd_service.py
Nov 28 09:44:12 np0005538515.localdomain sudo[271374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:12 np0005538515.localdomain python3.9[271376]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:44:12 np0005538515.localdomain sudo[271374]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:13 np0005538515.localdomain sudo[271485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asmnmfoorjbrpsuhqkcyztmpikkacntv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323052.9878051-1938-57600836456543/AnsiballZ_file.py
Nov 28 09:44:13 np0005538515.localdomain sudo[271485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:13 np0005538515.localdomain python3.9[271487]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:13 np0005538515.localdomain sudo[271485]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:13 np0005538515.localdomain sudo[271595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rozcbjgnfsmzupdbgiowttgauvjucqmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323053.6575713-1938-160578538003976/AnsiballZ_file.py
Nov 28 09:44:13 np0005538515.localdomain sudo[271595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:14 np0005538515.localdomain python3.9[271597]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:14 np0005538515.localdomain sudo[271595]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:14 np0005538515.localdomain sudo[271705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzxdqoflpufuuacbgathbbgrmvudsfso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323054.2432709-1938-167254065676374/AnsiballZ_file.py
Nov 28 09:44:14 np0005538515.localdomain sudo[271705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:14 np0005538515.localdomain python3.9[271707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:14 np0005538515.localdomain sudo[271705]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:15 np0005538515.localdomain sudo[271815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfzybefvkmwhcbavuwiygzckxtjycgvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323054.7920265-1938-188589771410651/AnsiballZ_file.py
Nov 28 09:44:15 np0005538515.localdomain sudo[271815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:15 np0005538515.localdomain python3.9[271817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:15 np0005538515.localdomain sudo[271815]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:16 np0005538515.localdomain sudo[271925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsvsdftageatcqttywrfhficdlmtsuts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323056.164642-1938-199477350812909/AnsiballZ_file.py
Nov 28 09:44:16 np0005538515.localdomain sudo[271925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:16 np0005538515.localdomain python3.9[271927]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:16 np0005538515.localdomain sudo[271925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:17 np0005538515.localdomain sudo[272035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djkaoxqgqwdjcnsstsejdwlbvonptews ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323056.7754095-1938-217082272036518/AnsiballZ_file.py
Nov 28 09:44:17 np0005538515.localdomain sudo[272035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:17 np0005538515.localdomain python3.9[272037]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:17 np0005538515.localdomain sudo[272035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28302 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADCD2FA0000000001030307) 
Nov 28 09:44:18 np0005538515.localdomain sudo[272145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brpkibkbpwhmvhcoerhoxwrhbdhwriju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323057.3699281-1938-185447088858904/AnsiballZ_file.py
Nov 28 09:44:18 np0005538515.localdomain sudo[272145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:18 np0005538515.localdomain python3.9[272147]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:18 np0005538515.localdomain sudo[272145]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:18 np0005538515.localdomain sudo[272255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kaagftbboctujnlexkmslbwdvfpekata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323058.6578815-1938-12396374281047/AnsiballZ_file.py
Nov 28 09:44:18 np0005538515.localdomain sudo[272255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:19 np0005538515.localdomain python3.9[272257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:19 np0005538515.localdomain sudo[272255]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:19 np0005538515.localdomain sudo[272365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnnfqyifcvimcisidzxjbbngkwoywksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323059.4407983-2108-63884881277782/AnsiballZ_file.py
Nov 28 09:44:19 np0005538515.localdomain sudo[272365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:19 np0005538515.localdomain python3.9[272367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:19 np0005538515.localdomain sudo[272365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:20 np0005538515.localdomain sudo[272475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfeeregwljazdbfmkivytlraailskwen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323060.0001209-2108-23743246711216/AnsiballZ_file.py
Nov 28 09:44:20 np0005538515.localdomain sudo[272475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:20 np0005538515.localdomain python3.9[272477]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:20 np0005538515.localdomain sudo[272475]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:20 np0005538515.localdomain sudo[272585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dumosciqonmrxovfodneufhdhjgezskv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323060.5651588-2108-262468499219221/AnsiballZ_file.py
Nov 28 09:44:20 np0005538515.localdomain sudo[272585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:21 np0005538515.localdomain python3.9[272587]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:21 np0005538515.localdomain sudo[272585]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:21 np0005538515.localdomain sudo[272695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srkqnfbsbwgjgfhxeijfexlsoyjbqaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323061.1516244-2108-7419192533733/AnsiballZ_file.py
Nov 28 09:44:21 np0005538515.localdomain sudo[272695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:21 np0005538515.localdomain python3.9[272697]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:21 np0005538515.localdomain sudo[272695]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:21 np0005538515.localdomain sudo[272805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvqwapzgztgevioiwkgqoahchocxogps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323061.717788-2108-212997219395442/AnsiballZ_file.py
Nov 28 09:44:21 np0005538515.localdomain sudo[272805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:22 np0005538515.localdomain python3.9[272807]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:22 np0005538515.localdomain sudo[272805]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:22 np0005538515.localdomain sudo[272879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:44:22 np0005538515.localdomain sudo[272879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:44:22 np0005538515.localdomain sudo[272879]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:22 np0005538515.localdomain sudo[272940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiyaqqwrtwrpqphxfbkeqvyxzmxdkbfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323062.2824125-2108-252818751299084/AnsiballZ_file.py
Nov 28 09:44:22 np0005538515.localdomain sudo[272940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:22 np0005538515.localdomain sudo[272939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:44:22 np0005538515.localdomain sudo[272939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:22 np0005538515.localdomain podman[272922]: 2025-11-28 09:44:22.566669336 +0000 UTC m=+0.097758624 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Nov 28 09:44:22 np0005538515.localdomain podman[272922]: 2025-11-28 09:44:22.575820479 +0000 UTC m=+0.106909767 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal)
Nov 28 09:44:22 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:44:22 np0005538515.localdomain python3.9[272962]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:22 np0005538515.localdomain sudo[272940]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538515.localdomain sudo[273124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqpypsfpvmcqksbcvmbhinxxjhoueogw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323062.809037-2108-204346385953391/AnsiballZ_file.py
Nov 28 09:44:23 np0005538515.localdomain sudo[273124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:23 np0005538515.localdomain python3.9[273141]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:23 np0005538515.localdomain podman[273155]: 2025-11-28 09:44:23.311038989 +0000 UTC m=+0.087336371 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-type=git, RELEASE=main, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Nov 28 09:44:23 np0005538515.localdomain sudo[273124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538515.localdomain podman[273155]: 2025-11-28 09:44:23.427484149 +0000 UTC m=+0.203781501 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:44:23 np0005538515.localdomain sudo[272939]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538515.localdomain sudo[273295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:44:23 np0005538515.localdomain sudo[273295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:23 np0005538515.localdomain sudo[273295]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:23 np0005538515.localdomain sudo[273330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:44:23 np0005538515.localdomain sudo[273330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:23 np0005538515.localdomain sudo[273366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqrxlbnqisdrbucznyazkwmwxdopzdkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323063.4178007-2108-275801048877193/AnsiballZ_file.py
Nov 28 09:44:23 np0005538515.localdomain sudo[273366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:24 np0005538515.localdomain python3.9[273369]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:44:24 np0005538515.localdomain sudo[273366]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:24 np0005538515.localdomain sudo[273330]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:24 np0005538515.localdomain sudo[273510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jiagrzsprpwhnntulkrciwcvdanympji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323064.5315418-2284-49641825253681/AnsiballZ_command.py
Nov 28 09:44:24 np0005538515.localdomain sudo[273510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:24 np0005538515.localdomain python3.9[273512]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:25 np0005538515.localdomain sudo[273510]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:25 np0005538515.localdomain sudo[273532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:44:25 np0005538515.localdomain sudo[273532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:44:25 np0005538515.localdomain sudo[273532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:27 np0005538515.localdomain python3.9[273640]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:44:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:44:27 np0005538515.localdomain sudo[273748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pthucdgilimlqnpatiyvdwhphxnxxqfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323067.6313968-2337-192182469882293/AnsiballZ_systemd_service.py
Nov 28 09:44:27 np0005538515.localdomain sudo[273748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:28 np0005538515.localdomain python3.9[273750]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:44:28 np0005538515.localdomain systemd-rc-local-generator[273775]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:44:28 np0005538515.localdomain systemd-sysv-generator[273778]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:44:28 np0005538515.localdomain sudo[273748]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:44:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:44:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:44:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:44:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:44:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17684 "" "Go-http-client/1.1"
Nov 28 09:44:29 np0005538515.localdomain sudo[273894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zehocxkpklafosrolwxxfpqmcbmmpaxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323069.2388217-2361-174520290857094/AnsiballZ_command.py
Nov 28 09:44:29 np0005538515.localdomain sudo[273894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:29 np0005538515.localdomain python3.9[273896]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:29 np0005538515.localdomain sudo[273894]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:30 np0005538515.localdomain sudo[274005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuoqpsxpihwezyvofcmulxsqutskysky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323069.9172328-2361-113428992491330/AnsiballZ_command.py
Nov 28 09:44:30 np0005538515.localdomain sudo[274005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:30 np0005538515.localdomain python3.9[274007]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:30 np0005538515.localdomain sudo[274005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:30 np0005538515.localdomain sudo[274116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lirhninohzuxqybbimxvrnknotqlyykl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323070.5435529-2361-228114990570754/AnsiballZ_command.py
Nov 28 09:44:30 np0005538515.localdomain sudo[274116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:31 np0005538515.localdomain python3.9[274118]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:31 np0005538515.localdomain sudo[274116]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:31 np0005538515.localdomain sudo[274227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kojodppqvtyxxnnrarnptzjyllpnszmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323071.146785-2361-177090298474273/AnsiballZ_command.py
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:44:31 np0005538515.localdomain sudo[274227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:31 np0005538515.localdomain podman[274229]: 2025-11-28 09:44:31.54890356 +0000 UTC m=+0.086562437 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:44:31 np0005538515.localdomain podman[274229]: 2025-11-28 09:44:31.559841519 +0000 UTC m=+0.097500396 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: tmp-crun.FFBhoo.mount: Deactivated successfully.
Nov 28 09:44:31 np0005538515.localdomain podman[274249]: 2025-11-28 09:44:31.658500878 +0000 UTC m=+0.087937290 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:44:31 np0005538515.localdomain python3.9[274230]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:31 np0005538515.localdomain sudo[274227]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:31 np0005538515.localdomain podman[274269]: 2025-11-28 09:44:31.723328032 +0000 UTC m=+0.082265494 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:44:31 np0005538515.localdomain podman[274249]: 2025-11-28 09:44:31.741408711 +0000 UTC m=+0.170845163 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:44:31 np0005538515.localdomain podman[274269]: 2025-11-28 09:44:31.764452724 +0000 UTC m=+0.123390246 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:44:31 np0005538515.localdomain podman[274250]: 2025-11-28 09:44:31.698729452 +0000 UTC m=+0.126215083 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:44:31 np0005538515.localdomain podman[274250]: 2025-11-28 09:44:31.833510418 +0000 UTC m=+0.260996089 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:44:31 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:44:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18462 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD0B490000000001030307) 
Nov 28 09:44:32 np0005538515.localdomain sudo[274421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxjwfjcfehelcrgbqjronmnpjxcghfqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323071.8494532-2361-164856014464114/AnsiballZ_command.py
Nov 28 09:44:32 np0005538515.localdomain sudo[274421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:32 np0005538515.localdomain python3.9[274423]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:32 np0005538515.localdomain sudo[274421]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:32 np0005538515.localdomain sudo[274532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onztdfrjjpkduuhnyjcjyoqzrbrhmfrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323072.4741786-2361-215019793498219/AnsiballZ_command.py
Nov 28 09:44:32 np0005538515.localdomain sudo[274532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:32 np0005538515.localdomain python3.9[274534]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:32 np0005538515.localdomain sudo[274532]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18463 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD0F3B0000000001030307) 
Nov 28 09:44:33 np0005538515.localdomain sudo[274643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwfxlgbvgsskiegdxfunbdyzpvoazogq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323073.051446-2361-167141251579636/AnsiballZ_command.py
Nov 28 09:44:33 np0005538515.localdomain sudo[274643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:33 np0005538515.localdomain python3.9[274645]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:33 np0005538515.localdomain sudo[274643]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:33 np0005538515.localdomain sudo[274754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udezoxaimmqgdrznbwkhnbnfhfkpuvwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323073.6421099-2361-89677536482499/AnsiballZ_command.py
Nov 28 09:44:33 np0005538515.localdomain sudo[274754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:34 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28303 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD12FA0000000001030307) 
Nov 28 09:44:34 np0005538515.localdomain python3.9[274756]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:44:34 np0005538515.localdomain sudo[274754]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:44:34 np0005538515.localdomain podman[274775]: 2025-11-28 09:44:34.95945728 +0000 UTC m=+0.065476125 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:44:34 np0005538515.localdomain podman[274775]: 2025-11-28 09:44:34.996597648 +0000 UTC m=+0.102616433 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:44:35 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:44:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18464 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD173A0000000001030307) 
Nov 28 09:44:36 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33663 DF PROTO=TCP SPT=43764 DPT=9102 SEQ=419132269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD1AFB0000000001030307) 
Nov 28 09:44:36 np0005538515.localdomain sudo[274887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udfwfyypedopfitnhwtazevhneteqegc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323076.6787004-2568-252234110708746/AnsiballZ_file.py
Nov 28 09:44:36 np0005538515.localdomain sudo[274887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:37 np0005538515.localdomain python3.9[274889]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:37 np0005538515.localdomain sudo[274887]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:37 np0005538515.localdomain sudo[274997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slvknoihnyrgaqzzwtpolptgwwjgvtlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323077.2770834-2568-115344572634726/AnsiballZ_file.py
Nov 28 09:44:37 np0005538515.localdomain sudo[274997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:37 np0005538515.localdomain python3.9[274999]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:37 np0005538515.localdomain sudo[274997]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:38 np0005538515.localdomain sudo[275107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvmqyosdtbywtrfmbovbeltfoqykjzjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323077.8512745-2568-97579407515430/AnsiballZ_file.py
Nov 28 09:44:38 np0005538515.localdomain sudo[275107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:38 np0005538515.localdomain python3.9[275109]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:38 np0005538515.localdomain sudo[275107]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:38 np0005538515.localdomain sudo[275217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pweahdtawyefkotejwcfbnzucmwpljwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323078.5710888-2634-202654976683585/AnsiballZ_file.py
Nov 28 09:44:38 np0005538515.localdomain sudo[275217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:39 np0005538515.localdomain python3.9[275219]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:39 np0005538515.localdomain sudo[275217]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18465 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD26FA0000000001030307) 
Nov 28 09:44:40 np0005538515.localdomain sudo[275327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppgvnbgshhvsojvifhizytcllfwptstr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323079.1896594-2634-178408227946198/AnsiballZ_file.py
Nov 28 09:44:40 np0005538515.localdomain sudo[275327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:40 np0005538515.localdomain python3.9[275329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:40 np0005538515.localdomain sudo[275327]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:41 np0005538515.localdomain sudo[275437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esmeqkszjqjorculqvshzzznsjrvxfov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323080.7687159-2634-46892087831404/AnsiballZ_file.py
Nov 28 09:44:41 np0005538515.localdomain sudo[275437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:41 np0005538515.localdomain python3.9[275439]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:41 np0005538515.localdomain sudo[275437]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:42 np0005538515.localdomain sudo[275547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbwalipbaslfuyefsnxjgyxjaksutsfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323081.502205-2634-98033577161110/AnsiballZ_file.py
Nov 28 09:44:42 np0005538515.localdomain sudo[275547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:44:42 np0005538515.localdomain podman[275550]: 2025-11-28 09:44:42.640187766 +0000 UTC m=+0.084817353 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 28 09:44:42 np0005538515.localdomain podman[275550]: 2025-11-28 09:44:42.680586745 +0000 UTC m=+0.125216332 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Nov 28 09:44:42 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:44:42 np0005538515.localdomain python3.9[275549]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:42 np0005538515.localdomain sudo[275547]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:43 np0005538515.localdomain sudo[275675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdncaekomelhcqcpzohflzeitxwtcwlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323082.911991-2634-225456497171807/AnsiballZ_file.py
Nov 28 09:44:43 np0005538515.localdomain sudo[275675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:43 np0005538515.localdomain python3.9[275677]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:43 np0005538515.localdomain sudo[275675]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:43 np0005538515.localdomain sudo[275785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkvexikuxypmcnbjpczsoqfarlebufxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323083.493079-2634-132973486773904/AnsiballZ_file.py
Nov 28 09:44:43 np0005538515.localdomain sudo[275785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:43 np0005538515.localdomain python3.9[275787]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:43 np0005538515.localdomain sudo[275785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:44 np0005538515.localdomain sudo[275895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evkpmdmnnxfibrxypeaoxophldkyhbhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323084.1089869-2634-275205077153310/AnsiballZ_file.py
Nov 28 09:44:44 np0005538515.localdomain sudo[275895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:44 np0005538515.localdomain python3.9[275897]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:44 np0005538515.localdomain sudo[275895]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18466 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD46FA0000000001030307) 
Nov 28 09:44:50 np0005538515.localdomain sudo[276005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clveurvollzcbincukhxjkjbisvrkrjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323089.979455-2959-252052524393921/AnsiballZ_getent.py
Nov 28 09:44:50 np0005538515.localdomain sudo[276005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:44:50 np0005538515.localdomain python3.9[276007]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 09:44:50 np0005538515.localdomain sudo[276005]: pam_unix(sudo:session): session closed for user root
Nov 28 09:44:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:44:50.824 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:44:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:44:50.824 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:44:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:44:50.825 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:44:51 np0005538515.localdomain sshd[276026]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:44:51 np0005538515.localdomain sshd[276026]: Accepted publickey for zuul from 192.168.122.30 port 49554 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:44:51 np0005538515.localdomain systemd-logind[763]: New session 60 of user zuul.
Nov 28 09:44:51 np0005538515.localdomain systemd[1]: Started Session 60 of User zuul.
Nov 28 09:44:51 np0005538515.localdomain sshd[276026]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:44:51 np0005538515.localdomain sshd[276029]: Received disconnect from 192.168.122.30 port 49554:11: disconnected by user
Nov 28 09:44:51 np0005538515.localdomain sshd[276029]: Disconnected from user zuul 192.168.122.30 port 49554
Nov 28 09:44:51 np0005538515.localdomain sshd[276026]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:44:51 np0005538515.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Nov 28 09:44:51 np0005538515.localdomain systemd-logind[763]: Session 60 logged out. Waiting for processes to exit.
Nov 28 09:44:51 np0005538515.localdomain systemd-logind[763]: Removed session 60.
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.069 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.092 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.117 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.117 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.117 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.118 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.118 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.583 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.812 228501 WARNING nova.virt.libvirt.driver [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.815 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12530MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.815 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:44:52 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:52.816 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:44:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:44:52 np0005538515.localdomain podman[276069]: 2025-11-28 09:44:52.975901713 +0000 UTC m=+0.083175651 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:44:52 np0005538515.localdomain podman[276069]: 2025-11-28 09:44:52.994607692 +0000 UTC m=+0.101881590 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, version=9.6, container_name=openstack_network_exporter)
Nov 28 09:44:53 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.025 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.025 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.042 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.476 228501 DEBUG oslo_concurrency.processutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.480 228501 DEBUG nova.compute.provider_tree [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.502 228501 DEBUG nova.scheduler.client.report [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.503 228501 DEBUG nova.compute.resource_tracker [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:44:53 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:53.504 228501 DEBUG oslo_concurrency.lockutils [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:44:53 np0005538515.localdomain python3.9[276199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:54 np0005538515.localdomain python3.9[276287]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323093.104491-3040-151366628889114/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:54.503 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:54 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:54.504 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:54 np0005538515.localdomain python3.9[276395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:55 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:55.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:55 np0005538515.localdomain python3.9[276450]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:56.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:56.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:44:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:56.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:44:56 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:56.097 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:44:56 np0005538515.localdomain python3.9[276558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:56 np0005538515.localdomain python3.9[276644]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323095.6960287-3040-276943246522897/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:57 np0005538515.localdomain python3.9[276752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:44:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:44:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:44:57 np0005538515.localdomain python3.9[276838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323096.748872-3040-34235935653247/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=ea203e550d6f82354ff814f038f2bcabd98eed86 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:58 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:58.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:58 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:58.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:44:58 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:44:58.074 228501 DEBUG nova.compute.manager [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:44:58 np0005538515.localdomain python3.9[276946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:44:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:44:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:44:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:44:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:44:58 np0005538515.localdomain python3.9[277032]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323097.92153-3040-97221860165152/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:44:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:44:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17699 "" "Go-http-client/1.1"
Nov 28 09:44:59 np0005538515.localdomain python3.9[277140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:45:00 np0005538515.localdomain python3.9[277226]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323099.0921626-3040-43139872244947/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:45:00 np0005538515.localdomain sudo[277334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qydmqsjadoeyhqjlmhzzghpmvpqxtfqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323100.6576035-3289-226165381661846/AnsiballZ_file.py
Nov 28 09:45:00 np0005538515.localdomain sudo[277334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:01 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:45:01.073 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:45:01 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:45:01.074 228501 DEBUG oslo_service.periodic_task [None req-3c98ccaf-3970-4bf7-a4d1-56213c549b35 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:45:01 np0005538515.localdomain python3.9[277336]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:01 np0005538515.localdomain sudo[277334]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:01 np0005538515.localdomain sudo[277444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmhosqzvjennslocyvihqqaajodggfry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323101.4237044-3313-56092199276754/AnsiballZ_copy.py
Nov 28 09:45:01 np0005538515.localdomain sudo[277444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:45:01 np0005538515.localdomain podman[277447]: 2025-11-28 09:45:01.764307389 +0000 UTC m=+0.074719328 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 09:45:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:45:01 np0005538515.localdomain podman[277447]: 2025-11-28 09:45:01.807438248 +0000 UTC m=+0.117850147 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:01 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:45:01 np0005538515.localdomain python3.9[277446]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:45:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:45:01 np0005538515.localdomain podman[277466]: 2025-11-28 09:45:01.900111464 +0000 UTC m=+0.118307472 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 09:45:01 np0005538515.localdomain podman[277466]: 2025-11-28 09:45:01.906465981 +0000 UTC m=+0.124662029 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 09:45:01 np0005538515.localdomain sudo[277444]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:01 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:45:01 np0005538515.localdomain podman[277482]: 2025-11-28 09:45:01.988519406 +0000 UTC m=+0.085977598 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 09:45:02 np0005538515.localdomain podman[277482]: 2025-11-28 09:45:02.042192002 +0000 UTC m=+0.139650174 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:45:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58794 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD80790000000001030307) 
Nov 28 09:45:02 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:45:02 np0005538515.localdomain podman[277486]: 2025-11-28 09:45:02.046687461 +0000 UTC m=+0.132317276 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:45:02 np0005538515.localdomain podman[277486]: 2025-11-28 09:45:02.126775617 +0000 UTC m=+0.212405432 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:45:02 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:45:02 np0005538515.localdomain sudo[277640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbrmlrppidystbgxxzrkfdutmetepaap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323102.1847527-3337-139507837452329/AnsiballZ_stat.py
Nov 28 09:45:02 np0005538515.localdomain sudo[277640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:02 np0005538515.localdomain python3.9[277642]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:02 np0005538515.localdomain sudo[277640]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58795 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD847A0000000001030307) 
Nov 28 09:45:03 np0005538515.localdomain sudo[277752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omuplktfmohstcjflvwajkskytgyeops ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323103.03589-3364-119273996987615/AnsiballZ_file.py
Nov 28 09:45:03 np0005538515.localdomain sudo[277752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:03 np0005538515.localdomain python3.9[277754]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:03 np0005538515.localdomain sudo[277752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18467 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD86FA0000000001030307) 
Nov 28 09:45:04 np0005538515.localdomain python3.9[277862]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:04 np0005538515.localdomain python3.9[277972]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:45:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58796 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD8C7B0000000001030307) 
Nov 28 09:45:05 np0005538515.localdomain python3.9[278027]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:45:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:45:05 np0005538515.localdomain podman[278136]: 2025-11-28 09:45:05.960631705 +0000 UTC m=+0.070050154 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:45:05 np0005538515.localdomain python3.9[278135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 09:45:05 np0005538515.localdomain podman[278136]: 2025-11-28 09:45:05.973469114 +0000 UTC m=+0.082887613 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:45:05 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:45:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28304 DF PROTO=TCP SPT=46354 DPT=9102 SEQ=1223652804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD90FA0000000001030307) 
Nov 28 09:45:06 np0005538515.localdomain python3.9[278213]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 09:45:07 np0005538515.localdomain sudo[278321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmjwmsmbjzkdsuodfsnfhmmwdcvqdfbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323106.9499893-3493-49252651692583/AnsiballZ_container_config_data.py
Nov 28 09:45:07 np0005538515.localdomain sudo[278321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:07 np0005538515.localdomain python3.9[278323]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 09:45:07 np0005538515.localdomain sudo[278321]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:08 np0005538515.localdomain sudo[278431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsbtjpqrkdxkdzgytemrkymdnyyzqyhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323107.7938905-3519-151065073826818/AnsiballZ_container_config_hash.py
Nov 28 09:45:08 np0005538515.localdomain sudo[278431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:08 np0005538515.localdomain python3.9[278433]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:45:08 np0005538515.localdomain sudo[278431]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:08 np0005538515.localdomain sudo[278541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbrfifylatzsoxjqmglgnhhqbeismxzc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764323108.7406325-3549-104411170834539/AnsiballZ_edpm_container_manage.py
Nov 28 09:45:08 np0005538515.localdomain sudo[278541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58797 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADD9C3B0000000001030307) 
Nov 28 09:45:09 np0005538515.localdomain python3[278543]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:45:09 np0005538515.localdomain python3[278543]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",
                                                                    "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:36:07.10279245Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211782527,
                                                                    "VirtualSize": 1211782527,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",
                                                                              "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.237322707Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.688296939Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:21.069367201Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:46.989417927Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:54.535170465Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:24.828469773Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.089054875Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.610811813Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.099939071Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.100032994Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:14.509959241Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:45:09 np0005538515.localdomain sudo[278541]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:10 np0005538515.localdomain sudo[278717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ylelzszniejgjmvlhdquukobpxlxqdub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323109.9440045-3573-77818055073125/AnsiballZ_stat.py
Nov 28 09:45:10 np0005538515.localdomain sudo[278717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:10 np0005538515.localdomain python3.9[278719]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:10 np0005538515.localdomain sudo[278717]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:11 np0005538515.localdomain sudo[278829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdwkkezsdbowozkmuflfifigumphwzpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323111.1153445-3610-75312585976222/AnsiballZ_container_config_data.py
Nov 28 09:45:11 np0005538515.localdomain sudo[278829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:11 np0005538515.localdomain python3.9[278831]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 09:45:11 np0005538515.localdomain sudo[278829]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:12 np0005538515.localdomain sudo[278939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbzpajyvlmvelabqvdfplsagghcvkuhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323111.9361188-3637-65071037180314/AnsiballZ_container_config_hash.py
Nov 28 09:45:12 np0005538515.localdomain sudo[278939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:12 np0005538515.localdomain python3.9[278941]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 09:45:12 np0005538515.localdomain sudo[278939]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:45:12 np0005538515.localdomain podman[278959]: 2025-11-28 09:45:12.983168495 +0000 UTC m=+0.094489934 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 09:45:12 np0005538515.localdomain podman[278959]: 2025-11-28 09:45:12.999534443 +0000 UTC m=+0.110855872 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:45:13 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:45:13 np0005538515.localdomain sudo[279068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kofuugffngkdaokpxtaxopgnflkyohph ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764323112.9325225-3666-65241887486313/AnsiballZ_edpm_container_manage.py
Nov 28 09:45:13 np0005538515.localdomain sudo[279068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:13 np0005538515.localdomain python3[279070]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 09:45:13 np0005538515.localdomain python3[279070]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",
                                                                    "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-11-26T06:36:07.10279245Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211782527,
                                                                    "VirtualSize": 1211782527,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",
                                                                              "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",
                                                                              "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",
                                                                              "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.55004106Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550061231Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550071761Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550082711Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550094371Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.550104472Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:10:57.937139683Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:33.845342269Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:37.752912815Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.066850603Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:38.343690066Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.121414134Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:39.758394881Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.023293708Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:40.666927498Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.274045447Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:41.934810694Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:42.460051822Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.056709748Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:43.656939418Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.391634882Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:44.866551538Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.384686341Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:45.893815667Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:50.280039705Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:51.365780205Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:52.238116267Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:54.354755699Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.47438266Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474435383Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474444143Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:57.474450953Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:11:58.542433842Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:13:58.883943816Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:39.655921147Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:14:42.534184087Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.237322707Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:20.688296939Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:22:21.069367201Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:46.989417927Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:23:54.535170465Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:34:24.828469773Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:1f5c0439f2433cb462b222a5bb23e629",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.089054875Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:06.610811813Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.099939071Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:07.100032994Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-26T06:36:14.509959241Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"1f5c0439f2433cb462b222a5bb23e629\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 09:45:13 np0005538515.localdomain sudo[279068]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:14 np0005538515.localdomain sudo[279241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufpdciewslgdkcbalfkdwhpeokhbnnlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323114.13563-3691-253034045803650/AnsiballZ_stat.py
Nov 28 09:45:14 np0005538515.localdomain sudo[279241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:14 np0005538515.localdomain python3.9[279243]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:14 np0005538515.localdomain sudo[279241]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:15 np0005538515.localdomain sudo[279353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdttjflmkuqbcyqumlxcvumdnxecxngx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323114.9747198-3717-220181978342319/AnsiballZ_file.py
Nov 28 09:45:15 np0005538515.localdomain sudo[279353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:15 np0005538515.localdomain python3.9[279355]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:15 np0005538515.localdomain sudo[279353]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:15 np0005538515.localdomain sudo[279462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbyymfchipdnyllarlbnxadyyjkvwtni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323115.5039833-3717-65271323444404/AnsiballZ_copy.py
Nov 28 09:45:15 np0005538515.localdomain sudo[279462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:16 np0005538515.localdomain python3.9[279464]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764323115.5039833-3717-65271323444404/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:45:16 np0005538515.localdomain sudo[279462]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:16 np0005538515.localdomain sudo[279517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfcajrvpoxliddmcjqppnutzzxynfzcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323115.5039833-3717-65271323444404/AnsiballZ_systemd.py
Nov 28 09:45:16 np0005538515.localdomain sudo[279517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:16 np0005538515.localdomain python3.9[279519]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:45:16 np0005538515.localdomain sudo[279517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58798 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADDBCFB0000000001030307) 
Nov 28 09:45:17 np0005538515.localdomain python3.9[279629]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:18 np0005538515.localdomain python3.9[279737]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:19 np0005538515.localdomain python3.9[279845]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 09:45:20 np0005538515.localdomain sudo[279953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxfifobnuqkfqkyjsptwunygluabbdhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323120.0171723-3885-36198888917792/AnsiballZ_podman_container.py
Nov 28 09:45:20 np0005538515.localdomain sudo[279953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:20 np0005538515.localdomain python3.9[279955]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:45:20 np0005538515.localdomain sudo[279953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:20 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 102.7 (342 of 333 items), suggesting rotation.
Nov 28 09:45:20 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:45:20 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:45:20 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:45:21 np0005538515.localdomain sudo[280087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkhldvkdpbwuwevgacgukefkyomohpxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323121.1300159-3910-101846634878815/AnsiballZ_systemd.py
Nov 28 09:45:21 np0005538515.localdomain sudo[280087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:21 np0005538515.localdomain python3.9[280089]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 09:45:21 np0005538515.localdomain systemd[1]: Stopping nova_compute container...
Nov 28 09:45:22 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:45:22.573 228501 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 28 09:45:22 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:45:22.575 228501 DEBUG oslo_concurrency.lockutils [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:45:22 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:45:22.576 228501 DEBUG oslo_concurrency.lockutils [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:45:22 np0005538515.localdomain nova_compute[228497]: 2025-11-28 09:45:22.576 228501 DEBUG oslo_concurrency.lockutils [None req-8bf541a8-6dc3-40d0-8f78-14c26a1c42e5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:45:22 np0005538515.localdomain virtqemud[227736]: End of file while reading data: Input/output error
Nov 28 09:45:22 np0005538515.localdomain systemd[1]: libpod-1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e.scope: Deactivated successfully.
Nov 28 09:45:22 np0005538515.localdomain systemd[1]: libpod-1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e.scope: Consumed 17.558s CPU time.
Nov 28 09:45:22 np0005538515.localdomain podman[280093]: 2025-11-28 09:45:22.948546556 +0000 UTC m=+1.165350710 container died 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125)
Nov 28 09:45:22 np0005538515.localdomain systemd[1]: tmp-crun.EXMo5n.mount: Deactivated successfully.
Nov 28 09:45:22 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e-userdata-shm.mount: Deactivated successfully.
Nov 28 09:45:23 np0005538515.localdomain podman[280093]: 2025-11-28 09:45:23.08273871 +0000 UTC m=+1.299542784 container cleanup 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:45:23 np0005538515.localdomain podman[280093]: nova_compute
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:45:23 np0005538515.localdomain podman[280122]: 2025-11-28 09:45:23.201836996 +0000 UTC m=+0.087478496 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal)
Nov 28 09:45:23 np0005538515.localdomain podman[280122]: 2025-11-28 09:45:23.217396898 +0000 UTC m=+0.103038438 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:45:23 np0005538515.localdomain podman[280152]: error opening file `/run/crun/1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e/status`: No such file or directory
Nov 28 09:45:23 np0005538515.localdomain podman[280123]: 2025-11-28 09:45:23.293047895 +0000 UTC m=+0.175540167 container cleanup 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:45:23 np0005538515.localdomain podman[280123]: nova_compute
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: Stopped nova_compute container.
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: Starting nova_compute container...
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:45:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d070e222432defa9c0fb260246ed4b88067e3e8c5320c077932e5b44f128942/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:23 np0005538515.localdomain podman[280154]: 2025-11-28 09:45:23.404563726 +0000 UTC m=+0.086112173 container init 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 28 09:45:23 np0005538515.localdomain podman[280154]: 2025-11-28 09:45:23.413433721 +0000 UTC m=+0.094982168 container start 1d94fbbb119736b439df2d989b40e3ac469ad8b1f74689adb48cea226c0cf94e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 09:45:23 np0005538515.localdomain podman[280154]: nova_compute
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + sudo -E kolla_set_configs
Nov 28 09:45:23 np0005538515.localdomain systemd[1]: Started nova_compute container.
Nov 28 09:45:23 np0005538515.localdomain sudo[280087]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Validating config file
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying service configuration files
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /etc/ceph
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Creating directory /etc/ceph
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Writing out command to execute
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: ++ cat /run_command
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + CMD=nova-compute
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + ARGS=
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + sudo kolla_copy_cacerts
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + [[ ! -n '' ]]
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + . kolla_extend_start
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: Running command: 'nova-compute'
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + umask 0022
Nov 28 09:45:23 np0005538515.localdomain nova_compute[280168]: + exec nova-compute
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.098 280172 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.098 280172 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.098 280172 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.098 280172 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.209 280172 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.229 280172 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.230 280172 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 09:45:25 np0005538515.localdomain sudo[280200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:45:25 np0005538515.localdomain sudo[280200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:45:25 np0005538515.localdomain sudo[280200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:25 np0005538515.localdomain sudo[280220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:45:25 np0005538515.localdomain sudo[280220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.665 280172 INFO nova.virt.driver [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.778 280172 INFO nova.compute.provider_config [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.787 280172 DEBUG oslo_concurrency.lockutils [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.787 280172 DEBUG oslo_concurrency.lockutils [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.788 280172 DEBUG oslo_concurrency.lockutils [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.788 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.788 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.788 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.788 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.788 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.789 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.790 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] console_host                   = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.791 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.792 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.792 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.793 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.793 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.793 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.794 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.794 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.795 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.795 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.795 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.796 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.796 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.796 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] host                           = np0005538515.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.796 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.796 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.797 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.797 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.797 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.797 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.797 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.797 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.798 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.799 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.800 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.801 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.802 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.803 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.804 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.805 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.806 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.807 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.808 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.809 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.810 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.811 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.811 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.811 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.811 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.811 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.812 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.813 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.814 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.815 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.816 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.817 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.818 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.819 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.820 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.821 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.822 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.823 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.824 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.825 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.826 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.827 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.828 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.829 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.830 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.831 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.832 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.833 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.834 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.835 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.836 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.837 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.838 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.838 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.838 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.838 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.838 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.838 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.839 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.840 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.841 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.841 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.841 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.841 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.841 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.841 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.842 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.843 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.843 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.843 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.843 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.843 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.844 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.845 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.846 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.847 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.848 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.849 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.850 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.851 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.852 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.853 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.854 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.855 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.856 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.857 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.858 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.858 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.858 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.858 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.858 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.858 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.859 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.860 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.860 280172 WARNING oslo_config.cfg [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: and ``live_migration_inbound_addr`` respectively.
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: ).  Its value may be silently ignored in the future.
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.860 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.860 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.860 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.860 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.861 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.862 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.863 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rbd_secret_uuid        = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.863 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.863 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.863 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.863 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.863 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.864 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.865 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.866 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.867 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.867 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.867 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.867 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.867 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.867 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.868 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.869 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.870 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.871 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.872 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.873 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.874 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.875 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.876 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.877 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.878 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.879 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.880 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.880 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.880 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.880 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.880 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.880 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.881 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.882 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.883 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.884 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.885 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.885 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.885 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.885 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.885 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.885 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.886 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.887 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.888 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.889 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.890 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.891 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.892 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.893 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.894 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.895 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.895 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.895 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.895 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.895 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.895 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.896 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.897 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.898 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.899 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.900 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.901 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.902 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.903 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.904 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.905 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.906 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.907 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.908 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.909 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.910 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.911 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.912 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.913 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.914 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.915 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.916 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.917 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.918 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.919 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.919 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.919 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.919 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.919 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.919 280172 DEBUG oslo_service.service [None req-88da137d-e844-41ab-ba2e-65531aa98673 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.920 280172 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.941 280172 INFO nova.virt.node [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Determined node identity 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from /var/lib/nova/compute_id
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.941 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.942 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.942 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.942 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.951 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f403c8cf400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.954 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f403c8cf400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.954 280172 INFO nova.virt.libvirt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Connection event '1' reason 'None'
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.961 280172 INFO nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <host>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <uuid>4c358f0e-7e15-44e5-bde2-714780d05a92</uuid>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <cpu>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <arch>x86_64</arch>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model>EPYC-Rome-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <vendor>AMD</vendor>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <microcode version='16777317'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <signature family='23' model='49' stepping='0'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='x2apic'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='tsc-deadline'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='osxsave'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='hypervisor'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='tsc_adjust'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='spec-ctrl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='stibp'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='arch-capabilities'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='ssbd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='cmp_legacy'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='topoext'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='virt-ssbd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='lbrv'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='tsc-scale'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='vmcb-clean'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='pause-filter'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='pfthreshold'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='svme-addr-chk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='rdctl-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='mds-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature name='pschange-mc-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <pages unit='KiB' size='4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <pages unit='KiB' size='2048'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <pages unit='KiB' size='1048576'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </cpu>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <power_management>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <suspend_mem/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <suspend_disk/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <suspend_hybrid/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </power_management>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <iommu support='no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <migration_features>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <live/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <uri_transports>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <uri_transport>tcp</uri_transport>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <uri_transport>rdma</uri_transport>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </uri_transports>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </migration_features>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <topology>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <cells num='1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <cell id='0'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           <memory unit='KiB'>16116612</memory>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           <pages unit='KiB' size='4'>4029153</pages>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           <distances>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <sibling id='0' value='10'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           </distances>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           <cpus num='8'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:           </cpus>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         </cell>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </cells>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </topology>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <cache>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </cache>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <secmodel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model>selinux</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <doi>0</doi>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </secmodel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <secmodel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model>dac</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <doi>0</doi>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </secmodel>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   </host>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <guest>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <os_type>hvm</os_type>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <arch name='i686'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <wordsize>32</wordsize>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <domain type='qemu'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <domain type='kvm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </arch>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <features>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <pae/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <nonpae/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <acpi default='on' toggle='yes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <apic default='on' toggle='no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <cpuselection/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <deviceboot/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <externalSnapshot/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </features>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   </guest>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <guest>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <os_type>hvm</os_type>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <arch name='x86_64'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <wordsize>64</wordsize>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <domain type='qemu'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <domain type='kvm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </arch>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <features>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <acpi default='on' toggle='yes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <apic default='on' toggle='no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <cpuselection/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <deviceboot/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <disksnapshot default='on' toggle='no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <externalSnapshot/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </features>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   </guest>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: </capabilities>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.971 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.973 280172 DEBUG nova.virt.libvirt.volume.mount [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.974 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]: <domainCapabilities>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <domain>kvm</domain>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <arch>i686</arch>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <vcpu max='1024'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <iothreads supported='yes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <os supported='yes'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <enum name='firmware'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <loader supported='yes'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>rom</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>pflash</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <enum name='readonly'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>yes</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <enum name='secure'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </loader>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   </os>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:   <cpu>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <enum name='maximumMigratable'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <vendor>AMD</vendor>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='succor'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:     <mode name='custom' supported='yes'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:25 np0005538515.localdomain sudo[280220]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v4'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-128'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-256'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-512'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v4'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:25 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <memoryBacking supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='sourceType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>anonymous</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>memfd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </memoryBacking>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <disk supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='diskDevice'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>disk</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cdrom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>floppy</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>lun</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>fdc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>sata</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <graphics supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vnc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egl-headless</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </graphics>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <video supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='modelType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vga</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cirrus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>none</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>bochs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ramfb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </video>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hostdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='mode'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>subsystem</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='startupPolicy'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>mandatory</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>requisite</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>optional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='subsysType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pci</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='capsType'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='pciBackend'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hostdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <rng supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>random</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </rng>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <filesystem supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='driverType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>path</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>handle</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtiofs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </filesystem>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <tpm supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-tis</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-crb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emulator</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>external</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendVersion'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>2.0</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </tpm>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <redirdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </redirdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <channel supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </channel>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <crypto supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </crypto>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <interface supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>passt</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </interface>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <panic supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>isa</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>hyperv</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </panic>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <console supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>null</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dev</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pipe</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stdio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>udp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tcp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu-vdagent</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </console>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <gic supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <genid supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backup supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <async-teardown supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <ps2 supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sev supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sgx supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hyperv supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='features'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>relaxed</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vapic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>spinlocks</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vpindex</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>runtime</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>synic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stimer</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reset</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vendor_id</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>frequencies</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reenlightenment</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tlbflush</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ipi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>avic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emsr_bitmap</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>xmm_input</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hyperv>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <launchSecurity supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='sectype'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tdx</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </launchSecurity>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: </domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:25.979 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: <domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <domain>kvm</domain>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <arch>i686</arch>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <vcpu max='240'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <iothreads supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <os supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='firmware'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <loader supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>rom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pflash</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='readonly'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>yes</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='secure'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </loader>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </os>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='maximumMigratable'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <vendor>AMD</vendor>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='succor'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='custom' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-128'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-256'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-512'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <memoryBacking supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='sourceType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>anonymous</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>memfd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </memoryBacking>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <disk supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='diskDevice'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>disk</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cdrom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>floppy</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>lun</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ide</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>fdc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>sata</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <graphics supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vnc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egl-headless</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </graphics>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <video supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='modelType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vga</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cirrus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>none</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>bochs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ramfb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </video>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hostdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='mode'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>subsystem</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='startupPolicy'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>mandatory</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>requisite</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>optional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='subsysType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pci</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='capsType'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='pciBackend'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hostdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <rng supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>random</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </rng>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <filesystem supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='driverType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>path</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>handle</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtiofs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </filesystem>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <tpm supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-tis</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-crb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emulator</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>external</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendVersion'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>2.0</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </tpm>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <redirdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </redirdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <channel supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </channel>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <crypto supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </crypto>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <interface supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>passt</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </interface>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <panic supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>isa</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>hyperv</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </panic>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <console supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>null</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dev</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pipe</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stdio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>udp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tcp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu-vdagent</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </console>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <gic supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <genid supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backup supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <async-teardown supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <ps2 supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sev supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sgx supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hyperv supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='features'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>relaxed</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vapic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>spinlocks</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vpindex</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>runtime</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>synic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stimer</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reset</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vendor_id</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>frequencies</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reenlightenment</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tlbflush</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ipi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>avic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emsr_bitmap</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>xmm_input</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hyperv>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <launchSecurity supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='sectype'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tdx</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </launchSecurity>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: </domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.011 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.016 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: <domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <domain>kvm</domain>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <arch>x86_64</arch>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <vcpu max='1024'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <iothreads supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <os supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='firmware'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>efi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <loader supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>rom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pflash</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='readonly'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>yes</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='secure'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>yes</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </loader>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </os>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='maximumMigratable'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <vendor>AMD</vendor>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='succor'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='custom' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-128'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-256'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-512'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <memoryBacking supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='sourceType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>anonymous</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>memfd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </memoryBacking>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <disk supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='diskDevice'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>disk</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cdrom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>floppy</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>lun</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>fdc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>sata</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <graphics supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vnc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egl-headless</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </graphics>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <video supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='modelType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vga</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cirrus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>none</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>bochs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ramfb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </video>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hostdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='mode'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>subsystem</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='startupPolicy'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>mandatory</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>requisite</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>optional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='subsysType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pci</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='capsType'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='pciBackend'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hostdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <rng supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>random</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </rng>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <filesystem supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='driverType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>path</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>handle</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtiofs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </filesystem>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <tpm supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-tis</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-crb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emulator</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>external</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendVersion'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>2.0</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </tpm>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <redirdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </redirdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <channel supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </channel>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <crypto supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </crypto>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <interface supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>passt</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </interface>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <panic supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>isa</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>hyperv</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </panic>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <console supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>null</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dev</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pipe</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stdio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>udp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tcp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu-vdagent</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </console>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <gic supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <genid supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backup supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <async-teardown supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <ps2 supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sev supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sgx supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hyperv supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='features'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>relaxed</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vapic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>spinlocks</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vpindex</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>runtime</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>synic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stimer</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reset</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vendor_id</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>frequencies</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reenlightenment</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tlbflush</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ipi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>avic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emsr_bitmap</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>xmm_input</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hyperv>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <launchSecurity supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='sectype'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tdx</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </launchSecurity>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: </domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.068 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: <domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <domain>kvm</domain>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <arch>x86_64</arch>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <vcpu max='240'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <iothreads supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <os supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='firmware'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <loader supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>rom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pflash</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='readonly'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>yes</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='secure'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>no</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </loader>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </os>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='host-passthrough' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='hostPassthroughMigratable'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='maximum' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='maximumMigratable'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>on</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>off</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='host-model' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <vendor>AMD</vendor>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='x2apic'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='hypervisor'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='stibp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='overflow-recov'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='succor'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lbrv'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='tsc-scale'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pause-filter'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='pfthreshold'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <feature policy='disable' name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <mode name='custom' supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Broadwell-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Cooperlake-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Denverton-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Dhyana-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='auto-ibrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Milan-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amd-psfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='no-nested-data-bp'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='null-sel-clr-base'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='stibp-always-on'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-Rome-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='EPYC-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='GraniteRapids-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-128'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-256'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx10-512'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='prefetchiti'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Haswell-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v6'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Icelake-Server-v7'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='IvyBridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='KnightsMill-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4fmaps'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-4vnniw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512er'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512pf'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G4-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Opteron_G5-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fma4'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tbm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xop'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SapphireRapids-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='amx-tile'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-bf16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-fp16'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512-vpopcntdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bitalg'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vbmi2'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrc'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fzrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='la57'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='taa-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='tsx-ldtrk'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xfd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='SierraForest-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ifma'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-ne-convert'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx-vnni-int8'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='bus-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cmpccxadd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fbsdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='fsrs'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ibrs-all'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mcdt-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pbrsb-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='psdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='sbdr-ssdp-no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='serialize'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vaes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='vpclmulqdq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Client-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='hle'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='rtm'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Skylake-Server-v5'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512bw'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512cd'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512dq'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512f'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='avx512vl'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='invpcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pcid'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='pku'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='mpx'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v2'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v3'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='core-capability'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='split-lock-detect'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='Snowridge-v4'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='cldemote'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='erms'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='gfni'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdir64b'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='movdiri'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='xsaves'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='athlon-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='core2duo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='coreduo-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='n270-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='ss'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <blockers model='phenom-v1'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnow'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <feature name='3dnowext'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </blockers>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </mode>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </cpu>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <memoryBacking supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <enum name='sourceType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>anonymous</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <value>memfd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </memoryBacking>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <disk supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='diskDevice'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>disk</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cdrom</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>floppy</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>lun</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ide</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>fdc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>sata</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <graphics supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vnc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egl-headless</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </graphics>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <video supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='modelType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vga</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>cirrus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>none</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>bochs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ramfb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </video>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hostdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='mode'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>subsystem</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='startupPolicy'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>mandatory</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>requisite</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>optional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='subsysType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pci</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>scsi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='capsType'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='pciBackend'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hostdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <rng supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtio-non-transitional</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>random</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>egd</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </rng>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <filesystem supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='driverType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>path</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>handle</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>virtiofs</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </filesystem>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <tpm supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-tis</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tpm-crb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emulator</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>external</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendVersion'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>2.0</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </tpm>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <redirdev supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='bus'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>usb</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </redirdev>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <channel supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </channel>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <crypto supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendModel'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>builtin</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </crypto>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <interface supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='backendType'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>default</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>passt</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </interface>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <panic supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='model'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>isa</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>hyperv</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </panic>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <console supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='type'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>null</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vc</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pty</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dev</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>file</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>pipe</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stdio</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>udp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tcp</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>unix</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>qemu-vdagent</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>dbus</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </console>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </devices>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   <features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <gic supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <vmcoreinfo supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <genid supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backingStoreInput supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <backup supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <async-teardown supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <ps2 supported='yes'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sev supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <sgx supported='no'/>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <hyperv supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='features'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>relaxed</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vapic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>spinlocks</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vpindex</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>runtime</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>synic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>stimer</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reset</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>vendor_id</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>frequencies</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>reenlightenment</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tlbflush</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>ipi</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>avic</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>emsr_bitmap</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>xmm_input</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <spinlocks>4095</spinlocks>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <stimer_direct>on</stimer_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_direct>off</tlbflush_direct>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <tlbflush_extended>off</tlbflush_extended>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </defaults>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </hyperv>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     <launchSecurity supported='yes'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       <enum name='sectype'>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:         <value>tdx</value>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:       </enum>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:     </launchSecurity>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:   </features>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: </domainCapabilities>
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.119 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.120 280172 INFO nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Secure Boot support detected
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.122 280172 INFO nova.virt.libvirt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.123 280172 INFO nova.virt.libvirt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.133 280172 DEBUG nova.virt.libvirt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.155 280172 INFO nova.virt.node [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Determined node identity 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from /var/lib/nova/compute_id
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.175 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Verified node 72fba1ca-0d86-48af-8a3d-510284dfd0e0 matches my host np0005538515.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.217 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.338 280172 DEBUG oslo_concurrency.lockutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.339 280172 DEBUG oslo_concurrency.lockutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.339 280172 DEBUG oslo_concurrency.lockutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.339 280172 DEBUG nova.compute.resource_tracker [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.339 280172 DEBUG oslo_concurrency.processutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:45:26 np0005538515.localdomain sudo[280310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:45:26 np0005538515.localdomain sudo[280310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:45:26 np0005538515.localdomain sudo[280310]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.777 280172 DEBUG oslo_concurrency.processutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.906 280172 WARNING nova.virt.libvirt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.907 280172 DEBUG nova.compute.resource_tracker [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12519MB free_disk=41.837093353271484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.908 280172 DEBUG oslo_concurrency.lockutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:45:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:26.908 280172 DEBUG oslo_concurrency.lockutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.055 280172 DEBUG nova.compute.resource_tracker [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.055 280172 DEBUG nova.compute.resource_tracker [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.120 280172 DEBUG nova.scheduler.client.report [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.142 280172 DEBUG nova.scheduler.client.report [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.142 280172 DEBUG nova.compute.provider_tree [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.156 280172 DEBUG nova.scheduler.client.report [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.179 280172 DEBUG nova.scheduler.client.report [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.196 280172 DEBUG oslo_concurrency.processutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:45:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.656 280172 DEBUG oslo_concurrency.processutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.662 280172 DEBUG nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.663 280172 INFO nova.virt.libvirt.host [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] kernel doesn't support AMD SEV
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.664 280172 DEBUG nova.compute.provider_tree [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.665 280172 DEBUG nova.virt.libvirt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.689 280172 DEBUG nova.scheduler.client.report [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.735 280172 DEBUG nova.compute.resource_tracker [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.735 280172 DEBUG oslo_concurrency.lockutils [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.735 280172 DEBUG nova.service [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.764 280172 DEBUG nova.service [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 09:45:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:45:27.765 280172 DEBUG nova.servicegroup.drivers.db [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] DB_Driver: join new ServiceGroup member np0005538515.localdomain to the compute group, service = <Service: host=np0005538515.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 09:45:28 np0005538515.localdomain sudo[280442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onsxxwdfdrbbszezcetaruasuevlqgph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764323128.112179-3937-113256465206772/AnsiballZ_podman_container.py
Nov 28 09:45:28 np0005538515.localdomain sudo[280442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 09:45:28 np0005538515.localdomain python3.9[280444]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None 
preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 09:45:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:45:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:45:28 np0005538515.localdomain systemd[1]: Started libpod-conmon-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d.scope.
Nov 28 09:45:28 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:45:28 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:28 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:28 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 09:45:29 np0005538515.localdomain podman[280469]: 2025-11-28 09:45:29.002193792 +0000 UTC m=+0.166094375 container init acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.license=GPLv2)
Nov 28 09:45:29 np0005538515.localdomain podman[280469]: 2025-11-28 09:45:29.014348149 +0000 UTC m=+0.178248732 container start acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 09:45:29 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:45:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:45:29 np0005538515.localdomain python3.9[280444]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 09:45:29 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17804 "" "Go-http-client/1.1"
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/469bc4441baff9216df986857f9ff45dbf25965a8d2f755a6449ac2645cb7191
Nov 28 09:45:29 np0005538515.localdomain nova_compute_init[280488]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 09:45:29 np0005538515.localdomain systemd[1]: libpod-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d.scope: Deactivated successfully.
Nov 28 09:45:29 np0005538515.localdomain podman[280489]: 2025-11-28 09:45:29.098738058 +0000 UTC m=+0.060982944 container died acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 28 09:45:29 np0005538515.localdomain podman[280502]: 2025-11-28 09:45:29.174025124 +0000 UTC m=+0.071154039 container cleanup acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true)
Nov 28 09:45:29 np0005538515.localdomain systemd[1]: libpod-conmon-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d.scope: Deactivated successfully.
Nov 28 09:45:29 np0005538515.localdomain sudo[280442]: pam_unix(sudo:session): session closed for user root
Nov 28 09:45:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8d7a1875c2425cad72cafe803874ca1ca683dd2f4b513ab7c102d534a7a81b79-merged.mount: Deactivated successfully.
Nov 28 09:45:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acc5612457ab293e4f840ea19b50676bf97e3477bba289ad940bf778a740745d-userdata-shm.mount: Deactivated successfully.
Nov 28 09:45:30 np0005538515.localdomain sshd[262026]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:45:30 np0005538515.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Nov 28 09:45:30 np0005538515.localdomain systemd[1]: session-59.scope: Consumed 1min 28.839s CPU time.
Nov 28 09:45:30 np0005538515.localdomain systemd-logind[763]: Session 59 logged out. Waiting for processes to exit.
Nov 28 09:45:30 np0005538515.localdomain systemd-logind[763]: Removed session 59.
Nov 28 09:45:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:45:31 np0005538515.localdomain podman[280549]: 2025-11-28 09:45:31.985224272 +0000 UTC m=+0.090379945 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:45:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:45:32 np0005538515.localdomain podman[280549]: 2025-11-28 09:45:32.0028733 +0000 UTC m=+0.108028933 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:45:32 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:45:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38024 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADDF5A90000000001030307) 
Nov 28 09:45:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:45:32 np0005538515.localdomain podman[280569]: 2025-11-28 09:45:32.110753157 +0000 UTC m=+0.104284696 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 09:45:32 np0005538515.localdomain podman[280569]: 2025-11-28 09:45:32.151532763 +0000 UTC m=+0.145064302 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:45:32 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:45:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:45:32 np0005538515.localdomain podman[280586]: 2025-11-28 09:45:32.198282783 +0000 UTC m=+0.077879287 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:45:32 np0005538515.localdomain podman[280601]: 2025-11-28 09:45:32.273247749 +0000 UTC m=+0.074097630 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:45:32 np0005538515.localdomain podman[280601]: 2025-11-28 09:45:32.287434539 +0000 UTC m=+0.088284390 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:45:32 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:45:32 np0005538515.localdomain podman[280586]: 2025-11-28 09:45:32.309654529 +0000 UTC m=+0.189251023 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 09:45:32 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:45:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38025 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADDF9BA0000000001030307) 
Nov 28 09:45:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58799 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADDFCFB0000000001030307) 
Nov 28 09:45:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38026 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE01BA0000000001030307) 
Nov 28 09:45:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18468 DF PROTO=TCP SPT=39560 DPT=9102 SEQ=918741934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE04FB0000000001030307) 
Nov 28 09:45:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:45:36 np0005538515.localdomain podman[280633]: 2025-11-28 09:45:36.948596569 +0000 UTC m=+0.061838340 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:45:36 np0005538515.localdomain podman[280633]: 2025-11-28 09:45:36.960447066 +0000 UTC m=+0.073688807 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:45:36 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:45:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38027 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE117A0000000001030307) 
Nov 28 09:45:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:45:43 np0005538515.localdomain podman[280656]: 2025-11-28 09:45:43.638053124 +0000 UTC m=+0.082290394 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:45:43 np0005538515.localdomain podman[280656]: 2025-11-28 09:45:43.652590955 +0000 UTC m=+0.096828265 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:45:43 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:45:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38028 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE30FB0000000001030307) 
Nov 28 09:45:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:45:47.617 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:45:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:45:47.619 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 09:45:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:45:47.622 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:45:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:45:50.827 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:45:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:45:50.828 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:45:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:45:50.828 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:45:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:45:53 np0005538515.localdomain podman[280675]: 2025-11-28 09:45:53.977992078 +0000 UTC m=+0.081788138 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:45:53 np0005538515.localdomain podman[280675]: 2025-11-28 09:45:53.992420716 +0000 UTC m=+0.096216786 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 28 09:45:54 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:45:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:45:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:45:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:45:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:45:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:45:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:45:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:45:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17699 "" "Go-http-client/1.1"
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:46:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:46:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56160 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE6AD90000000001030307) 
Nov 28 09:46:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:46:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:46:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:46:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:46:02 np0005538515.localdomain systemd[1]: tmp-crun.xI1fOk.mount: Deactivated successfully.
Nov 28 09:46:02 np0005538515.localdomain podman[280703]: 2025-11-28 09:46:02.98684869 +0000 UTC m=+0.076538326 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:46:02 np0005538515.localdomain podman[280703]: 2025-11-28 09:46:02.994553859 +0000 UTC m=+0.084243505 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:46:03 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:46:03 np0005538515.localdomain podman[280702]: 2025-11-28 09:46:03.031817906 +0000 UTC m=+0.128893951 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:46:03 np0005538515.localdomain podman[280696]: 2025-11-28 09:46:03.084437339 +0000 UTC m=+0.184934310 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:46:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56161 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE6EFA0000000001030307) 
Nov 28 09:46:03 np0005538515.localdomain podman[280695]: 2025-11-28 09:46:02.962925928 +0000 UTC m=+0.071035295 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 09:46:03 np0005538515.localdomain podman[280696]: 2025-11-28 09:46:03.142333564 +0000 UTC m=+0.242830605 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:46:03 np0005538515.localdomain podman[280695]: 2025-11-28 09:46:03.15057998 +0000 UTC m=+0.258689417 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:46:03 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:46:03 np0005538515.localdomain podman[280702]: 2025-11-28 09:46:03.167603028 +0000 UTC m=+0.264679113 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:46:03 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:46:03 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:46:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38029 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE70FA0000000001030307) 
Nov 28 09:46:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56162 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE76FA0000000001030307) 
Nov 28 09:46:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58800 DF PROTO=TCP SPT=52604 DPT=9102 SEQ=1885305082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE7AFB0000000001030307) 
Nov 28 09:46:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:46:07 np0005538515.localdomain systemd[1]: tmp-crun.n8xbZT.mount: Deactivated successfully.
Nov 28 09:46:07 np0005538515.localdomain podman[280781]: 2025-11-28 09:46:07.958322938 +0000 UTC m=+0.068910898 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:46:07 np0005538515.localdomain podman[280781]: 2025-11-28 09:46:07.97060071 +0000 UTC m=+0.081188650 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:46:07 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:46:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56163 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADE86BA0000000001030307) 
Nov 28 09:46:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:46:13 np0005538515.localdomain systemd[1]: tmp-crun.Jd1U80.mount: Deactivated successfully.
Nov 28 09:46:13 np0005538515.localdomain podman[280803]: 2025-11-28 09:46:13.96989191 +0000 UTC m=+0.079166068 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 09:46:13 np0005538515.localdomain podman[280803]: 2025-11-28 09:46:13.988410585 +0000 UTC m=+0.097684773 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:46:14 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:46:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56164 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEA6FA0000000001030307) 
Nov 28 09:46:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:20.767 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:20.788 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:46:24 np0005538515.localdomain podman[280822]: 2025-11-28 09:46:24.960085889 +0000 UTC m=+0.068934439 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm)
Nov 28 09:46:24 np0005538515.localdomain podman[280822]: 2025-11-28 09:46:24.971434782 +0000 UTC m=+0.080283352 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, config_id=edpm, io.openshift.expose-services=)
Nov 28 09:46:24 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.274 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.275 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.275 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.276 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.276 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.277 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.277 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.277 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.277 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.300 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.301 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.301 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.301 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.302 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:46:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.750 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:25.998 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.000 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12529MB free_disk=41.83693313598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.001 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.001 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.093 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.093 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.112 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.602 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.609 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.626 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.628 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:46:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:46:26.628 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:26 np0005538515.localdomain sudo[280888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:46:26 np0005538515.localdomain sudo[280888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:46:26 np0005538515.localdomain sudo[280888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:46:26 np0005538515.localdomain sudo[280906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:46:26 np0005538515.localdomain sudo[280906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:46:27 np0005538515.localdomain sudo[280906]: pam_unix(sudo:session): session closed for user root
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:46:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:46:28 np0005538515.localdomain sudo[280956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:46:28 np0005538515.localdomain sudo[280956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:46:28 np0005538515.localdomain sudo[280956]: pam_unix(sudo:session): session closed for user root
Nov 28 09:46:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:46:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:46:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:46:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:46:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:46:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17706 "" "Go-http-client/1.1"
Nov 28 09:46:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9353 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEE0090000000001030307) 
Nov 28 09:46:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9354 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEE3FA0000000001030307) 
Nov 28 09:46:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56165 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEE6FA0000000001030307) 
Nov 28 09:46:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:46:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:46:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:46:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:46:33 np0005538515.localdomain podman[280979]: 2025-11-28 09:46:33.981275445 +0000 UTC m=+0.074962327 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:46:33 np0005538515.localdomain podman[280979]: 2025-11-28 09:46:33.992303787 +0000 UTC m=+0.085990689 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:46:34 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:46:34 np0005538515.localdomain podman[280974]: 2025-11-28 09:46:34.037403777 +0000 UTC m=+0.138487809 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:46:34 np0005538515.localdomain podman[280974]: 2025-11-28 09:46:34.047313084 +0000 UTC m=+0.148397116 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:46:34 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:46:34 np0005538515.localdomain podman[280975]: 2025-11-28 09:46:34.090251047 +0000 UTC m=+0.188012736 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:46:34 np0005538515.localdomain podman[280976]: 2025-11-28 09:46:34.143029713 +0000 UTC m=+0.238304185 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 09:46:34 np0005538515.localdomain podman[280975]: 2025-11-28 09:46:34.152682253 +0000 UTC m=+0.250443982 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:46:34 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:46:34 np0005538515.localdomain podman[280976]: 2025-11-28 09:46:34.178597237 +0000 UTC m=+0.273871709 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 09:46:34 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:46:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9355 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEEBFA0000000001030307) 
Nov 28 09:46:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38030 DF PROTO=TCP SPT=50632 DPT=9102 SEQ=2766945851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEEEFA0000000001030307) 
Nov 28 09:46:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:46:38 np0005538515.localdomain podman[281055]: 2025-11-28 09:46:38.972692102 +0000 UTC m=+0.080858650 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:46:38 np0005538515.localdomain podman[281055]: 2025-11-28 09:46:38.980581166 +0000 UTC m=+0.088747694 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:46:38 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:46:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9356 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADEFBBB0000000001030307) 
Nov 28 09:46:42 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:46:42Z|00037|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Nov 28 09:46:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:46:44 np0005538515.localdomain podman[281079]: 2025-11-28 09:46:44.974796661 +0000 UTC m=+0.079515438 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible)
Nov 28 09:46:45 np0005538515.localdomain podman[281079]: 2025-11-28 09:46:45.0176636 +0000 UTC m=+0.122382327 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:46:45 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:46:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9357 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF1CFB0000000001030307) 
Nov 28 09:46:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:46:50.828 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:46:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:46:50.829 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:46:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:46:50.829 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:46:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:46:55 np0005538515.localdomain podman[281097]: 2025-11-28 09:46:55.983176416 +0000 UTC m=+0.085863285 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal)
Nov 28 09:46:56 np0005538515.localdomain podman[281097]: 2025-11-28 09:46:56.024532299 +0000 UTC m=+0.127219148 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:46:56 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:46:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:46:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:46:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:46:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:46:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:46:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:46:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:46:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17695 "" "Go-http-client/1.1"
Nov 28 09:47:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7755 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF55390000000001030307) 
Nov 28 09:47:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7756 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF593A0000000001030307) 
Nov 28 09:47:04 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9358 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF5CFA0000000001030307) 
Nov 28 09:47:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:47:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:47:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:47:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:47:04 np0005538515.localdomain podman[281119]: 2025-11-28 09:47:04.989452448 +0000 UTC m=+0.094141367 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 09:47:05 np0005538515.localdomain podman[281118]: 2025-11-28 09:47:05.04232315 +0000 UTC m=+0.149893798 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:47:05 np0005538515.localdomain podman[281119]: 2025-11-28 09:47:05.046550751 +0000 UTC m=+0.151239680 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:47:05 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:47:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7757 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF613A0000000001030307) 
Nov 28 09:47:05 np0005538515.localdomain podman[281120]: 2025-11-28 09:47:05.125119587 +0000 UTC m=+0.221210441 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:47:05 np0005538515.localdomain podman[281118]: 2025-11-28 09:47:05.132507676 +0000 UTC m=+0.240078364 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:47:05 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:47:05 np0005538515.localdomain podman[281125]: 2025-11-28 09:47:05.174473382 +0000 UTC m=+0.272998611 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:47:05 np0005538515.localdomain podman[281120]: 2025-11-28 09:47:05.186370289 +0000 UTC m=+0.282461103 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:47:05 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:47:05 np0005538515.localdomain podman[281125]: 2025-11-28 09:47:05.206254553 +0000 UTC m=+0.304779782 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:47:05 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:47:05 np0005538515.localdomain systemd[1]: tmp-crun.aFfvby.mount: Deactivated successfully.
Nov 28 09:47:06 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56166 DF PROTO=TCP SPT=53668 DPT=9102 SEQ=1281477598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF64FA0000000001030307) 
Nov 28 09:47:07 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 09:47:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7758 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF70FA0000000001030307) 
Nov 28 09:47:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:47:09 np0005538515.localdomain podman[281200]: 2025-11-28 09:47:09.980932057 +0000 UTC m=+0.089048441 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:47:09 np0005538515.localdomain podman[281200]: 2025-11-28 09:47:09.990243645 +0000 UTC m=+0.098360069 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:47:10 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:47:13 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T09:47:13Z|00038|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 28 09:47:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:47:15 np0005538515.localdomain podman[281222]: 2025-11-28 09:47:15.980317178 +0000 UTC m=+0.084376936 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 09:47:15 np0005538515.localdomain podman[281222]: 2025-11-28 09:47:15.990564215 +0000 UTC m=+0.094623993 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:47:16 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:47:16 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 09:47:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7759 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADF90FB0000000001030307) 
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.621 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.643 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.644 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.644 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.644 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.644 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.645 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.666 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.666 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.667 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.667 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:47:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:26.667 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:47:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:47:26 np0005538515.localdomain podman[281261]: 2025-11-28 09:47:26.9771152 +0000 UTC m=+0.085634954 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 28 09:47:27 np0005538515.localdomain podman[281261]: 2025-11-28 09:47:27.016125685 +0000 UTC m=+0.124645449 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41)
Nov 28 09:47:27 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.223 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.438 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.440 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12503MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.440 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.441 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.532 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.532 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:47:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:27.578 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:47:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.054 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.060 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.072 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.073 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.074 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:47:28 np0005538515.localdomain sudo[281305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:47:28 np0005538515.localdomain sudo[281305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:47:28 np0005538515.localdomain sudo[281305]: pam_unix(sudo:session): session closed for user root
Nov 28 09:47:28 np0005538515.localdomain sudo[281323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:47:28 np0005538515.localdomain sudo[281323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.668 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.668 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.669 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.669 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.684 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.684 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:47:28.685 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:47:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:47:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:47:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:47:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:47:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:47:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17706 "" "Go-http-client/1.1"
Nov 28 09:47:29 np0005538515.localdomain sudo[281323]: pam_unix(sudo:session): session closed for user root
Nov 28 09:47:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50498 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADFCA690000000001030307) 
Nov 28 09:47:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50499 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADFCE7A0000000001030307) 
Nov 28 09:47:33 np0005538515.localdomain sudo[281373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:47:33 np0005538515.localdomain sudo[281373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:47:33 np0005538515.localdomain sudo[281373]: pam_unix(sudo:session): session closed for user root
Nov 28 09:47:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7760 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADFD0FA0000000001030307) 
Nov 28 09:47:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50500 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADFD67A0000000001030307) 
Nov 28 09:47:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:47:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:47:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:47:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:47:36 np0005538515.localdomain podman[281391]: 2025-11-28 09:47:36.009483157 +0000 UTC m=+0.109049947 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:47:36 np0005538515.localdomain podman[281392]: 2025-11-28 09:47:36.05331446 +0000 UTC m=+0.151290052 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:47:36 np0005538515.localdomain podman[281392]: 2025-11-28 09:47:36.102883611 +0000 UTC m=+0.200859153 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:47:36 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:47:36 np0005538515.localdomain podman[281391]: 2025-11-28 09:47:36.120009159 +0000 UTC m=+0.219575940 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:47:36 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:47:36 np0005538515.localdomain podman[281393]: 2025-11-28 09:47:36.111479336 +0000 UTC m=+0.204622788 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:47:36 np0005538515.localdomain podman[281393]: 2025-11-28 09:47:36.197612946 +0000 UTC m=+0.290756448 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:47:36 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:47:36 np0005538515.localdomain podman[281394]: 2025-11-28 09:47:36.266904005 +0000 UTC m=+0.357633913 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:47:36 np0005538515.localdomain podman[281394]: 2025-11-28 09:47:36.274936883 +0000 UTC m=+0.365666821 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:47:36 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:47:36 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9359 DF PROTO=TCP SPT=53558 DPT=9102 SEQ=1644628444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADFDAFA0000000001030307) 
Nov 28 09:47:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50501 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5ADFE63A0000000001030307) 
Nov 28 09:47:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:47:40 np0005538515.localdomain podman[281475]: 2025-11-28 09:47:40.966244937 +0000 UTC m=+0.071376215 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:47:40 np0005538515.localdomain podman[281475]: 2025-11-28 09:47:40.99872613 +0000 UTC m=+0.103857418 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:47:41 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:47:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:47:46 np0005538515.localdomain podman[281498]: 2025-11-28 09:47:46.966397318 +0000 UTC m=+0.070696543 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible)
Nov 28 09:47:46 np0005538515.localdomain podman[281498]: 2025-11-28 09:47:46.981607207 +0000 UTC m=+0.085906502 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:47:46 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:47:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50502 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE006FA0000000001030307) 
Nov 28 09:47:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:47:50.829 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:47:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:47:50.830 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:47:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:47:50.830 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:47:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:47:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:47:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:47:57 np0005538515.localdomain podman[281519]: 2025-11-28 09:47:57.735955334 +0000 UTC m=+0.074491971 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc.)
Nov 28 09:47:57 np0005538515.localdomain podman[281519]: 2025-11-28 09:47:57.747567052 +0000 UTC m=+0.086103699 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, release=1755695350, io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Nov 28 09:47:57 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:47:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:47:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:47:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:47:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:47:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:47:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17712 "" "Go-http-client/1.1"
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:48:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:48:02 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40378 DF PROTO=TCP SPT=52778 DPT=9102 SEQ=640193919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE03F990000000001030307) 
Nov 28 09:48:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40379 DF PROTO=TCP SPT=52778 DPT=9102 SEQ=640193919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE043BB0000000001030307) 
Nov 28 09:48:03 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50503 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE046FB0000000001030307) 
Nov 28 09:48:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40380 DF PROTO=TCP SPT=52778 DPT=9102 SEQ=640193919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE04BBB0000000001030307) 
Nov 28 09:48:05 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7761 DF PROTO=TCP SPT=36244 DPT=9102 SEQ=2949777023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE04EFA0000000001030307) 
Nov 28 09:48:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:48:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:48:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:48:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:48:06 np0005538515.localdomain podman[281538]: 2025-11-28 09:48:06.9939819 +0000 UTC m=+0.097053678 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 09:48:07 np0005538515.localdomain podman[281538]: 2025-11-28 09:48:07.030456106 +0000 UTC m=+0.133527884 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:48:07 np0005538515.localdomain systemd[1]: tmp-crun.3ds0QA.mount: Deactivated successfully.
Nov 28 09:48:07 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:48:07 np0005538515.localdomain podman[281539]: 2025-11-28 09:48:07.044986744 +0000 UTC m=+0.144380988 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:48:07 np0005538515.localdomain podman[281539]: 2025-11-28 09:48:07.085450834 +0000 UTC m=+0.184845098 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 28 09:48:07 np0005538515.localdomain podman[281545]: 2025-11-28 09:48:07.100327143 +0000 UTC m=+0.190975147 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:48:07 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:48:07 np0005538515.localdomain podman[281545]: 2025-11-28 09:48:07.11449265 +0000 UTC m=+0.205140634 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:48:07 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:48:07 np0005538515.localdomain podman[281540]: 2025-11-28 09:48:07.206223013 +0000 UTC m=+0.299490198 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:48:07 np0005538515.localdomain podman[281540]: 2025-11-28 09:48:07.213006282 +0000 UTC m=+0.306273457 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:48:07 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:48:09 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40381 DF PROTO=TCP SPT=52778 DPT=9102 SEQ=640193919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE05B7A0000000001030307) 
Nov 28 09:48:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:48:11 np0005538515.localdomain podman[281621]: 2025-11-28 09:48:11.97451849 +0000 UTC m=+0.086994087 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:48:11 np0005538515.localdomain podman[281621]: 2025-11-28 09:48:11.985566261 +0000 UTC m=+0.098041908 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:48:11 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:48:17 np0005538515.localdomain sshd[281644]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:48:17 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40382 DF PROTO=TCP SPT=52778 DPT=9102 SEQ=640193919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE07AFA0000000001030307) 
Nov 28 09:48:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:48:17 np0005538515.localdomain sshd[281644]: Accepted publickey for zuul from 38.102.83.114 port 54746 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:48:17 np0005538515.localdomain systemd-logind[763]: New session 61 of user zuul.
Nov 28 09:48:17 np0005538515.localdomain systemd[1]: Started Session 61 of User zuul.
Nov 28 09:48:17 np0005538515.localdomain sshd[281644]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 09:48:17 np0005538515.localdomain podman[281646]: 2025-11-28 09:48:17.350018876 +0000 UTC m=+0.081250309 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 09:48:17 np0005538515.localdomain podman[281646]: 2025-11-28 09:48:17.386643687 +0000 UTC m=+0.117875090 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:48:17 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:48:17 np0005538515.localdomain sudo[281683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlzinxogolzaqzeoaoecnjccqcsxhhsr ; /usr/bin/python3
Nov 28 09:48:17 np0005538515.localdomain sudo[281683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 09:48:17 np0005538515.localdomain python3[281685]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:48:18 np0005538515.localdomain subscription-manager[281686]: Unregistered machine with identity: c20224ed-ba86-41a6-a487-b9546587a93c
Nov 28 09:48:18 np0005538515.localdomain sudo[281683]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:26.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:26.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:26.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:48:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:27.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:48:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:48:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:48:27 np0005538515.localdomain podman[281688]: 2025-11-28 09:48:27.976263317 +0000 UTC m=+0.082223819 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:48:27 np0005538515.localdomain podman[281688]: 2025-11-28 09:48:27.987442462 +0000 UTC m=+0.093402914 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:48:28 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.253 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.253 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.253 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.254 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.283 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.284 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.284 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.284 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.285 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.745 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:48:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:48:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:48:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:48:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.960 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.962 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12517MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.963 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:48:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:28.963 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:48:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:48:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17703 "" "Go-http-client/1.1"
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.170 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.171 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.196 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.665 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.671 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.775 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.778 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:48:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:29.778 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:48:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:48:30.764 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:48:32 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34317 DF PROTO=TCP SPT=39882 DPT=9102 SEQ=3804483572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0B4C90000000001030307) 
Nov 28 09:48:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34318 DF PROTO=TCP SPT=39882 DPT=9102 SEQ=3804483572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0B8BA0000000001030307) 
Nov 28 09:48:33 np0005538515.localdomain sudo[281752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:48:33 np0005538515.localdomain sudo[281752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:33 np0005538515.localdomain sudo[281752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:33 np0005538515.localdomain sudo[281770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:48:33 np0005538515.localdomain sudo[281770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:33 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40383 DF PROTO=TCP SPT=52778 DPT=9102 SEQ=640193919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0BAFA0000000001030307) 
Nov 28 09:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:48:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 4850 writes, 21K keys, 4850 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4850 writes, 661 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 66 writes, 202 keys, 66 commit groups, 1.0 writes per commit group, ingest: 0.25 MB, 0.00 MB/s
                                                          Interval WAL: 66 writes, 24 syncs, 2.75 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:48:34 np0005538515.localdomain sudo[281770]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:34 np0005538515.localdomain sudo[281820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:48:34 np0005538515.localdomain sudo[281820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:34 np0005538515.localdomain sudo[281820]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:34 np0005538515.localdomain sudo[281838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:48:34 np0005538515.localdomain sudo[281838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:34 np0005538515.localdomain sudo[281838]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:35 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34319 DF PROTO=TCP SPT=39882 DPT=9102 SEQ=3804483572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0C0BA0000000001030307) 
Nov 28 09:48:36 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50504 DF PROTO=TCP SPT=48254 DPT=9102 SEQ=1993357398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0C4FA0000000001030307) 
Nov 28 09:48:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:48:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:48:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:48:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:48:37 np0005538515.localdomain sudo[281873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:37 np0005538515.localdomain sudo[281873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:37 np0005538515.localdomain sudo[281873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:38 np0005538515.localdomain podman[281892]: 2025-11-28 09:48:38.009328462 +0000 UTC m=+0.095780198 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:48:38 np0005538515.localdomain podman[281892]: 2025-11-28 09:48:38.017374141 +0000 UTC m=+0.103825807 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:48:38 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:48:38 np0005538515.localdomain podman[281889]: 2025-11-28 09:48:38.062816424 +0000 UTC m=+0.159960160 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:48:38 np0005538515.localdomain podman[281891]: 2025-11-28 09:48:38.106536714 +0000 UTC m=+0.196693484 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:48:38 np0005538515.localdomain podman[281889]: 2025-11-28 09:48:38.123203429 +0000 UTC m=+0.220347165 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 09:48:38 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:48:38 np0005538515.localdomain podman[281891]: 2025-11-28 09:48:38.140509513 +0000 UTC m=+0.230666333 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:48:38 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:48:38 np0005538515.localdomain podman[281890]: 2025-11-28 09:48:38.215335964 +0000 UTC m=+0.308996152 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:48:38 np0005538515.localdomain podman[281890]: 2025-11-28 09:48:38.294619881 +0000 UTC m=+0.388280129 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:48:38 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:48:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.2 total, 600.0 interval
                                                          Cumulative writes: 5854 writes, 25K keys, 5854 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5854 writes, 763 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 73 writes, 221 keys, 73 commit groups, 1.0 writes per commit group, ingest: 0.35 MB, 0.00 MB/s
                                                          Interval WAL: 73 writes, 34 syncs, 2.15 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:48:39 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34320 DF PROTO=TCP SPT=39882 DPT=9102 SEQ=3804483572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0D07A0000000001030307) 
Nov 28 09:48:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:48:42 np0005538515.localdomain podman[281975]: 2025-11-28 09:48:42.993566179 +0000 UTC m=+0.097989717 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:48:43 np0005538515.localdomain podman[281975]: 2025-11-28 09:48:43.02471166 +0000 UTC m=+0.129135268 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:48:43 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:48:47 np0005538515.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:f7:e2:83 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34321 DF PROTO=TCP SPT=39882 DPT=9102 SEQ=3804483572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5AE0F0FA0000000001030307) 
Nov 28 09:48:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:48:47 np0005538515.localdomain podman[281998]: 2025-11-28 09:48:47.993045184 +0000 UTC m=+0.096351615 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:48:48 np0005538515.localdomain podman[281998]: 2025-11-28 09:48:48.00684062 +0000 UTC m=+0.110147001 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:48:48 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:48:49 np0005538515.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:93:ca:2d MACPROTO=0800 SRC=3.138.197.221 DST=38.102.83.53 LEN=52 TOS=0x00 PREC=0x00 TTL=50 ID=37683 PROTO=TCP SPT=59427 DPT=9090 SEQ=1778054124 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40103030801010402) 
Nov 28 09:48:49 np0005538515.localdomain sudo[282017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:49 np0005538515.localdomain sudo[282017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:49 np0005538515.localdomain sudo[282017]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:50 np0005538515.localdomain sudo[282035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:50 np0005538515.localdomain sudo[282035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:50 np0005538515.localdomain sudo[282035]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:48:50.830 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:48:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:48:50.831 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:48:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:48:50.832 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:48:52 np0005538515.localdomain sudo[282053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:48:52 np0005538515.localdomain sudo[282053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:48:52 np0005538515.localdomain sudo[282053]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:57 np0005538515.localdomain sshd[282071]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:48:57 np0005538515.localdomain sshd[282071]: Accepted publickey for tripleo-admin from 192.168.122.11 port 39946 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:48:57 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 28 09:48:57 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 28 09:48:57 np0005538515.localdomain systemd-logind[763]: New session 62 of user tripleo-admin.
Nov 28 09:48:57 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:48:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:48:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:48:57 np0005538515.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Queued start job for default target Main User Target.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Created slice User Application Slice.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Reached target Paths.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Reached target Timers.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Starting D-Bus User Message Bus Socket...
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Starting Create User's Volatile Files and Directories...
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Reached target Sockets.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Finished Create User's Volatile Files and Directories.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Reached target Basic System.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Reached target Main User Target.
Nov 28 09:48:57 np0005538515.localdomain systemd[282075]: Startup finished in 147ms.
Nov 28 09:48:57 np0005538515.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 28 09:48:57 np0005538515.localdomain systemd[1]: Started Session 62 of User tripleo-admin.
Nov 28 09:48:57 np0005538515.localdomain sshd[282071]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:48:58 np0005538515.localdomain sudo[282216]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkxhopjzvtnxkulwqhecllinwhdqiors ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323337.903894-61474-123489591517724/AnsiballZ_blockinfile.py
Nov 28 09:48:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:48:58 np0005538515.localdomain sudo[282216]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:48:58 np0005538515.localdomain podman[282218]: 2025-11-28 09:48:58.438238756 +0000 UTC m=+0.074401069 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:48:58 np0005538515.localdomain podman[282218]: 2025-11-28 09:48:58.451491704 +0000 UTC m=+0.087654037 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 28 09:48:58 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:48:58 np0005538515.localdomain python3[282219]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:48:58 np0005538515.localdomain sudo[282216]: pam_unix(sudo:session): session closed for user root
Nov 28 09:48:58 np0005538515.localdomain systemd-journald[48427]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 80.5 (268 of 333 items), suggesting rotation.
Nov 28 09:48:58 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 09:48:58 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:48:58 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 09:48:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:48:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:48:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:48:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149991 "" "Go-http-client/1.1"
Nov 28 09:48:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:48:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17704 "" "Go-http-client/1.1"
Nov 28 09:48:59 np0005538515.localdomain sudo[282379]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsndnmoivtqajsfmjsifyunasbggjvah ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323338.71316-61488-133236728356915/AnsiballZ_systemd.py
Nov 28 09:48:59 np0005538515.localdomain sudo[282379]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:48:59 np0005538515.localdomain python3[282381]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 09:49:00 np0005538515.localdomain systemd[1]: Stopping Netfilter Tables...
Nov 28 09:49:00 np0005538515.localdomain systemd[1]: nftables.service: Deactivated successfully.
Nov 28 09:49:00 np0005538515.localdomain systemd[1]: Stopped Netfilter Tables.
Nov 28 09:49:00 np0005538515.localdomain systemd[1]: Starting Netfilter Tables...
Nov 28 09:49:00 np0005538515.localdomain systemd[1]: Finished Netfilter Tables.
Nov 28 09:49:00 np0005538515.localdomain sudo[282379]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:03 np0005538515.localdomain sudo[282406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:03 np0005538515.localdomain sudo[282406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:03 np0005538515.localdomain sudo[282406]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:04 np0005538515.localdomain sudo[282424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:04 np0005538515.localdomain sudo[282424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:04 np0005538515.localdomain sudo[282424]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:05 np0005538515.localdomain sudo[282442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:05 np0005538515.localdomain sudo[282442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:05 np0005538515.localdomain sudo[282442]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:07 np0005538515.localdomain sudo[282460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:07 np0005538515.localdomain sudo[282460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:07 np0005538515.localdomain sudo[282460]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:08 np0005538515.localdomain sudo[282478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:08 np0005538515.localdomain sudo[282478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:08 np0005538515.localdomain sudo[282478]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:49:08 np0005538515.localdomain podman[282499]: 2025-11-28 09:49:08.509256033 +0000 UTC m=+0.081139226 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:49:08 np0005538515.localdomain podman[282499]: 2025-11-28 09:49:08.519909742 +0000 UTC m=+0.091792985 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: tmp-crun.9TS4d5.mount: Deactivated successfully.
Nov 28 09:49:08 np0005538515.localdomain podman[282497]: 2025-11-28 09:49:08.567918234 +0000 UTC m=+0.143567984 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:49:08 np0005538515.localdomain podman[282498]: 2025-11-28 09:49:08.705853773 +0000 UTC m=+0.279299965 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:49:08 np0005538515.localdomain podman[282496]: 2025-11-28 09:49:08.714375616 +0000 UTC m=+0.292065808 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:49:08 np0005538515.localdomain podman[282496]: 2025-11-28 09:49:08.726251802 +0000 UTC m=+0.303941994 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:49:08 np0005538515.localdomain podman[282498]: 2025-11-28 09:49:08.741341449 +0000 UTC m=+0.314787651 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:49:08 np0005538515.localdomain podman[282497]: 2025-11-28 09:49:08.791861879 +0000 UTC m=+0.367511659 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:49:08 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:49:09 np0005538515.localdomain systemd[1]: tmp-crun.vzlSjZ.mount: Deactivated successfully.
Nov 28 09:49:09 np0005538515.localdomain sudo[282578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:09 np0005538515.localdomain sudo[282578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:09 np0005538515.localdomain sudo[282578]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:10 np0005538515.localdomain sudo[282596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:10 np0005538515.localdomain sudo[282596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:10 np0005538515.localdomain sudo[282596]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:10 np0005538515.localdomain sudo[282614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:49:10 np0005538515.localdomain sudo[282614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 2025-11-28 09:49:11.391997912 +0000 UTC m=+0.064266365 container create 5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_banach, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7, release=553, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container)
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: Started libpod-conmon-5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299.scope.
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 2025-11-28 09:49:11.364967527 +0000 UTC m=+0.037235990 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 2025-11-28 09:49:11.468598247 +0000 UTC m=+0.140866720 container init 5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_banach, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 2025-11-28 09:49:11.480603127 +0000 UTC m=+0.152871580 container start 5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_banach, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 2025-11-28 09:49:11.480846684 +0000 UTC m=+0.153115157 container attach 5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_banach, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:49:11 np0005538515.localdomain strange_banach[282688]: 167 167
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: libpod-5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299.scope: Deactivated successfully.
Nov 28 09:49:11 np0005538515.localdomain podman[282673]: 2025-11-28 09:49:11.485612882 +0000 UTC m=+0.157881385 container died 5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_banach, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:49:11 np0005538515.localdomain podman[282693]: 2025-11-28 09:49:11.588235421 +0000 UTC m=+0.089780043 container remove 5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_banach, io.buildah.version=1.33.12, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=553, ceph=True, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: libpod-conmon-5c39b91f0e214bc2d9ddd580c265dd0c78d922f43a89f77bf8a2c196fd47b299.scope: Deactivated successfully.
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:49:11 np0005538515.localdomain systemd-sysv-generator[282738]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:49:11 np0005538515.localdomain systemd-rc-local-generator[282734]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e69a33fd7597a815f85353cd5144c9887f9735726adf4c7894616900505306ec-merged.mount: Deactivated successfully.
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:49:12 np0005538515.localdomain systemd-rc-local-generator[282780]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:49:12 np0005538515.localdomain systemd-sysv-generator[282783]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: Starting Ceph mds.mds.np0005538515.anvatb for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:49:12 np0005538515.localdomain podman[282840]: 
Nov 28 09:49:12 np0005538515.localdomain podman[282840]: 2025-11-28 09:49:12.850442813 +0000 UTC m=+0.086280244 container create c8b5ff4ae49aa0080332f0f2f830f5e8e5b9a599ac27dabefec286af414abefd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538515-anvatb, version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True)
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: tmp-crun.KgNu5x.mount: Deactivated successfully.
Nov 28 09:49:12 np0005538515.localdomain podman[282840]: 2025-11-28 09:49:12.815601608 +0000 UTC m=+0.051439059 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:12 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bf1c1705508931bd14cca6ca9a7a98cdec79f0341c06bb9a11dbf9c764b05b6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:12 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bf1c1705508931bd14cca6ca9a7a98cdec79f0341c06bb9a11dbf9c764b05b6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:12 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bf1c1705508931bd14cca6ca9a7a98cdec79f0341c06bb9a11dbf9c764b05b6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:12 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bf1c1705508931bd14cca6ca9a7a98cdec79f0341c06bb9a11dbf9c764b05b6/merged/var/lib/ceph/mds/ceph-mds.np0005538515.anvatb supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:12 np0005538515.localdomain podman[282840]: 2025-11-28 09:49:12.931172086 +0000 UTC m=+0.167009567 container init c8b5ff4ae49aa0080332f0f2f830f5e8e5b9a599ac27dabefec286af414abefd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538515-anvatb, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, release=553, version=7, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:49:12 np0005538515.localdomain podman[282840]: 2025-11-28 09:49:12.94166785 +0000 UTC m=+0.177505271 container start c8b5ff4ae49aa0080332f0f2f830f5e8e5b9a599ac27dabefec286af414abefd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538515-anvatb, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_CLEAN=True, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, release=553, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:49:12 np0005538515.localdomain bash[282840]: c8b5ff4ae49aa0080332f0f2f830f5e8e5b9a599ac27dabefec286af414abefd
Nov 28 09:49:12 np0005538515.localdomain systemd[1]: Started Ceph mds.mds.np0005538515.anvatb for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:49:12 np0005538515.localdomain sudo[282614]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:13 np0005538515.localdomain ceph-mds[282859]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:49:13 np0005538515.localdomain ceph-mds[282859]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Nov 28 09:49:13 np0005538515.localdomain ceph-mds[282859]: main not setting numa affinity
Nov 28 09:49:13 np0005538515.localdomain ceph-mds[282859]: pidfile_write: ignore empty --pid-file
Nov 28 09:49:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538515-anvatb[282855]: starting mds.mds.np0005538515.anvatb at 
Nov 28 09:49:13 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Updating MDS map to version 7 from mon.1
Nov 28 09:49:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:49:13 np0005538515.localdomain podman[282878]: 2025-11-28 09:49:13.478397005 +0000 UTC m=+0.081365945 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:49:13 np0005538515.localdomain podman[282878]: 2025-11-28 09:49:13.492420257 +0000 UTC m=+0.095389167 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:49:13 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:49:14 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Updating MDS map to version 8 from mon.1
Nov 28 09:49:14 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Monitors have assigned me to become a standby.
Nov 28 09:49:18 np0005538515.localdomain sudo[282904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:18 np0005538515.localdomain sudo[282904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:49:18 np0005538515.localdomain sudo[282904]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:18 np0005538515.localdomain sudo[282928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:18 np0005538515.localdomain sudo[282928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:18 np0005538515.localdomain sudo[282928]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:18 np0005538515.localdomain podman[282921]: 2025-11-28 09:49:18.278229245 +0000 UTC m=+0.094379545 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:49:18 np0005538515.localdomain podman[282921]: 2025-11-28 09:49:18.294496336 +0000 UTC m=+0.110646696 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 09:49:18 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:49:18 np0005538515.localdomain sshd[281657]: Received disconnect from 38.102.83.114 port 54746:11: disconnected by user
Nov 28 09:49:18 np0005538515.localdomain sshd[281657]: Disconnected from user zuul 38.102.83.114 port 54746
Nov 28 09:49:18 np0005538515.localdomain sshd[281644]: pam_unix(sshd:session): session closed for user zuul
Nov 28 09:49:18 np0005538515.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Nov 28 09:49:18 np0005538515.localdomain systemd-logind[763]: Session 61 logged out. Waiting for processes to exit.
Nov 28 09:49:18 np0005538515.localdomain systemd-logind[763]: Removed session 61.
Nov 28 09:49:18 np0005538515.localdomain sudo[282957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:49:18 np0005538515.localdomain sudo[282957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:19 np0005538515.localdomain systemd[1]: tmp-crun.VEdhdK.mount: Deactivated successfully.
Nov 28 09:49:19 np0005538515.localdomain podman[283048]: 2025-11-28 09:49:19.250806307 +0000 UTC m=+0.097283794 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, release=553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, RELEASE=main, distribution-scope=public, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:49:19 np0005538515.localdomain podman[283048]: 2025-11-28 09:49:19.346378478 +0000 UTC m=+0.192855955 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Nov 28 09:49:19 np0005538515.localdomain sudo[282957]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:20 np0005538515.localdomain sudo[283133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:20 np0005538515.localdomain sudo[283133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:20 np0005538515.localdomain sudo[283133]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:20 np0005538515.localdomain sudo[283151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:49:20 np0005538515.localdomain sudo[283151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:20 np0005538515.localdomain sudo[283151]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:26.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:26.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:26.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:49:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:27.235 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:49:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:49:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:28.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:28.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:49:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:28.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:49:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:28.258 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:49:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:28.258 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:49:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:49:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:49:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:49:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152069 "" "Go-http-client/1.1"
Nov 28 09:49:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:49:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18194 "" "Go-http-client/1.1"
Nov 28 09:49:29 np0005538515.localdomain podman[283169]: 2025-11-28 09:49:29.057758335 +0000 UTC m=+0.158304071 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Nov 28 09:49:29 np0005538515.localdomain podman[283169]: 2025-11-28 09:49:29.076722629 +0000 UTC m=+0.177268385 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 28 09:49:29 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:49:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:29.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:29.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:29.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.265 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.266 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.266 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.266 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.267 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.758 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.960 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.962 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12499MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.962 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:49:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:30.963 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.052 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.053 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.074 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.546 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.553 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.571 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.574 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:49:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:49:31.574 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:49:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:49:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:49:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:49:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:49:39 np0005538515.localdomain podman[283235]: 2025-11-28 09:49:39.017548684 +0000 UTC m=+0.114797933 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 09:49:39 np0005538515.localdomain podman[283236]: 2025-11-28 09:49:39.078623624 +0000 UTC m=+0.172422107 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 09:49:39 np0005538515.localdomain podman[283235]: 2025-11-28 09:49:39.10839631 +0000 UTC m=+0.205645599 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:49:39 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:49:39 np0005538515.localdomain podman[283234]: 2025-11-28 09:49:39.125053141 +0000 UTC m=+0.226237860 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:49:39 np0005538515.localdomain podman[283234]: 2025-11-28 09:49:39.138278839 +0000 UTC m=+0.239463558 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:49:39 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:49:39 np0005538515.localdomain podman[283237]: 2025-11-28 09:49:39.183994306 +0000 UTC m=+0.274698893 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:49:39 np0005538515.localdomain podman[283236]: 2025-11-28 09:49:39.211619325 +0000 UTC m=+0.305417838 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:49:39 np0005538515.localdomain podman[283237]: 2025-11-28 09:49:39.221508859 +0000 UTC m=+0.312213426 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:49:39 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:49:39 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:49:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:49:43 np0005538515.localdomain podman[283321]: 2025-11-28 09:49:43.646135316 +0000 UTC m=+0.085496131 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:49:43 np0005538515.localdomain podman[283321]: 2025-11-28 09:49:43.653748631 +0000 UTC m=+0.093109446 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:49:43 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:49:45 np0005538515.localdomain sudo[283344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:45 np0005538515.localdomain sudo[283344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:45 np0005538515.localdomain sudo[283344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:45 np0005538515.localdomain sudo[283362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:49:45 np0005538515.localdomain sudo[283362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:46 np0005538515.localdomain sudo[283362]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:46 np0005538515.localdomain sudo[283412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:49:46 np0005538515.localdomain sudo[283412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:46 np0005538515.localdomain sudo[283412]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:46 np0005538515.localdomain sudo[283430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 -- inventory --format=json-pretty --filter-for-batch
Nov 28 09:49:46 np0005538515.localdomain sudo[283430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 2025-11-28 09:49:47.050771293 +0000 UTC m=+0.088811024 container create 2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_agnesi, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public)
Nov 28 09:49:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd.scope.
Nov 28 09:49:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 2025-11-28 09:49:47.013996112 +0000 UTC m=+0.052035853 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 2025-11-28 09:49:47.126383989 +0000 UTC m=+0.164423690 container init 2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_agnesi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, ceph=True, RELEASE=main, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55)
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 2025-11-28 09:49:47.135383146 +0000 UTC m=+0.173422857 container start 2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_agnesi, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, name=rhceph, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-09-24T08:57:55)
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 2025-11-28 09:49:47.135758907 +0000 UTC m=+0.173798658 container attach 2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_agnesi, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, ceph=True, name=rhceph, io.buildah.version=1.33.12)
Nov 28 09:49:47 np0005538515.localdomain blissful_agnesi[283508]: 167 167
Nov 28 09:49:47 np0005538515.localdomain systemd[1]: libpod-2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd.scope: Deactivated successfully.
Nov 28 09:49:47 np0005538515.localdomain podman[283492]: 2025-11-28 09:49:47.141605367 +0000 UTC m=+0.179645098 container died 2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_agnesi, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12)
Nov 28 09:49:47 np0005538515.localdomain podman[283513]: 2025-11-28 09:49:47.239905961 +0000 UTC m=+0.088476682 container remove 2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_agnesi, GIT_CLEAN=True, ceph=True, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vendor=Red Hat, Inc., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=)
Nov 28 09:49:47 np0005538515.localdomain systemd[1]: libpod-conmon-2a47286e1af48908d44d463f619c371f623df6447d47b9112591e56778add5dd.scope: Deactivated successfully.
Nov 28 09:49:47 np0005538515.localdomain podman[283534]: 
Nov 28 09:49:47 np0005538515.localdomain podman[283534]: 2025-11-28 09:49:47.445495727 +0000 UTC m=+0.050180415 container create e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_torvalds, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=)
Nov 28 09:49:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd.scope.
Nov 28 09:49:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:49:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9c528d1e805a12a374a1f3b902cddf269704c3c35566f6b31c54da0d04151aa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9c528d1e805a12a374a1f3b902cddf269704c3c35566f6b31c54da0d04151aa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9c528d1e805a12a374a1f3b902cddf269704c3c35566f6b31c54da0d04151aa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9c528d1e805a12a374a1f3b902cddf269704c3c35566f6b31c54da0d04151aa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:49:47 np0005538515.localdomain podman[283534]: 2025-11-28 09:49:47.506225375 +0000 UTC m=+0.110910073 container init e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_torvalds, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-type=git, io.buildah.version=1.33.12, ceph=True, build-date=2025-09-24T08:57:55, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main)
Nov 28 09:49:47 np0005538515.localdomain podman[283534]: 2025-11-28 09:49:47.518871604 +0000 UTC m=+0.123556322 container start e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_torvalds, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Nov 28 09:49:47 np0005538515.localdomain podman[283534]: 2025-11-28 09:49:47.519239216 +0000 UTC m=+0.123924114 container attach e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_torvalds, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, RELEASE=main, release=553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:49:47 np0005538515.localdomain podman[283534]: 2025-11-28 09:49:47.42674389 +0000 UTC m=+0.031428598 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7b6acf2ff29f315409e545cb09d88ef8e6695b6d9535098c83b8dada17045a83-merged.mount: Deactivated successfully.
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]: [
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:     {
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "available": false,
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "ceph_device": false,
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "lsm_data": {},
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "lvs": [],
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "path": "/dev/sr0",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "rejected_reasons": [
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "Insufficient space (<5GB)",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "Has a FileSystem"
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         ],
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         "sys_api": {
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "actuators": null,
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "device_nodes": "sr0",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "human_readable_size": "482.00 KB",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "id_bus": "ata",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "model": "QEMU DVD-ROM",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "nr_requests": "2",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "partitions": {},
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "path": "/dev/sr0",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "removable": "1",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "rev": "2.5+",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "ro": "0",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "rotational": "1",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "sas_address": "",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "sas_device_handle": "",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "scheduler_mode": "mq-deadline",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "sectors": 0,
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "sectorsize": "2048",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "size": 493568.0,
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "support_discard": "0",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "type": "disk",
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:             "vendor": "QEMU"
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:         }
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]:     }
Nov 28 09:49:48 np0005538515.localdomain admiring_torvalds[283549]: ]
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: libpod-e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd.scope: Deactivated successfully.
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: libpod-e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd.scope: Consumed 1.155s CPU time.
Nov 28 09:49:48 np0005538515.localdomain podman[283534]: 2025-11-28 09:49:48.63034748 +0000 UTC m=+1.235032168 container died e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_torvalds, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f9c528d1e805a12a374a1f3b902cddf269704c3c35566f6b31c54da0d04151aa-merged.mount: Deactivated successfully.
Nov 28 09:49:48 np0005538515.localdomain podman[285468]: 2025-11-28 09:49:48.777294281 +0000 UTC m=+0.110326435 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:49:48 np0005538515.localdomain podman[285462]: 2025-11-28 09:49:48.8000177 +0000 UTC m=+0.156815936 container remove e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_torvalds, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.33.12, name=rhceph, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: libpod-conmon-e38edb8798d76b311f83f67cf2ab8a8ca5df320f69bc123e8d898a74db50d6cd.scope: Deactivated successfully.
Nov 28 09:49:48 np0005538515.localdomain podman[285468]: 2025-11-28 09:49:48.820568902 +0000 UTC m=+0.153601076 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:49:48 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:49:48 np0005538515.localdomain sudo[283430]: pam_unix(sudo:session): session closed for user root
Nov 28 09:49:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:49:50.832 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:49:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:49:50.832 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:49:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:49:50.833 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:49:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:49:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:49:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:49:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:49:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:49:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152069 "" "Go-http-client/1.1"
Nov 28 09:49:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:49:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18198 "" "Go-http-client/1.1"
Nov 28 09:49:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:49:59 np0005538515.localdomain podman[285498]: 2025-11-28 09:49:59.983982665 +0000 UTC m=+0.089180234 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 28 09:50:00 np0005538515.localdomain podman[285498]: 2025-11-28 09:50:00.000696869 +0000 UTC m=+0.105894438 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6)
Nov 28 09:50:00 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:50:00 np0005538515.localdomain sshd[282091]: Received disconnect from 192.168.122.11 port 39946:11: disconnected by user
Nov 28 09:50:00 np0005538515.localdomain sshd[282091]: Disconnected from user tripleo-admin 192.168.122.11 port 39946
Nov 28 09:50:00 np0005538515.localdomain sshd[282071]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 28 09:50:00 np0005538515.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Nov 28 09:50:00 np0005538515.localdomain systemd[1]: session-62.scope: Consumed 1.378s CPU time.
Nov 28 09:50:00 np0005538515.localdomain systemd-logind[763]: Session 62 logged out. Waiting for processes to exit.
Nov 28 09:50:00 np0005538515.localdomain systemd-logind[763]: Removed session 62.
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:50:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:50:08 np0005538515.localdomain sudo[285517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:08 np0005538515.localdomain sudo[285517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:08 np0005538515.localdomain sudo[285517]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:50:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:50:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:50:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:50:09 np0005538515.localdomain podman[285537]: 2025-11-28 09:50:09.98255481 +0000 UTC m=+0.080647452 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 09:50:09 np0005538515.localdomain podman[285537]: 2025-11-28 09:50:09.992199936 +0000 UTC m=+0.090292578 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: tmp-crun.wM4piK.mount: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain podman[285536]: 2025-11-28 09:50:10.040910225 +0000 UTC m=+0.139098570 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: tmp-crun.sw4ABv.mount: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain podman[285538]: 2025-11-28 09:50:10.086667733 +0000 UTC m=+0.180140404 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:50:10 np0005538515.localdomain podman[285538]: 2025-11-28 09:50:10.097373103 +0000 UTC m=+0.190845784 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain podman[285536]: 2025-11-28 09:50:10.114359595 +0000 UTC m=+0.212547890 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain podman[285535]: 2025-11-28 09:50:10.192788268 +0000 UTC m=+0.292925243 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 28 09:50:10 np0005538515.localdomain podman[285535]: 2025-11-28 09:50:10.20749725 +0000 UTC m=+0.307634215 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Activating special unit Exit the Session...
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped target Main User Target.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped target Basic System.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped target Paths.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped target Sockets.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped target Timers.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Closed D-Bus User Message Bus Socket.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Removed slice User Application Slice.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Reached target Shutdown.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Finished Exit the Session.
Nov 28 09:50:10 np0005538515.localdomain systemd[282075]: Reached target Exit the Session.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: user-1003.slice: Consumed 1.809s CPU time.
Nov 28 09:50:10 np0005538515.localdomain sudo[285621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:10 np0005538515.localdomain sudo[285621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:10 np0005538515.localdomain sudo[285621]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:10 np0005538515.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 28 09:50:11 np0005538515.localdomain sudo[285640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:11 np0005538515.localdomain sudo[285640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:11 np0005538515.localdomain sudo[285640]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:50:13 np0005538515.localdomain podman[285658]: 2025-11-28 09:50:13.989416774 +0000 UTC m=+0.090605389 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:50:14 np0005538515.localdomain podman[285658]: 2025-11-28 09:50:14.002534257 +0000 UTC m=+0.103722772 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:50:14 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:50:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:50:18 np0005538515.localdomain podman[285681]: 2025-11-28 09:50:18.977876266 +0000 UTC m=+0.085026227 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:50:19 np0005538515.localdomain podman[285681]: 2025-11-28 09:50:19.016678649 +0000 UTC m=+0.123828550 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:50:19 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:50:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:25.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:25.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:50:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:25.267 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:50:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:25.268 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:25.268 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:50:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:25.289 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:26.304 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:26.305 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:26.305 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:50:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:50:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:28.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:50:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:50:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:50:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152069 "" "Go-http-client/1.1"
Nov 28 09:50:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:50:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18195 "" "Go-http-client/1.1"
Nov 28 09:50:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:29.235 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.367 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.368 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.368 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.385 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.386 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.386 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.387 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.387 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:50:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:30.857 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:50:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:50:30 np0005538515.localdomain podman[285722]: 2025-11-28 09:50:30.972410357 +0000 UTC m=+0.076453853 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Nov 28 09:50:31 np0005538515.localdomain podman[285722]: 2025-11-28 09:50:31.013558413 +0000 UTC m=+0.117601959 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:50:31 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.066 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.067 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12492MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.068 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.068 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.252 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.253 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.389 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.528 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.529 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.542 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.566 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:50:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:31.587 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.049 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.055 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.079 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.082 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.082 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.953 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:50:32.954 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:50:37 np0005538515.localdomain sudo[285765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:37 np0005538515.localdomain sudo[285765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:37 np0005538515.localdomain sudo[285765]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:38 np0005538515.localdomain sudo[285783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:38 np0005538515.localdomain sudo[285783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:38 np0005538515.localdomain sudo[285783]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:39 np0005538515.localdomain sudo[285801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:39 np0005538515.localdomain sudo[285801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:39 np0005538515.localdomain sudo[285801]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:50:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:50:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:50:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:50:40 np0005538515.localdomain podman[285820]: 2025-11-28 09:50:40.985317072 +0000 UTC m=+0.084464470 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:50:41 np0005538515.localdomain podman[285819]: 2025-11-28 09:50:40.963477611 +0000 UTC m=+0.069209881 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:50:41 np0005538515.localdomain podman[285827]: 2025-11-28 09:50:41.035198357 +0000 UTC m=+0.129061902 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:50:41 np0005538515.localdomain podman[285827]: 2025-11-28 09:50:41.072448553 +0000 UTC m=+0.166312058 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:50:41 np0005538515.localdomain podman[285821]: 2025-11-28 09:50:41.072555217 +0000 UTC m=+0.172162989 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:50:41 np0005538515.localdomain podman[285821]: 2025-11-28 09:50:41.081273025 +0000 UTC m=+0.180880807 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:50:41 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:50:41 np0005538515.localdomain podman[285819]: 2025-11-28 09:50:41.09966244 +0000 UTC m=+0.205394710 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 09:50:41 np0005538515.localdomain podman[285820]: 2025-11-28 09:50:41.099977689 +0000 UTC m=+0.199125167 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:50:41 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:50:41 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:50:41 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:50:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:50:44 np0005538515.localdomain podman[285903]: 2025-11-28 09:50:44.982805717 +0000 UTC m=+0.084292734 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:50:45 np0005538515.localdomain podman[285903]: 2025-11-28 09:50:45.019562239 +0000 UTC m=+0.121049266 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:50:45 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:50:45 np0005538515.localdomain sudo[285925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:50:45 np0005538515.localdomain sudo[285925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:45 np0005538515.localdomain sudo[285925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:45 np0005538515.localdomain sudo[285943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:45 np0005538515.localdomain sudo[285943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 2025-11-28 09:50:46.172114918 +0000 UTC m=+0.058061418 container create cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_franklin, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: Started libpod-conmon-cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf.scope.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 2025-11-28 09:50:46.139949508 +0000 UTC m=+0.025895978 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 2025-11-28 09:50:46.239730338 +0000 UTC m=+0.125676788 container init cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_franklin, GIT_BRANCH=main, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True)
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: tmp-crun.bKAriY.mount: Deactivated successfully.
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 2025-11-28 09:50:46.251657395 +0000 UTC m=+0.137603835 container start cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_franklin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, build-date=2025-09-24T08:57:55, architecture=x86_64)
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 2025-11-28 09:50:46.251952243 +0000 UTC m=+0.137898683 container attach cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_franklin, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7)
Nov 28 09:50:46 np0005538515.localdomain happy_franklin[286019]: 167 167
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: libpod-cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf.scope: Deactivated successfully.
Nov 28 09:50:46 np0005538515.localdomain podman[286004]: 2025-11-28 09:50:46.25832959 +0000 UTC m=+0.144276070 container died cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_franklin, ceph=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:50:46 np0005538515.localdomain podman[286024]: 2025-11-28 09:50:46.339122576 +0000 UTC m=+0.071436439 container remove cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_franklin, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: libpod-conmon-cced2c85b354e0bf58917b909c50dac32447226932b3de68393bcfec839cb5cf.scope: Deactivated successfully.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:50:46 np0005538515.localdomain systemd-rc-local-generator[286062]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:50:46 np0005538515.localdomain systemd-sysv-generator[286069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ca2e8ad921ba0d14b62f1213457bc3a508f256cebd9b246f8c28651d5dad8283-merged.mount: Deactivated successfully.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:50:46 np0005538515.localdomain systemd-rc-local-generator[286108]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:50:46 np0005538515.localdomain systemd-sysv-generator[286112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:46 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:50:47 np0005538515.localdomain systemd[1]: Starting Ceph mgr.np0005538515.yfkzhl for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:50:47 np0005538515.localdomain podman[286170]: 
Nov 28 09:50:47 np0005538515.localdomain podman[286170]: 2025-11-28 09:50:47.405105911 +0000 UTC m=+0.058400328 container create 351e4a94ab289bce7b4a85395ca5ce7cc15d9e39651879fa3b3c91ad3ed9ba78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, version=7, ceph=True, architecture=x86_64)
Nov 28 09:50:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ff37e480af970500c0eb217506b25b6534f9ae18ce1d3cf6ced4b6b59bce95f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ff37e480af970500c0eb217506b25b6534f9ae18ce1d3cf6ced4b6b59bce95f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ff37e480af970500c0eb217506b25b6534f9ae18ce1d3cf6ced4b6b59bce95f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ff37e480af970500c0eb217506b25b6534f9ae18ce1d3cf6ced4b6b59bce95f/merged/var/lib/ceph/mgr/ceph-np0005538515.yfkzhl supports timestamps until 2038 (0x7fffffff)
Nov 28 09:50:47 np0005538515.localdomain podman[286170]: 2025-11-28 09:50:47.464519019 +0000 UTC m=+0.117813436 container init 351e4a94ab289bce7b4a85395ca5ce7cc15d9e39651879fa3b3c91ad3ed9ba78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, RELEASE=main, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, distribution-scope=public, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:50:47 np0005538515.localdomain podman[286170]: 2025-11-28 09:50:47.471990539 +0000 UTC m=+0.125284936 container start 351e4a94ab289bce7b4a85395ca5ce7cc15d9e39651879fa3b3c91ad3ed9ba78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl, io.buildah.version=1.33.12, io.openshift.expose-services=, version=7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Nov 28 09:50:47 np0005538515.localdomain bash[286170]: 351e4a94ab289bce7b4a85395ca5ce7cc15d9e39651879fa3b3c91ad3ed9ba78
Nov 28 09:50:47 np0005538515.localdomain podman[286170]: 2025-11-28 09:50:47.379870755 +0000 UTC m=+0.033165222 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:50:47 np0005538515.localdomain systemd[1]: Started Ceph mgr.np0005538515.yfkzhl for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:50:47 np0005538515.localdomain sudo[285943]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: pidfile_write: ignore empty --pid-file
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'alerts'
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'balancer'
Nov 28 09:50:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:47.662+0000 7f3293659140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:47.727+0000 7f3293659140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:50:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'cephadm'
Nov 28 09:50:48 np0005538515.localdomain sudo[286213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:48 np0005538515.localdomain sudo[286213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:48 np0005538515.localdomain sudo[286213]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:48 np0005538515.localdomain sudo[286231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:50:48 np0005538515.localdomain sudo[286231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:48 np0005538515.localdomain sudo[286231]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:48 np0005538515.localdomain sudo[286249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:50:48 np0005538515.localdomain sudo[286249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'crash'
Nov 28 09:50:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:48.378+0000 7f3293659140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'dashboard'
Nov 28 09:50:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'devicehealth'
Nov 28 09:50:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:50:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'diskprediction_local'
Nov 28 09:50:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:48.940+0000 7f3293659140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain systemd[1]: tmp-crun.ziKcZ3.mount: Deactivated successfully.
Nov 28 09:50:49 np0005538515.localdomain podman[286342]: 2025-11-28 09:50:49.021561473 +0000 UTC m=+0.095693635 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, distribution-scope=public, RELEASE=main, name=rhceph, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64)
Nov 28 09:50:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 28 09:50:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 28 09:50:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]:   from numpy import show_config as show_numpy_config
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:49.085+0000 7f3293659140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'influx'
Nov 28 09:50:49 np0005538515.localdomain podman[286342]: 2025-11-28 09:50:49.125706927 +0000 UTC m=+0.199839109 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:49.147+0000 7f3293659140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'insights'
Nov 28 09:50:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'iostat'
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'k8sevents'
Nov 28 09:50:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:49.266+0000 7f3293659140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:50:49 np0005538515.localdomain podman[286375]: 2025-11-28 09:50:49.26979061 +0000 UTC m=+0.085360438 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:50:49 np0005538515.localdomain podman[286375]: 2025-11-28 09:50:49.27887972 +0000 UTC m=+0.094449558 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:50:49 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'localpool'
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'mds_autoscaler'
Nov 28 09:50:49 np0005538515.localdomain sudo[286249]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:49 np0005538515.localdomain sudo[286464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:50:49 np0005538515.localdomain sudo[286464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:49 np0005538515.localdomain sudo[286464]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'mirroring'
Nov 28 09:50:49 np0005538515.localdomain sudo[286482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:50:49 np0005538515.localdomain sudo[286482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'nfs'
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.026+0000 7f3293659140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'orchestrator'
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'osd_perf_query'
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.182+0000 7f3293659140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'osd_support'
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.250+0000 7f3293659140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.305+0000 7f3293659140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'pg_autoscaler'
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.370+0000 7f3293659140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'progress'
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.428+0000 7f3293659140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'prometheus'
Nov 28 09:50:50 np0005538515.localdomain sudo[286482]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.733+0000 7f3293659140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'rbd_support'
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:50.831+0000 7f3293659140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:50:50 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'restful'
Nov 28 09:50:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:50:50.832 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:50:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:50:50.833 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:50:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:50:50.833 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'rgw'
Nov 28 09:50:51 np0005538515.localdomain sudo[286531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:51 np0005538515.localdomain sudo[286531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:51 np0005538515.localdomain sudo[286531]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:51.195+0000 7f3293659140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'rook'
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:51.649+0000 7f3293659140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'selftest'
Nov 28 09:50:51 np0005538515.localdomain sudo[286549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:51 np0005538515.localdomain sudo[286549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:51 np0005538515.localdomain sudo[286549]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:51.714+0000 7f3293659140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'snap_schedule'
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'stats'
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'status'
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:51.926+0000 7f3293659140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'telegraf'
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:51.992+0000 7f3293659140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:50:51 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'telemetry'
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:52.138+0000 7f3293659140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'test_orchestrator'
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:52.296+0000 7f3293659140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'volumes'
Nov 28 09:50:52 np0005538515.localdomain sudo[286567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:52 np0005538515.localdomain sudo[286567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:52 np0005538515.localdomain sudo[286567]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:52.507+0000 7f3293659140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'zabbix'
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:50:52.574+0000 7f3293659140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c51e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 28 09:50:52 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.103:6800/705940825
Nov 28 09:50:56 np0005538515.localdomain sudo[286585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:56 np0005538515.localdomain sudo[286585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:56 np0005538515.localdomain sudo[286585]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:50:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:50:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:50:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:50:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:50:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:50:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154135 "" "Go-http-client/1.1"
Nov 28 09:50:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:50:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18682 "" "Go-http-client/1.1"
Nov 28 09:50:59 np0005538515.localdomain sudo[286604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:50:59 np0005538515.localdomain sudo[286604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538515.localdomain sudo[286604]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:59 np0005538515.localdomain sudo[286622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:50:59 np0005538515.localdomain sudo[286622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538515.localdomain sudo[286622]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:59 np0005538515.localdomain sudo[286640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:50:59 np0005538515.localdomain sudo[286640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538515.localdomain sudo[286640]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:59 np0005538515.localdomain sudo[286658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:50:59 np0005538515.localdomain sudo[286658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538515.localdomain sudo[286658]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:59 np0005538515.localdomain sudo[286676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:50:59 np0005538515.localdomain sudo[286676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538515.localdomain sudo[286676]: pam_unix(sudo:session): session closed for user root
Nov 28 09:50:59 np0005538515.localdomain sudo[286694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:50:59 np0005538515.localdomain sudo[286694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:50:59 np0005538515.localdomain sudo[286694]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:00 np0005538515.localdomain sudo[286728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286728]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:00 np0005538515.localdomain sudo[286746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286746]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:51:00 np0005538515.localdomain sudo[286764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286764]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:00 np0005538515.localdomain sudo[286782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286782]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:00 np0005538515.localdomain sudo[286800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286800]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:00 np0005538515.localdomain sudo[286818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286818]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:00 np0005538515.localdomain sudo[286836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286836]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:00 np0005538515.localdomain sudo[286854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286854]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:00 np0005538515.localdomain sudo[286888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:00 np0005538515.localdomain sudo[286906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286906]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:00 np0005538515.localdomain sudo[286924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:00 np0005538515.localdomain sudo[286924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:00 np0005538515.localdomain sudo[286924]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[286942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:01 np0005538515.localdomain sudo[286942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:51:01 np0005538515.localdomain sudo[286942]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[286961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:01 np0005538515.localdomain sudo[286961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[286961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain podman[286960]: 2025-11-28 09:51:01.1531501 +0000 UTC m=+0.089050251 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.openshift.expose-services=, vcs-type=git, config_id=edpm)
Nov 28 09:51:01 np0005538515.localdomain podman[286960]: 2025-11-28 09:51:01.171507964 +0000 UTC m=+0.107408165 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal)
Nov 28 09:51:01 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:51:01 np0005538515.localdomain sudo[286991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:01 np0005538515.localdomain sudo[286991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[286991]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:01 np0005538515.localdomain sudo[287016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287016]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:01 np0005538515.localdomain sudo[287034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287034]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:01 np0005538515.localdomain sudo[287068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287068]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:01 np0005538515.localdomain sudo[287086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287086]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:01 np0005538515.localdomain sudo[287104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287104]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:01 np0005538515.localdomain sudo[287122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287122]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:01 np0005538515.localdomain sudo[287140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287140]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:01 np0005538515.localdomain sudo[287158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287158]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:01 np0005538515.localdomain sudo[287176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:01 np0005538515.localdomain sudo[287176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:01 np0005538515.localdomain sudo[287176]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:02 np0005538515.localdomain sudo[287194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:02 np0005538515.localdomain sudo[287194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:02 np0005538515.localdomain sudo[287194]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:02 np0005538515.localdomain sudo[287228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:02 np0005538515.localdomain sudo[287228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:02 np0005538515.localdomain sudo[287228]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:02 np0005538515.localdomain sudo[287246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:02 np0005538515.localdomain sudo[287246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:02 np0005538515.localdomain sudo[287246]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:02 np0005538515.localdomain sudo[287264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:02 np0005538515.localdomain sudo[287264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:02 np0005538515.localdomain sudo[287264]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:02 np0005538515.localdomain sudo[287282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:02 np0005538515.localdomain sudo[287282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:02 np0005538515.localdomain sudo[287282]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:02 np0005538515.localdomain sudo[287300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:02 np0005538515.localdomain sudo[287300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 2025-11-28 09:51:03.121060243 +0000 UTC m=+0.071066577 container create d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_kilby, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, release=553, description=Red Hat Ceph Storage 7, name=rhceph)
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: Started libpod-conmon-d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051.scope.
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 2025-11-28 09:51:03.088400748 +0000 UTC m=+0.038407152 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 2025-11-28 09:51:03.197627709 +0000 UTC m=+0.147634043 container init d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_kilby, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, GIT_CLEAN=True, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, release=553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 2025-11-28 09:51:03.20740162 +0000 UTC m=+0.157407954 container start d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_kilby, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 2025-11-28 09:51:03.207756661 +0000 UTC m=+0.157763055 container attach d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_kilby, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:51:03 np0005538515.localdomain awesome_kilby[287375]: 167 167
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: libpod-d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051.scope: Deactivated successfully.
Nov 28 09:51:03 np0005538515.localdomain podman[287361]: 2025-11-28 09:51:03.211513236 +0000 UTC m=+0.161519600 container died d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_kilby, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main)
Nov 28 09:51:03 np0005538515.localdomain podman[287381]: 2025-11-28 09:51:03.314865176 +0000 UTC m=+0.089346540 container remove d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_kilby, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: libpod-conmon-d4ec5f80a73c7abebbf553cecace275fd9fa303dc6d22eb9379aeb94aa3a0051.scope: Deactivated successfully.
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 2025-11-28 09:51:03.436639043 +0000 UTC m=+0.080186189 container create ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_volhard, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: Started libpod-conmon-ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c.scope.
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:03 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0717618bce4df3df6ee50208187239f957ee9079bb4439469d3e46b86ed9a1a/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:03 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0717618bce4df3df6ee50208187239f957ee9079bb4439469d3e46b86ed9a1a/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:03 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0717618bce4df3df6ee50208187239f957ee9079bb4439469d3e46b86ed9a1a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:03 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0717618bce4df3df6ee50208187239f957ee9079bb4439469d3e46b86ed9a1a/merged/var/lib/ceph/mon/ceph-np0005538515 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 2025-11-28 09:51:03.404462182 +0000 UTC m=+0.048009348 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 2025-11-28 09:51:03.503980545 +0000 UTC m=+0.147527711 container init ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_volhard, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True)
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 2025-11-28 09:51:03.510835435 +0000 UTC m=+0.154382581 container start ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_volhard, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, version=7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 2025-11-28 09:51:03.511017211 +0000 UTC m=+0.154564367 container attach ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_volhard, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7)
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: libpod-ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c.scope: Deactivated successfully.
Nov 28 09:51:03 np0005538515.localdomain podman[287402]: 2025-11-28 09:51:03.620053165 +0000 UTC m=+0.263600311 container died ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_volhard, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Nov 28 09:51:03 np0005538515.localdomain podman[287443]: 2025-11-28 09:51:03.704766371 +0000 UTC m=+0.072785489 container remove ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_volhard, GIT_BRANCH=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True)
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: libpod-conmon-ab397cc5e5dfb411d1fdb02c41d1bdca80fd7aff9a4f89ed6bf65d8ab7cc8b1c.scope: Deactivated successfully.
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:51:03 np0005538515.localdomain systemd-rc-local-generator[287482]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:51:03 np0005538515.localdomain systemd-sysv-generator[287489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:03 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9261e114a82f80fca245808093d9f3507d752ead7cc93194adad348e3647f68c-merged.mount: Deactivated successfully.
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:51:04 np0005538515.localdomain systemd-sysv-generator[287528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:51:04 np0005538515.localdomain systemd-rc-local-generator[287525]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:51:04 np0005538515.localdomain systemd[1]: Starting Ceph mon.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:51:04 np0005538515.localdomain podman[287586]: 
Nov 28 09:51:05 np0005538515.localdomain podman[287586]: 2025-11-28 09:51:04.913765717 +0000 UTC m=+0.051884326 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:05 np0005538515.localdomain podman[287586]: 2025-11-28 09:51:05.333509711 +0000 UTC m=+0.471628300 container create a20fbb4af4b220d896878368414f00f458b36bd01f689cea18d6929c00ea38cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, release=553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:51:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04c7a1767f0a2009eee63539253f3f9f0fbc337fc72bb39a68aab86969f5acee/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04c7a1767f0a2009eee63539253f3f9f0fbc337fc72bb39a68aab86969f5acee/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04c7a1767f0a2009eee63539253f3f9f0fbc337fc72bb39a68aab86969f5acee/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04c7a1767f0a2009eee63539253f3f9f0fbc337fc72bb39a68aab86969f5acee/merged/var/lib/ceph/mon/ceph-np0005538515 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:51:05 np0005538515.localdomain podman[287586]: 2025-11-28 09:51:05.391343551 +0000 UTC m=+0.529462130 container init a20fbb4af4b220d896878368414f00f458b36bd01f689cea18d6929c00ea38cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:51:05 np0005538515.localdomain podman[287586]: 2025-11-28 09:51:05.400614646 +0000 UTC m=+0.538733255 container start a20fbb4af4b220d896878368414f00f458b36bd01f689cea18d6929c00ea38cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:51:05 np0005538515.localdomain bash[287586]: a20fbb4af4b220d896878368414f00f458b36bd01f689cea18d6929c00ea38cf
Nov 28 09:51:05 np0005538515.localdomain systemd[1]: Started Ceph mon.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pidfile_write: ignore empty --pid-file
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: load: jerasure load: lrc 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: RocksDB version: 7.9.2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Git sha 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: DB SUMMARY
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: DB Session ID:  18KD68ISQNH5R0YWI96C
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: CURRENT file:  CURRENT
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005538515/store.db dir, Total Num: 0, files: 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005538515/store.db: 000004.log size: 761 ; 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                         Options.error_if_exists: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.create_if_missing: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                                     Options.env: 0x5609e0af29e0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                                Options.info_log: 0x5609e14a4d20
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                              Options.statistics: (nil)
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                               Options.use_fsync: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                              Options.db_log_dir: 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                                 Options.wal_dir: 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                    Options.write_buffer_manager: 0x5609e14b5540
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.unordered_write: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                               Options.row_cache: None
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                              Options.wal_filter: None
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.two_write_queues: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.wal_compression: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.atomic_flush: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.max_background_jobs: 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.max_background_compactions: -1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.max_subcompactions: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.max_total_wal_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                          Options.max_open_files: -1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:       Options.compaction_readahead_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Compression algorithms supported:
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kZSTD supported: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kXpressCompression supported: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kBZip2Compression supported: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kLZ4Compression supported: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kZlibCompression supported: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         kSnappyCompression supported: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005538515/store.db/MANIFEST-000005
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 09:51:05 np0005538515.localdomain sudo[287300]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:           Options.merge_operator: 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:        Options.compaction_filter: None
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5609e14a4980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x5609e14a1350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:        Options.write_buffer_size: 33554432
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:  Options.max_write_buffer_number: 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.compression: NoCompression
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.num_levels: 7
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                           Options.bloom_locality: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                               Options.ttl: 2592000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                       Options.enable_blob_files: false
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                           Options.min_blob_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005538515/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5fedd929-5f7c-4f1d-86e7-c95af9bc6d32
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323465453140, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323465455933, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323465456143, "job": 1, "event": "recovery_finished"}
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5609e14c8e00
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: DB pointer 0x5609e15be000
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.09 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.09 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x5609e14a1350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.2e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 does not exist in monmap, will attempt to join an existing cluster
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: using public_addr v2:172.18.0.108:0/0 -> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: starting mon.np0005538515 rank -1 at public addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] at bind addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005538515 fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(???) e0 preinit fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing) e3 sync_obtain_latest_monmap
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing) e3 sync_obtain_latest_monmap obtained monmap e3
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).mds e17 new map
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        15
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-11-28T08:07:30.958224+0000
                                                           modified        2025-11-28T09:49:53.259185+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        83
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26449}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26449 members: 26449
                                                           [mds.mds.np0005538514.umgtoy{0:26449} state up:active seq 12 addr [v2:172.18.0.107:6808/1969410151,v1:172.18.0.107:6809/1969410151] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005538513.yljthc{-1:16968} state up:standby seq 1 addr [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005538515.anvatb{-1:26446} state up:standby seq 1 addr [v2:172.18.0.108:6808/2640180,v1:172.18.0.108:6809/2640180] compat {c=[1],r=[1],i=[17ff]}]
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).osd e85 crush map has features 3314933000852226048, adjusting msgr requires
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mgr to host np0005538513.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3816: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17100 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538514.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mgr to host np0005538514.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17106 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538515.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mgr to host np0005538515.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3817: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17112 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Saving service mgr spec with placement label:mgr
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3818: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17118 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3819: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17130 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538510.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mon to host np0005538510.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17136 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538510.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label _admin to host np0005538510.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3820: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17148 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538511.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mon to host np0005538511.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3821: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17160 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538511.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label _admin to host np0005538511.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538513.dsfdlx started
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mgrmap e12: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17166 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mon to host np0005538512.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3822: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538514.djozup started
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label _admin to host np0005538512.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mgrmap e13: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17178 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538513.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mon to host np0005538513.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3823: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538515.yfkzhl started
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17184 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538513.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label _admin to host np0005538513.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3824: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17190 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538514.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mon to host np0005538514.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17196 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538514.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label _admin to host np0005538514.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3825: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17202 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538515.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label mon to host np0005538515.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3826: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17208 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005538515.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Added label _admin to host np0005538515.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.17214 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Saving service mon spec with placement label:mon
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3827: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='client.26635 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3828: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: pgmap v3829: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Nov 28 09:51:05 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c51e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 28 09:51:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@-1(probing) e4  my rank is now 3 (was -1)
Nov 28 09:51:07 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:51:07 np0005538515.localdomain ceph-mon[287604]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 28 09:51:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Nov 28 09:51:08 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Nov 28 09:51:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Nov 28 09:51:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 28 09:51:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 28 09:51:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:10 np0005538515.localdomain ceph-mon[287604]: mgrc update_daemon_metadata mon.np0005538515 metadata {addrs=[v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005538515.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005538515.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: pgmap v3830: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538510 calling monitor election
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: pgmap v3831: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: pgmap v3832: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515 in quorum (ranks 0,1,2,3)
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: monmap epoch 4
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:51:05.886382+0000
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538510
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: osdmap e85: 6 total, 6 up, 6 in
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:51:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:51:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:51:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:51:12 np0005538515.localdomain systemd[1]: tmp-crun.C9GUYp.mount: Deactivated successfully.
Nov 28 09:51:12 np0005538515.localdomain podman[287644]: 2025-11-28 09:51:12.012720453 +0000 UTC m=+0.113831473 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='client.17228 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:12 np0005538515.localdomain podman[287646]: 2025-11-28 09:51:12.052787555 +0000 UTC m=+0.149431558 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:51:12 np0005538515.localdomain podman[287646]: 2025-11-28 09:51:12.087614827 +0000 UTC m=+0.184258900 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 09:51:12 np0005538515.localdomain podman[287647]: 2025-11-28 09:51:12.098673557 +0000 UTC m=+0.189933584 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:51:12 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:51:12 np0005538515.localdomain podman[287647]: 2025-11-28 09:51:12.108723506 +0000 UTC m=+0.199983563 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:51:12 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:51:12 np0005538515.localdomain podman[287645]: 2025-11-28 09:51:12.20731715 +0000 UTC m=+0.305355835 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:51:12 np0005538515.localdomain podman[287644]: 2025-11-28 09:51:12.230518314 +0000 UTC m=+0.331629314 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm)
Nov 28 09:51:12 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:51:12 np0005538515.localdomain podman[287645]: 2025-11-28 09:51:12.277613423 +0000 UTC m=+0.375652108 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:51:12 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Nov 28 09:51:12 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c4f20 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: paxos.3).electionLogic(18) init, last seen epoch 18
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 28 09:51:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 28 09:51:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 28 09:51:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:51:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 28 09:51:15 np0005538515.localdomain podman[287727]: 2025-11-28 09:51:15.976377667 +0000 UTC m=+0.078739543 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:51:15 np0005538515.localdomain podman[287727]: 2025-11-28 09:51:15.98977974 +0000 UTC m=+0.092141656 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:51:16 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:51:17 np0005538515.localdomain ceph-mds[282859]: mds.beacon.mds.np0005538515.anvatb missed beacon ack from the monitors
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: pgmap v3833: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538510 calling monitor election
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: pgmap v3834: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: pgmap v3835: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3,4)
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: monmap epoch 5
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:51:12.314668+0000
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538510
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: osdmap e85: 6 total, 6 up, 6 in
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e5 handle_auth_request failed to assign global_id
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e5 handle_auth_request failed to assign global_id
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Nov 28 09:51:17 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c5600 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: paxos.3).electionLogic(22) init, last seen epoch 22
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:18 np0005538515.localdomain sudo[287749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:51:18 np0005538515.localdomain sudo[287749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:18 np0005538515.localdomain sudo[287749]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:18 np0005538515.localdomain sudo[287767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:18 np0005538515.localdomain sudo[287767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:18 np0005538515.localdomain sudo[287767]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:18 np0005538515.localdomain sudo[287785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:51:18 np0005538515.localdomain sudo[287785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:19 np0005538515.localdomain systemd[1]: tmp-crun.s9vsoH.mount: Deactivated successfully.
Nov 28 09:51:19 np0005538515.localdomain podman[287874]: 2025-11-28 09:51:19.098254004 +0000 UTC m=+0.131659002 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, RELEASE=main, release=553, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:51:19 np0005538515.localdomain podman[287874]: 2025-11-28 09:51:19.224775637 +0000 UTC m=+0.258180645 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Nov 28 09:51:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:51:19 np0005538515.localdomain podman[287925]: 2025-11-28 09:51:19.470670711 +0000 UTC m=+0.089398471 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:51:19 np0005538515.localdomain podman[287925]: 2025-11-28 09:51:19.487587382 +0000 UTC m=+0.106315202 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:51:19 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:51:19 np0005538515.localdomain sudo[287785]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='client.26646 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: pgmap v3836: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538510 calling monitor election
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: pgmap v3837: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: pgmap v3838: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4,5)
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: monmap epoch 6
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:51:17.896997+0000
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538510
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: osdmap e85: 6 total, 6 up, 6 in
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: mgrmap e14: np0005538510.nzitwz(active, since 2h), standbys: np0005538512.zyhkxs, np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:23 np0005538515.localdomain sudo[288010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:23 np0005538515.localdomain sudo[288010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288010]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:23 np0005538515.localdomain sudo[288028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288028]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538515.localdomain sudo[288046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:23 np0005538515.localdomain sudo[288064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288064]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538515.localdomain sudo[288082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288082]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538515.localdomain sudo[288116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288116]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:23 np0005538515.localdomain sudo[288134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288134]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:51:23 np0005538515.localdomain sudo[288152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288152]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:23 np0005538515.localdomain sudo[288170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288170]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:23 np0005538515.localdomain sudo[288188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:23 np0005538515.localdomain sudo[288188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:23 np0005538515.localdomain sudo[288188]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538515.localdomain sudo[288206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538515.localdomain sudo[288206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538515.localdomain sudo[288206]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538510.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:24 np0005538515.localdomain sudo[288224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:24 np0005538515.localdomain sudo[288224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538515.localdomain sudo[288224]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538515.localdomain sudo[288242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538515.localdomain sudo[288242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538515.localdomain sudo[288242]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538515.localdomain sudo[288276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538515.localdomain sudo[288276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538515.localdomain sudo[288276]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538515.localdomain sudo[288294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:24 np0005538515.localdomain sudo[288294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538515.localdomain sudo[288294]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:24 np0005538515.localdomain sudo[288312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:24 np0005538515.localdomain sudo[288312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:24 np0005538515.localdomain sudo[288312]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:25 np0005538515.localdomain sudo[288330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:51:25 np0005538515.localdomain sudo[288330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:25 np0005538515.localdomain sudo[288330]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: pgmap v3839: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='client.17256 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538510 (monmap changed)...
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538510 on np0005538510.localdomain
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: from='client.34103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538510.nzitwz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: pgmap v3840: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538510.nzitwz (monmap changed)...
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538510.nzitwz on np0005538510.localdomain
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538510.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:51:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:51:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:51:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:28.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:28.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:28.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:28.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: from='client.26677 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538510 (monmap changed)...
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538510 on np0005538510.localdomain
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:51:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:51:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:51:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:51:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:51:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19157 "" "Go-http-client/1.1"
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: pgmap v3841: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.103:0/1822508892' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:30.235 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:30.259 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:30.260 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:51:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:30.260 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:51:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:30.280 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: pgmap v3842: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:51:30 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.103:0/748718462' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:51:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:31.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:31.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' 
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:51:31 np0005538515.localdomain podman[288348]: 2025-11-28 09:51:31.967786834 +0000 UTC m=+0.073082019 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6)
Nov 28 09:51:32 np0005538515.localdomain podman[288348]: 2025-11-28 09:51:32.011769648 +0000 UTC m=+0.117064903 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon).osd e85 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon).osd e85 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon).osd e86 e86: 6 total, 6 up, 6 in
Nov 28 09:51:32 np0005538515.localdomain sshd[26932]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain sshd[26894]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain sshd[26856]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain sshd[26951]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain sshd[26837]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain sshd[26798]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 23 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 21 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 19 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 24 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 16 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 18 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain sshd[26968]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain sshd[26779]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 25 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain sshd[26818]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 14 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain sshd[26913]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 17 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain sshd[26875]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain sshd[26987]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 23.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: session-26.scope: Consumed 3min 30.382s CPU time.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 22 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 20 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Session 26 logged out. Waiting for processes to exit.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 18.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 24.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 19.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 16.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 21.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 25.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 14.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 17.
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 22.
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.265 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.266 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 20.
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.266 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: Removed session 26.
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.266 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.267 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:51:32 np0005538515.localdomain sshd[288388]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:51:32 np0005538515.localdomain sshd[288388]: Accepted publickey for ceph-admin from 192.168.122.105 port 59446 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:51:32 np0005538515.localdomain systemd-logind[763]: New session 64 of user ceph-admin.
Nov 28 09:51:32 np0005538515.localdomain systemd[1]: Started Session 64 of User ceph-admin.
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1096407890' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain sshd[288388]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.709 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:51:32 np0005538515.localdomain sudo[288394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:32 np0005538515.localdomain sudo[288394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:32 np0005538515.localdomain sudo[288394]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.103:0/3703486687' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: Activating manager daemon np0005538512.zyhkxs
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: mgrmap e15: np0005538512.zyhkxs(active, starting, since 0.0649225s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: Manager daemon np0005538512.zyhkxs is now available
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/1096407890' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:32 np0005538515.localdomain sudo[288412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:51:32 np0005538515.localdomain sudo[288412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.896 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.897 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12043MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.897 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.897 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.966 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.967 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:51:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:32.989 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:51:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:33.454 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:51:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:33.461 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:51:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:33.491 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:51:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:33.494 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:51:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:33.495 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:51:33 np0005538515.localdomain systemd[1]: tmp-crun.Bbt5CA.mount: Deactivated successfully.
Nov 28 09:51:33 np0005538515.localdomain podman[288525]: 2025-11-28 09:51:33.733034524 +0000 UTC m=+0.107460148 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, GIT_CLEAN=True)
Nov 28 09:51:33 np0005538515.localdomain podman[288525]: 2025-11-28 09:51:33.850004692 +0000 UTC m=+0.224430336 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, RELEASE=main, name=rhceph)
Nov 28 09:51:34 np0005538515.localdomain ceph-mon[287604]: mgrmap e16: np0005538512.zyhkxs(active, since 1.0929s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:34 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/1691971170' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:34 np0005538515.localdomain ceph-mon[287604]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:34 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/1013956769' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:34 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/1839703657' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:34 np0005538515.localdomain sudo[288412]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:34.496 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:51:34.496 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:51:34 np0005538515.localdomain sudo[288642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:34 np0005538515.localdomain sudo[288642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:34 np0005538515.localdomain sudo[288642]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:35 np0005538515.localdomain sudo[288660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:51:35 np0005538515.localdomain sudo[288660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:51:33] ENGINE Bus STARTING
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:51:33] ENGINE Serving on https://172.18.0.105:7150
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:51:33] ENGINE Client ('172.18.0.105', 40464) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:51:33] ENGINE Serving on http://172.18.0.105:8765
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:51:33] ENGINE Bus STARTED
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: mgrmap e17: np0005538512.zyhkxs(active, since 2s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/3152215171' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon).osd e86 _set_new_cache_sizes cache_size:1019548993 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:35 np0005538515.localdomain sudo[288660]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:36 np0005538515.localdomain sudo[288709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:36 np0005538515.localdomain sudo[288709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:36 np0005538515.localdomain sudo[288709]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:36 np0005538515.localdomain sudo[288727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:51:36 np0005538515.localdomain sudo[288727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:36 np0005538515.localdomain sudo[288727]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/2421283346' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain sudo[288765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:37 np0005538515.localdomain sudo[288765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538515.localdomain sudo[288765]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538515.localdomain sudo[288783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:37 np0005538515.localdomain sudo[288783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538515.localdomain sudo[288783]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538515.localdomain sudo[288801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:37 np0005538515.localdomain sudo[288801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538515.localdomain sudo[288801]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: mgrmap e18: np0005538512.zyhkxs(active, since 4s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538510.nzitwz started
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 09:51:37 np0005538515.localdomain sudo[288819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:37 np0005538515.localdomain sudo[288819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538515.localdomain sudo[288819]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:37 np0005538515.localdomain sudo[288837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:37 np0005538515.localdomain sudo[288837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:37 np0005538515.localdomain sudo[288837]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:38 np0005538515.localdomain sudo[288871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288871]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:51:38 np0005538515.localdomain sudo[288889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:51:38 np0005538515.localdomain sudo[288907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288907]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:38 np0005538515.localdomain sudo[288925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288925]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:38 np0005538515.localdomain sudo[288943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288943]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538515.localdomain sudo[288961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:38 np0005538515.localdomain sudo[288979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288979]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[288997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538515.localdomain sudo[288997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[288997]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[289031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538515.localdomain sudo[289031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[289031]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[289049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:51:38 np0005538515.localdomain sudo[289049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[289049]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:38 np0005538515.localdomain sudo[289067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:38 np0005538515.localdomain sudo[289067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:38 np0005538515.localdomain sudo[289067]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:51:39 np0005538515.localdomain sudo[289085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289085]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:51:39 np0005538515.localdomain sudo[289103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289103]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538515.localdomain sudo[289121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289121]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538510.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: mgrmap e19: np0005538512.zyhkxs(active, since 6s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:51:39 np0005538515.localdomain sudo[289139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:39 np0005538515.localdomain sudo[289139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289139]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538515.localdomain sudo[289157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289157]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538515.localdomain sudo[289191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289191]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538515.localdomain sudo[289209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289209]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:39 np0005538515.localdomain sudo[289227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289227]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:39 np0005538515.localdomain sudo[289245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289245]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:51:39 np0005538515.localdomain sudo[289263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289263]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538515.localdomain sudo[289281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289281]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:39 np0005538515.localdomain sudo[289299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289299]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:39 np0005538515.localdomain sudo[289317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:39 np0005538515.localdomain sudo[289317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:39 np0005538515.localdomain sudo[289317]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538515.localdomain sudo[289351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:40 np0005538515.localdomain sudo[289351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538515.localdomain sudo[289351]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538515.localdomain sudo[289369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:51:40 np0005538515.localdomain sudo[289369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538515.localdomain sudo[289369]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: Updating np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:40 np0005538515.localdomain sudo[289387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:40 np0005538515.localdomain sudo[289387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:40 np0005538515.localdomain sudo[289387]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon).osd e86 _set_new_cache_sizes cache_size:1020041534 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:41 np0005538515.localdomain sudo[289405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:51:41 np0005538515.localdomain sudo[289405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:41 np0005538515.localdomain sudo[289405]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:51:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:51:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:51:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:51:43 np0005538515.localdomain podman[289424]: 2025-11-28 09:51:43.005029884 +0000 UTC m=+0.100597196 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:51:43 np0005538515.localdomain podman[289423]: 2025-11-28 09:51:43.056002413 +0000 UTC m=+0.152180994 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:51:43 np0005538515.localdomain podman[289424]: 2025-11-28 09:51:43.07347607 +0000 UTC m=+0.169043362 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 09:51:43 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:51:43 np0005538515.localdomain podman[289423]: 2025-11-28 09:51:43.142817273 +0000 UTC m=+0.238995944 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:51:43 np0005538515.localdomain systemd[1]: tmp-crun.Dx7RCO.mount: Deactivated successfully.
Nov 28 09:51:43 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:51:43 np0005538515.localdomain podman[289426]: 2025-11-28 09:51:43.163555801 +0000 UTC m=+0.252779947 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:51:43 np0005538515.localdomain podman[289426]: 2025-11-28 09:51:43.197113774 +0000 UTC m=+0.286337900 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:51:43 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:51:43 np0005538515.localdomain podman[289425]: 2025-11-28 09:51:43.217778129 +0000 UTC m=+0.311790603 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 09:51:43 np0005538515.localdomain podman[289425]: 2025-11-28 09:51:43.248922477 +0000 UTC m=+0.342934951 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:51:43 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:44 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:51:44 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:51:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:51:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: from='client.17376 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054386 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:51:46 np0005538515.localdomain podman[289506]: 2025-11-28 09:51:46.982874085 +0000 UTC m=+0.089707771 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:51:46 np0005538515.localdomain podman[289506]: 2025-11-28 09:51:46.990231672 +0000 UTC m=+0.097065368 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:51:47 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:51:47 np0005538515.localdomain ceph-mon[287604]: from='client.26761 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538510", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:51:48 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c5600 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 28 09:51:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@3(peon) e7  my rank is now 2 (was 3)
Nov 28 09:51:48 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:51:48 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:51:48 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c4f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 28 09:51:48 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:51:48 np0005538515.localdomain ceph-mon[287604]: paxos.2).electionLogic(26) init, last seen epoch 26
Nov 28 09:51:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:51:49 np0005538515.localdomain systemd[1]: tmp-crun.ooRVKE.mount: Deactivated successfully.
Nov 28 09:51:50 np0005538515.localdomain podman[289532]: 2025-11-28 09:51:50.018321943 +0000 UTC m=+0.125606246 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:51:50 np0005538515.localdomain podman[289532]: 2025-11-28 09:51:50.059650844 +0000 UTC m=+0.166935127 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:51:50 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:51:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:51:50.833 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:51:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:51:50.834 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:51:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:51:50.834 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:51:53 np0005538515.localdomain ceph-mds[282859]: mds.beacon.mds.np0005538515.anvatb missed beacon ack from the monitors
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon rm", "name": "np0005538510"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='client.34185 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538510"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Remove daemons mon.np0005538510
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Safe to remove mon.np0005538510: new quorum should be ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514', 'np0005538513'])
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Removing monitor np0005538510 from monmap...
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Removing daemon mon.np0005538510 from np0005538510.localdomain -- ports []
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538513 in quorum (ranks 0,1,2,4)
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: monmap epoch 7
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:51:48.586207+0000
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: mgrmap e19: np0005538512.zyhkxs(active, since 21s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Health check failed: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513 (MON_DOWN)
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]:     mon.np0005538514 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum)
Nov 28 09:51:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:51:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054723 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: paxos.2).electionLogic(29) init, last seen epoch 29, mid-election, bumping
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: from='client.34197 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: Removed label mon from host np0005538510.localdomain
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: monmap epoch 7
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:51:48.586207+0000
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005538513
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: mgrmap e19: np0005538512.zyhkxs(active, since 23s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513)
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: Cluster is now healthy
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:51:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:51:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:51:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='client.34169 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: Removed label mgr from host np0005538510.localdomain
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:51:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:51:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:51:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:51:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:51:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:51:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19169 "" "Go-http-client/1.1"
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538510.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: Removed label _admin from host np0005538510.localdomain
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:51:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:51:59 np0005538515.localdomain sudo[289551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:51:59 np0005538515.localdomain sudo[289551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:59 np0005538515.localdomain sudo[289551]: pam_unix(sudo:session): session closed for user root
Nov 28 09:51:59 np0005538515.localdomain sudo[289569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:51:59 np0005538515.localdomain sudo[289569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 2025-11-28 09:51:59.729124842 +0000 UTC m=+0.086402160 container create 6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_goldwasser, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:51:59 np0005538515.localdomain systemd[1]: Started libpod-conmon-6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d.scope.
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 2025-11-28 09:51:59.692830685 +0000 UTC m=+0.050108023 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:51:59 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 2025-11-28 09:51:59.817275484 +0000 UTC m=+0.174552802 container init 6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_goldwasser, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph)
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 2025-11-28 09:51:59.831240074 +0000 UTC m=+0.188517392 container start 6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_goldwasser, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 2025-11-28 09:51:59.83209994 +0000 UTC m=+0.189377298 container attach 6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_goldwasser, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git)
Nov 28 09:51:59 np0005538515.localdomain quizzical_goldwasser[289617]: 167 167
Nov 28 09:51:59 np0005538515.localdomain systemd[1]: libpod-6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d.scope: Deactivated successfully.
Nov 28 09:51:59 np0005538515.localdomain podman[289603]: 2025-11-28 09:51:59.835389511 +0000 UTC m=+0.192666839 container died 6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_goldwasser, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7)
Nov 28 09:51:59 np0005538515.localdomain podman[289622]: 2025-11-28 09:51:59.940961119 +0000 UTC m=+0.094363124 container remove 6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_goldwasser, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:51:59 np0005538515.localdomain systemd[1]: libpod-conmon-6ddad776a79fd3c8179346e9a16362a66f623d999dc53229245b966913a6f32d.scope: Deactivated successfully.
Nov 28 09:52:00 np0005538515.localdomain sudo[289569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:00 np0005538515.localdomain sudo[289638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:00 np0005538515.localdomain sudo[289638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:00 np0005538515.localdomain sudo[289638]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:00 np0005538515.localdomain sudo[289656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:00 np0005538515.localdomain sudo[289656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:52:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 2025-11-28 09:52:00.656378799 +0000 UTC m=+0.077993020 container create 39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_grothendieck, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:52:00 np0005538515.localdomain systemd[1]: Started libpod-conmon-39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b.scope.
Nov 28 09:52:00 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 2025-11-28 09:52:00.623397134 +0000 UTC m=+0.045011395 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 2025-11-28 09:52:00.723095162 +0000 UTC m=+0.144709383 container init 39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_grothendieck, RELEASE=main, name=rhceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 09:52:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-263a15e22c3ffb0b59d878eed662e156aae2c04cb0ed982484a20193bc87d945-merged.mount: Deactivated successfully.
Nov 28 09:52:00 np0005538515.localdomain goofy_grothendieck[289705]: 167 167
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 2025-11-28 09:52:00.739622411 +0000 UTC m=+0.161236632 container start 39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_grothendieck, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph)
Nov 28 09:52:00 np0005538515.localdomain systemd[1]: libpod-39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b.scope: Deactivated successfully.
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 2025-11-28 09:52:00.740861298 +0000 UTC m=+0.162475519 container attach 39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_grothendieck, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Nov 28 09:52:00 np0005538515.localdomain podman[289691]: 2025-11-28 09:52:00.743868671 +0000 UTC m=+0.165482962 container died 39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_grothendieck, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=553, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55)
Nov 28 09:52:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0fc72b264f4585e4f3143871a4f11f78923c2fe522743deffb5c4d4086fbf1ae-merged.mount: Deactivated successfully.
Nov 28 09:52:00 np0005538515.localdomain podman[289710]: 2025-11-28 09:52:00.83938207 +0000 UTC m=+0.088531225 container remove 39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_grothendieck, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, name=rhceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:00 np0005538515.localdomain systemd[1]: libpod-conmon-39732a853f2c7d6fb9059d31699247fd503158b67df963548a1b0548becb255b.scope: Deactivated successfully.
Nov 28 09:52:01 np0005538515.localdomain sudo[289656]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:01 np0005538515.localdomain sudo[289734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:01 np0005538515.localdomain sudo[289734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:01 np0005538515.localdomain sudo[289734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:01 np0005538515.localdomain sudo[289752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:01 np0005538515.localdomain sudo[289752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:52:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 2025-11-28 09:52:01.64709377 +0000 UTC m=+0.080577551 container create bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_elgamal, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553)
Nov 28 09:52:01 np0005538515.localdomain systemd[1]: Started libpod-conmon-bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5.scope.
Nov 28 09:52:01 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 2025-11-28 09:52:01.711381208 +0000 UTC m=+0.144864989 container init bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_elgamal, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=553, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.buildah.version=1.33.12)
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 2025-11-28 09:52:01.616720085 +0000 UTC m=+0.050203906 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 2025-11-28 09:52:01.721203679 +0000 UTC m=+0.154687460 container start bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_elgamal, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, release=553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:01 np0005538515.localdomain vigilant_elgamal[289802]: 167 167
Nov 28 09:52:01 np0005538515.localdomain systemd[1]: libpod-bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5.scope: Deactivated successfully.
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 2025-11-28 09:52:01.721589231 +0000 UTC m=+0.155073022 container attach bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_elgamal, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=)
Nov 28 09:52:01 np0005538515.localdomain podman[289787]: 2025-11-28 09:52:01.729140594 +0000 UTC m=+0.162624425 container died bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_elgamal, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:52:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4a9db5debd21dd23f23ad88a68baded3c4635497648aba31d8707735a64b93a3-merged.mount: Deactivated successfully.
Nov 28 09:52:01 np0005538515.localdomain podman[289807]: 2025-11-28 09:52:01.827987364 +0000 UTC m=+0.095543920 container remove bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_elgamal, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:01 np0005538515.localdomain systemd[1]: libpod-conmon-bc6687a5a9ab639ffe8d2ed7f4693b7aae602a8cfa238207a10564772d0110c5.scope: Deactivated successfully.
Nov 28 09:52:02 np0005538515.localdomain sudo[289752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:02 np0005538515.localdomain sudo[289830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:02 np0005538515.localdomain sudo[289830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:52:02 np0005538515.localdomain sudo[289830]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:02 np0005538515.localdomain sudo[289849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:02 np0005538515.localdomain sudo[289849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:02 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:52:02 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:52:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:02 np0005538515.localdomain podman[289848]: 2025-11-28 09:52:02.293555419 +0000 UTC m=+0.136740608 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 28 09:52:02 np0005538515.localdomain podman[289848]: 2025-11-28 09:52:02.311364166 +0000 UTC m=+0.154549415 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, distribution-scope=public)
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 2025-11-28 09:52:02.697983211 +0000 UTC m=+0.075922737 container create 865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_brattain, vcs-type=git, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: Started libpod-conmon-865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f.scope.
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: tmp-crun.B0bOVL.mount: Deactivated successfully.
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 2025-11-28 09:52:02.764644012 +0000 UTC m=+0.142583538 container init 865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_brattain, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 2025-11-28 09:52:02.669200866 +0000 UTC m=+0.047140452 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 2025-11-28 09:52:02.773522545 +0000 UTC m=+0.151462071 container start 865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_brattain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.buildah.version=1.33.12, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7)
Nov 28 09:52:02 np0005538515.localdomain mystifying_brattain[289918]: 167 167
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 2025-11-28 09:52:02.775248138 +0000 UTC m=+0.153187714 container attach 865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_brattain, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_BRANCH=main, vcs-type=git, name=rhceph, version=7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: libpod-865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f.scope: Deactivated successfully.
Nov 28 09:52:02 np0005538515.localdomain podman[289903]: 2025-11-28 09:52:02.779818658 +0000 UTC m=+0.157758244 container died 865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_brattain, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, build-date=2025-09-24T08:57:55, release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:02 np0005538515.localdomain podman[289924]: 2025-11-28 09:52:02.873413889 +0000 UTC m=+0.085624166 container remove 865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_brattain, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vendor=Red Hat, Inc.)
Nov 28 09:52:02 np0005538515.localdomain systemd[1]: libpod-conmon-865f135ef39cae66b53c1eefb1b69e2c89d7e11139e5613376a460ebac81965f.scope: Deactivated successfully.
Nov 28 09:52:02 np0005538515.localdomain sudo[289849]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:03 np0005538515.localdomain sudo[289940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:03 np0005538515.localdomain sudo[289940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:03 np0005538515.localdomain sudo[289940]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:03 np0005538515.localdomain sudo[289958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:03 np0005538515.localdomain sudo[289958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 2025-11-28 09:52:03.592184201 +0000 UTC m=+0.074946957 container create f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_bose, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, version=7, vcs-type=git)
Nov 28 09:52:03 np0005538515.localdomain systemd[1]: Started libpod-conmon-f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547.scope.
Nov 28 09:52:03 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 2025-11-28 09:52:03.651764494 +0000 UTC m=+0.134527250 container init f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_bose, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 2025-11-28 09:52:03.661152864 +0000 UTC m=+0.143915620 container start f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_bose, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:52:03 np0005538515.localdomain naughty_bose[290008]: 167 167
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 2025-11-28 09:52:03.561758475 +0000 UTC m=+0.044521261 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 2025-11-28 09:52:03.661437112 +0000 UTC m=+0.144199868 container attach f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_bose, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:03 np0005538515.localdomain systemd[1]: libpod-f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547.scope: Deactivated successfully.
Nov 28 09:52:03 np0005538515.localdomain podman[289993]: 2025-11-28 09:52:03.666679104 +0000 UTC m=+0.149441890 container died f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_bose, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7)
Nov 28 09:52:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-179602c39f11d7560e4f6aeeed47cfb20ec067f15ea64266f61327d8d1bb4fc0-merged.mount: Deactivated successfully.
Nov 28 09:52:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-50e577bb76dafc4aceb6fb7c44cc2aa3e42ca33e42ed4189eeadd14dd72d400e-merged.mount: Deactivated successfully.
Nov 28 09:52:03 np0005538515.localdomain podman[290014]: 2025-11-28 09:52:03.756547159 +0000 UTC m=+0.080892980 container remove f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_bose, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, name=rhceph, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 28 09:52:03 np0005538515.localdomain systemd[1]: libpod-conmon-f99acffce887091ada0b0b48efd749e2acf0a9b2f32ecb0cbc091c439c5fd547.scope: Deactivated successfully.
Nov 28 09:52:03 np0005538515.localdomain sudo[289958]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:03 np0005538515.localdomain sudo[290029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:03 np0005538515.localdomain sudo[290029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:03 np0005538515.localdomain sudo[290029]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:04 np0005538515.localdomain sudo[290047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:04 np0005538515.localdomain sudo[290047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 2025-11-28 09:52:04.474459645 +0000 UTC m=+0.081330203 container create 89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_jennings, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:04 np0005538515.localdomain systemd[1]: Started libpod-conmon-89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096.scope.
Nov 28 09:52:04 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 2025-11-28 09:52:04.4404917 +0000 UTC m=+0.047362298 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 2025-11-28 09:52:04.544434078 +0000 UTC m=+0.151304646 container init 89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_jennings, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True)
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 2025-11-28 09:52:04.552182497 +0000 UTC m=+0.159053055 container start 89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_jennings, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main)
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 2025-11-28 09:52:04.552435794 +0000 UTC m=+0.159306352 container attach 89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_jennings, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, build-date=2025-09-24T08:57:55)
Nov 28 09:52:04 np0005538515.localdomain great_jennings[290098]: 167 167
Nov 28 09:52:04 np0005538515.localdomain systemd[1]: libpod-89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096.scope: Deactivated successfully.
Nov 28 09:52:04 np0005538515.localdomain podman[290082]: 2025-11-28 09:52:04.557018236 +0000 UTC m=+0.163888794 container died 89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_jennings, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Nov 28 09:52:04 np0005538515.localdomain podman[290103]: 2025-11-28 09:52:04.657143805 +0000 UTC m=+0.085123039 container remove 89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_jennings, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True)
Nov 28 09:52:04 np0005538515.localdomain systemd[1]: libpod-conmon-89ded673970392884b8c2ed3020af8bc987ed7fbb85f96ba4415e62330009096.scope: Deactivated successfully.
Nov 28 09:52:04 np0005538515.localdomain sudo[290047]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:04 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0955258851a1cdebe2488657501f1ad6abbb78676e18f42475395c8d9af8e7d0-merged.mount: Deactivated successfully.
Nov 28 09:52:05 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:52:05 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:52:05 np0005538515.localdomain ceph-mon[287604]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:06 np0005538515.localdomain sudo[290119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:06 np0005538515.localdomain sudo[290119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290119]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:06 np0005538515.localdomain sudo[290137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290137]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538515.localdomain sudo[290155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290155]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:06 np0005538515.localdomain sudo[290173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290173]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538515.localdomain sudo[290191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290191]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538515.localdomain sudo[290225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290225]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:06 np0005538515.localdomain sudo[290243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290243]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:06 np0005538515.localdomain sudo[290261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:06 np0005538515.localdomain sudo[290261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:06 np0005538515.localdomain sudo[290261]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:07 np0005538515.localdomain sudo[290279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:07 np0005538515.localdomain sudo[290297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290297]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538515.localdomain sudo[290315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290315]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:07 np0005538515.localdomain sudo[290333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:07 np0005538515.localdomain sudo[290333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290333]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538515.localdomain sudo[290351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290351]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538515.localdomain sudo[290385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290385]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:07 np0005538515.localdomain sudo[290403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290403]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:07 np0005538515.localdomain sudo[290421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:07 np0005538515.localdomain sudo[290421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:07 np0005538515.localdomain sudo[290421]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Removing np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:09 np0005538515.localdomain ceph-mon[287604]: Removing daemon mgr.np0005538510.nzitwz from np0005538510.localdomain -- ports [9283, 8765]
Nov 28 09:52:09 np0005538515.localdomain ceph-mon[287604]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='client.26795 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538510.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: Added label _no_schedule to host np0005538510.localdomain
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538510.localdomain
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: Removing key for mgr.np0005538510.nzitwz
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"} : dispatch
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"}]': finished
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:10 np0005538515.localdomain sudo[290439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:10 np0005538515.localdomain sudo[290439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:10 np0005538515.localdomain sudo[290439]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:12 np0005538515.localdomain ceph-mon[287604]: from='client.26805 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538510.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:52:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:12 np0005538515.localdomain sudo[290457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:12 np0005538515.localdomain sudo[290457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:12 np0005538515.localdomain sudo[290457]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: Removing daemon crash.np0005538510 from np0005538510.localdomain -- ports []
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"} : dispatch
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"}]': finished
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"} : dispatch
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"}]': finished
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:13 np0005538515.localdomain sudo[290475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:13 np0005538515.localdomain sudo[290475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:13 np0005538515.localdomain sudo[290475]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:52:13 np0005538515.localdomain podman[290493]: 2025-11-28 09:52:13.376008259 +0000 UTC m=+0.076293569 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:52:13 np0005538515.localdomain podman[290495]: 2025-11-28 09:52:13.446131026 +0000 UTC m=+0.137639395 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:52:13 np0005538515.localdomain podman[290494]: 2025-11-28 09:52:13.402161673 +0000 UTC m=+0.096422187 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:52:13 np0005538515.localdomain podman[290493]: 2025-11-28 09:52:13.470426733 +0000 UTC m=+0.170712073 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:52:13 np0005538515.localdomain podman[290495]: 2025-11-28 09:52:13.478326926 +0000 UTC m=+0.169835315 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:52:13 np0005538515.localdomain podman[290494]: 2025-11-28 09:52:13.485365853 +0000 UTC m=+0.179626287 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:52:13 np0005538515.localdomain podman[290501]: 2025-11-28 09:52:13.436684415 +0000 UTC m=+0.125499592 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:52:13 np0005538515.localdomain podman[290501]: 2025-11-28 09:52:13.568717037 +0000 UTC m=+0.257532284 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:52:13 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538510.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: Removing key for client.crash.np0005538510.localdomain
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: Removed host np0005538510.localdomain
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/1979550132' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/1979550132' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.435873) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534436166, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 12689, "num_deletes": 779, "total_data_size": 20222003, "memory_usage": 20826280, "flush_reason": "Manual Compaction"}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534545346, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12286694, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 12694, "table_properties": {"data_size": 12229999, "index_size": 29773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25861, "raw_key_size": 267511, "raw_average_key_size": 25, "raw_value_size": 12056265, "raw_average_value_size": 1167, "num_data_blocks": 1136, "num_entries": 10325, "num_filter_entries": 10325, "num_deletions": 778, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 1764323465, "file_creation_time": 1764323534, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 109583 microseconds, and 33495 cpu microseconds.
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.545480) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12286694 bytes OK
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.545538) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.547353) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.547403) EVENT_LOG_v1 {"time_micros": 1764323534547397, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.547426) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 20139157, prev total WAL file size 20139906, number of live WAL files 2.
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.550900) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353039' seq:72057594037927935, type:22 .. '6D6772737461740033373631' seq:0, type:0; will stop at (end)
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1887B)]
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534550993, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12288581, "oldest_snapshot_seqno": -1}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9550 keys, 12274725 bytes, temperature: kUnknown
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534648175, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12274725, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12220028, "index_size": 29700, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 254014, "raw_average_key_size": 26, "raw_value_size": 12056323, "raw_average_value_size": 1262, "num_data_blocks": 1134, "num_entries": 9550, "num_filter_entries": 9550, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323534, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.648468) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12274725 bytes
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.650241) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.3 rd, 126.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.7, 0.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10330, records dropped: 780 output_compression: NoCompression
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.650268) EVENT_LOG_v1 {"time_micros": 1764323534650257, "job": 4, "event": "compaction_finished", "compaction_time_micros": 97266, "compaction_time_cpu_micros": 35352, "output_level": 6, "num_output_files": 1, "total_output_size": 12274725, "num_input_records": 10330, "num_output_records": 9550, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534652202, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534652251, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 28 09:52:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:14.550804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.209788) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535209836, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 291, "num_deletes": 251, "total_data_size": 90618, "memory_usage": 96504, "flush_reason": "Manual Compaction"}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535212993, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 58979, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12699, "largest_seqno": 12985, "table_properties": {"data_size": 57000, "index_size": 218, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5345, "raw_average_key_size": 19, "raw_value_size": 53076, "raw_average_value_size": 194, "num_data_blocks": 8, "num_entries": 273, "num_filter_entries": 273, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323534, "oldest_key_time": 1764323534, "file_creation_time": 1764323535, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 3245 microseconds, and 913 cpu microseconds.
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.213033) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 58979 bytes OK
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.213053) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.214851) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.214876) EVENT_LOG_v1 {"time_micros": 1764323535214870, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.214897) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 88450, prev total WAL file size 88450, number of live WAL files 2.
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.215416) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(57KB)], [15(11MB)]
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535215503, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12333704, "oldest_snapshot_seqno": -1}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9307 keys, 11191258 bytes, temperature: kUnknown
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535309062, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11191258, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11139798, "index_size": 27103, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23301, "raw_key_size": 249390, "raw_average_key_size": 26, "raw_value_size": 10981875, "raw_average_value_size": 1179, "num_data_blocks": 1021, "num_entries": 9307, "num_filter_entries": 9307, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323535, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.309496) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11191258 bytes
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.311623) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.6 rd, 119.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.7 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(398.9) write-amplify(189.7) OK, records in: 9823, records dropped: 516 output_compression: NoCompression
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.311657) EVENT_LOG_v1 {"time_micros": 1764323535311644, "job": 6, "event": "compaction_finished", "compaction_time_micros": 93747, "compaction_time_cpu_micros": 37380, "output_level": 6, "num_output_files": 1, "total_output_size": 11191258, "num_input_records": 9823, "num_output_records": 9307, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535311971, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535313969, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.215281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.314193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.314204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.314206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.314208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:52:15.314210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:52:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:52:17 np0005538515.localdomain podman[290576]: 2025-11-28 09:52:17.974786913 +0000 UTC m=+0.078661561 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:52:17 np0005538515.localdomain podman[290576]: 2025-11-28 09:52:17.983976226 +0000 UTC m=+0.087850824 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:52:17 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:52:18 np0005538515.localdomain ceph-mon[287604]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:52:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:52:20 np0005538515.localdomain ceph-mon[287604]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:52:20 np0005538515.localdomain podman[290599]: 2025-11-28 09:52:20.957227911 +0000 UTC m=+0.067137836 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:52:20 np0005538515.localdomain podman[290599]: 2025-11-28 09:52:20.96597083 +0000 UTC m=+0.075880765 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:52:20 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: Saving service mon spec with placement label:mon
Nov 28 09:52:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:23 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c51e0 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 28 09:52:23 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:52:23 np0005538515.localdomain ceph-mon[287604]: paxos.2).electionLogic(32) init, last seen epoch 32
Nov 28 09:52:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:23 np0005538515.localdomain sudo[290618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:23 np0005538515.localdomain sudo[290618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:23 np0005538515.localdomain sudo[290618]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:23 np0005538515.localdomain sudo[290636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 09:52:23 np0005538515.localdomain sudo[290636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290636]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:24 np0005538515.localdomain sudo[290675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290675]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:24 np0005538515.localdomain sudo[290693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290693]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:24 np0005538515.localdomain sudo[290711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290711]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:24 np0005538515.localdomain sudo[290729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290729]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='client.26581 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538513"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Remove daemons mon.np0005538513
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Safe to remove mon.np0005538513: new quorum should be ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514'] (from ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514'])
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Removing monitor np0005538513 from monmap...
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Removing daemon mon.np0005538513 from np0005538513.localdomain -- ports []
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: monmap epoch 8
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:52:23.566128+0000
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: mgrmap e19: np0005538512.zyhkxs(active, since 51s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:24 np0005538515.localdomain sudo[290747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:24 np0005538515.localdomain sudo[290747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290747]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:24 np0005538515.localdomain sudo[290781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290781]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:24 np0005538515.localdomain sudo[290799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290799]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:24 np0005538515.localdomain sudo[290817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:24 np0005538515.localdomain sudo[290817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:24 np0005538515.localdomain sudo[290817]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:25 np0005538515.localdomain sudo[290835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290835]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:25 np0005538515.localdomain sudo[290853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290853]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538515.localdomain sudo[290871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290871]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:25 np0005538515.localdomain sudo[290889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290889]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538515.localdomain sudo[290907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290907]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538515.localdomain sudo[290941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290941]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:25 np0005538515.localdomain sudo[290959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:25 np0005538515.localdomain sudo[290959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290959]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:25 np0005538515.localdomain sudo[290977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:25 np0005538515.localdomain sudo[290977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:25 np0005538515.localdomain sudo[290977]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:52:26 np0005538515.localdomain sudo[290995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:52:26 np0005538515.localdomain sudo[290995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:26 np0005538515.localdomain sudo[290995]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:27 np0005538515.localdomain ceph-mon[287604]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:52:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:52:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:52:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:28.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:28.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:52:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:52:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:52:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:52:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:52:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19164 "" "Go-http-client/1.1"
Nov 28 09:52:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:29.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:29.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:31.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:31.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:52:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:31.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:52:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:31.262 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:52:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:31.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:52:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:52:32 np0005538515.localdomain ceph-mon[287604]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:52:32 np0005538515.localdomain systemd[1]: tmp-crun.D2uw0o.mount: Deactivated successfully.
Nov 28 09:52:32 np0005538515.localdomain podman[291013]: 2025-11-28 09:52:32.976239133 +0000 UTC m=+0.084390947 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Nov 28 09:52:33 np0005538515.localdomain podman[291013]: 2025-11-28 09:52:33.020512795 +0000 UTC m=+0.128664629 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:52:33 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:52:33 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1522556299' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.438 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.439 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.439 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.439 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.440 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:52:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:52:33 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3135871719' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:33.926 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.145 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.147 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12056MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.147 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.148 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/1522556299' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/3135871719' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/2185645533' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.246 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.246 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.279 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.778 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.784 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.815 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.818 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:52:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:34.818 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/1971507369' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/2894340973' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2401285035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/2401285035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:52:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:52:36.819 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='client.34266 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538513.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:37 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Nov 28 09:52:39 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5646e12c4f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: paxos.2).electionLogic(34) init, last seen epoch 34
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: tmp-crun.wM0KEP.mount: Deactivated successfully.
Nov 28 09:52:43 np0005538515.localdomain podman[291078]: 2025-11-28 09:52:43.661689272 +0000 UTC m=+0.085809071 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:52:43 np0005538515.localdomain podman[291077]: 2025-11-28 09:52:43.726764234 +0000 UTC m=+0.150034368 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:52:43 np0005538515.localdomain podman[291079]: 2025-11-28 09:52:43.771557072 +0000 UTC m=+0.192744601 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 09:52:43 np0005538515.localdomain podman[291079]: 2025-11-28 09:52:43.778423513 +0000 UTC m=+0.199611062 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:52:43 np0005538515.localdomain podman[291077]: 2025-11-28 09:52:43.792395232 +0000 UTC m=+0.215665306 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:52:43 np0005538515.localdomain podman[291104]: 2025-11-28 09:52:43.692254622 +0000 UTC m=+0.080700584 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:52:43 np0005538515.localdomain podman[291078]: 2025-11-28 09:52:43.847369264 +0000 UTC m=+0.271489113 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:52:43 np0005538515.localdomain podman[291104]: 2025-11-28 09:52:43.876523381 +0000 UTC m=+0.264969353 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:52:43 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:52:44 np0005538515.localdomain systemd[1]: tmp-crun.dT0Ctr.mount: Deactivated successfully.
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: monmap epoch 9
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:52:39.794263+0000
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: mgrmap e19: np0005538512.zyhkxs(active, since 72s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: Health check failed: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514 (MON_DOWN)
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]:     mon.np0005538513 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:52:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:44 np0005538515.localdomain sudo[291165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:44 np0005538515.localdomain sudo[291165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:44 np0005538515.localdomain sudo[291165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:45 np0005538515.localdomain sudo[291183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:45 np0005538515.localdomain sudo[291183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 2025-11-28 09:52:45.52897265 +0000 UTC m=+0.070571652 container create 93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_clarke, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main)
Nov 28 09:52:45 np0005538515.localdomain systemd[1]: Started libpod-conmon-93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830.scope.
Nov 28 09:52:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 2025-11-28 09:52:45.600036467 +0000 UTC m=+0.141635429 container init 93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_clarke, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 2025-11-28 09:52:45.502114304 +0000 UTC m=+0.043713276 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 2025-11-28 09:52:45.613864791 +0000 UTC m=+0.155463753 container start 93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_clarke, version=7, RELEASE=main, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 2025-11-28 09:52:45.614299326 +0000 UTC m=+0.155898338 container attach 93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_clarke, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 28 09:52:45 np0005538515.localdomain modest_clarke[291232]: 167 167
Nov 28 09:52:45 np0005538515.localdomain systemd[1]: libpod-93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830.scope: Deactivated successfully.
Nov 28 09:52:45 np0005538515.localdomain podman[291217]: 2025-11-28 09:52:45.61836636 +0000 UTC m=+0.159965352 container died 93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_clarke, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, name=rhceph)
Nov 28 09:52:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d4e1b993447ac98af278f889cbe1b53e7d6f874ea92960d3d7adbaaef536a55c-merged.mount: Deactivated successfully.
Nov 28 09:52:45 np0005538515.localdomain podman[291237]: 2025-11-28 09:52:45.71617889 +0000 UTC m=+0.087825003 container remove 93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Nov 28 09:52:45 np0005538515.localdomain systemd[1]: libpod-conmon-93e8b5648c77f2f6bb54601f3934c570f39dc5c853de50baf3cdecb973c84830.scope: Deactivated successfully.
Nov 28 09:52:45 np0005538515.localdomain sudo[291183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:45 np0005538515.localdomain sudo[291253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:45 np0005538515.localdomain sudo[291253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:45 np0005538515.localdomain sudo[291253]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:52:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:45 np0005538515.localdomain sudo[291271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:45 np0005538515.localdomain sudo[291271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 2025-11-28 09:52:46.429997931 +0000 UTC m=+0.084539722 container create 8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_mendel, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:52:46 np0005538515.localdomain systemd[1]: Started libpod-conmon-8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b.scope.
Nov 28 09:52:46 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 2025-11-28 09:52:46.397727098 +0000 UTC m=+0.052268909 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 2025-11-28 09:52:46.500705296 +0000 UTC m=+0.155247087 container init 8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_mendel, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 2025-11-28 09:52:46.510820958 +0000 UTC m=+0.165362739 container start 8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_mendel, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 2025-11-28 09:52:46.511132517 +0000 UTC m=+0.165674298 container attach 8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_mendel, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True)
Nov 28 09:52:46 np0005538515.localdomain nice_mendel[291321]: 167 167
Nov 28 09:52:46 np0005538515.localdomain systemd[1]: libpod-8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b.scope: Deactivated successfully.
Nov 28 09:52:46 np0005538515.localdomain podman[291306]: 2025-11-28 09:52:46.514183471 +0000 UTC m=+0.168725282 container died 8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_mendel, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, ceph=True, release=553, name=rhceph, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:52:46 np0005538515.localdomain podman[291326]: 2025-11-28 09:52:46.626983601 +0000 UTC m=+0.097510801 container remove 8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_mendel, name=rhceph, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55)
Nov 28 09:52:46 np0005538515.localdomain systemd[1]: libpod-conmon-8c7f960d9b896964b2dc567d7ea8a87fad3d67721751b174a11b222c32a80c4b.scope: Deactivated successfully.
Nov 28 09:52:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3727b37edf48995ccfcc87ef5d2dc906625fd4c2baa0b3ae8d41eec1a78e6092-merged.mount: Deactivated successfully.
Nov 28 09:52:46 np0005538515.localdomain sudo[291271]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:46 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:52:46 np0005538515.localdomain ceph-mon[287604]: paxos.2).electionLogic(36) init, last seen epoch 36
Nov 28 09:52:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:52:46 np0005538515.localdomain sudo[291347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:46 np0005538515.localdomain sudo[291347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:46 np0005538515.localdomain sudo[291347]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:47 np0005538515.localdomain sudo[291365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:47 np0005538515.localdomain sudo[291365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 2025-11-28 09:52:47.54811181 +0000 UTC m=+0.082955033 container create fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_stonebraker, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.tags=rhceph ceph)
Nov 28 09:52:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748.scope.
Nov 28 09:52:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 2025-11-28 09:52:47.611766659 +0000 UTC m=+0.146609842 container init fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_stonebraker, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, vendor=Red Hat, Inc.)
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 2025-11-28 09:52:47.518407926 +0000 UTC m=+0.053251109 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:47 np0005538515.localdomain kind_stonebraker[291414]: 167 167
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 2025-11-28 09:52:47.624648195 +0000 UTC m=+0.159491378 container start fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_stonebraker, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:52:47 np0005538515.localdomain systemd[1]: libpod-fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748.scope: Deactivated successfully.
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 2025-11-28 09:52:47.624997095 +0000 UTC m=+0.159840288 container attach fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_stonebraker, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True)
Nov 28 09:52:47 np0005538515.localdomain podman[291399]: 2025-11-28 09:52:47.628041839 +0000 UTC m=+0.162885042 container died fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_stonebraker, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55)
Nov 28 09:52:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5b7dcc9e32d04e73258d034ae877554174af6478f92d6d4ccde03635c3390142-merged.mount: Deactivated successfully.
Nov 28 09:52:47 np0005538515.localdomain podman[291419]: 2025-11-28 09:52:47.731535173 +0000 UTC m=+0.095427547 container remove fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_stonebraker, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:52:47 np0005538515.localdomain systemd[1]: libpod-conmon-fbac79e01f3623ff3927a4fb79200e5236748ad9579c02b82dbeef56fe62c748.scope: Deactivated successfully.
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538511 calling monitor election
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: monmap epoch 9
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:52:39.794263+0000
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538511
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: osdmap e86: 6 total, 6 up, 6 in
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: mgrmap e19: np0005538512.zyhkxs(active, since 74s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514)
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: Cluster is now healthy
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:52:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:47 np0005538515.localdomain sudo[291365]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:48 np0005538515.localdomain sudo[291442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:48 np0005538515.localdomain sudo[291442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:52:48 np0005538515.localdomain sudo[291442]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:48 np0005538515.localdomain sudo[291461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:48 np0005538515.localdomain sudo[291461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:48 np0005538515.localdomain podman[291460]: 2025-11-28 09:52:48.215266256 +0000 UTC m=+0.108636924 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:52:48 np0005538515.localdomain podman[291460]: 2025-11-28 09:52:48.25213523 +0000 UTC m=+0.145505918 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:52:48 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:52:48 np0005538515.localdomain podman[291518]: 
Nov 28 09:52:48 np0005538515.localdomain podman[291518]: 2025-11-28 09:52:48.655547961 +0000 UTC m=+0.086036967 container create ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_banzai, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, release=553, name=rhceph)
Nov 28 09:52:48 np0005538515.localdomain systemd[1]: Started libpod-conmon-ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7.scope.
Nov 28 09:52:48 np0005538515.localdomain podman[291518]: 2025-11-28 09:52:48.618977816 +0000 UTC m=+0.049466822 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:48 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:48 np0005538515.localdomain podman[291518]: 2025-11-28 09:52:48.73968785 +0000 UTC m=+0.170176846 container init ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_banzai, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:48 np0005538515.localdomain podman[291518]: 2025-11-28 09:52:48.749855153 +0000 UTC m=+0.180344149 container start ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_banzai, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main)
Nov 28 09:52:48 np0005538515.localdomain podman[291518]: 2025-11-28 09:52:48.750204473 +0000 UTC m=+0.180693469 container attach ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_banzai, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, RELEASE=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_BRANCH=main)
Nov 28 09:52:48 np0005538515.localdomain intelligent_banzai[291534]: 167 167
Nov 28 09:52:48 np0005538515.localdomain systemd[1]: libpod-ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7.scope: Deactivated successfully.
Nov 28 09:52:48 np0005538515.localdomain podman[291541]: 2025-11-28 09:52:48.82516366 +0000 UTC m=+0.055346614 container died ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_banzai, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:48 np0005538515.localdomain podman[291541]: 2025-11-28 09:52:48.872931339 +0000 UTC m=+0.103114253 container remove ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_banzai, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, release=553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph)
Nov 28 09:52:48 np0005538515.localdomain systemd[1]: libpod-conmon-ad9bf77c17751781700c0c0d48a17139af47b781b2e53c1573f0a34feb3019d7.scope: Deactivated successfully.
Nov 28 09:52:48 np0005538515.localdomain sudo[291461]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:52:48 np0005538515.localdomain ceph-mon[287604]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:49 np0005538515.localdomain sudo[291557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:49 np0005538515.localdomain sudo[291557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:49 np0005538515.localdomain sudo[291557]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:49 np0005538515.localdomain sudo[291575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:49 np0005538515.localdomain sudo[291575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 2025-11-28 09:52:49.642870187 +0000 UTC m=+0.077525936 container create 43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_mahavira, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, release=553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, ceph=True)
Nov 28 09:52:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9bd864515b5a1f8b47ff2c371c21eef8657baf75a14d22d84d8d5354b9dae190-merged.mount: Deactivated successfully.
Nov 28 09:52:49 np0005538515.localdomain systemd[1]: Started libpod-conmon-43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95.scope.
Nov 28 09:52:49 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 2025-11-28 09:52:49.612004968 +0000 UTC m=+0.046660717 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 2025-11-28 09:52:49.721053873 +0000 UTC m=+0.155709612 container init 43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_mahavira, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_CLEAN=True)
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 2025-11-28 09:52:49.731032649 +0000 UTC m=+0.165688468 container start 43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_mahavira, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 2025-11-28 09:52:49.731473373 +0000 UTC m=+0.166129152 container attach 43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_mahavira, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Nov 28 09:52:49 np0005538515.localdomain busy_mahavira[291625]: 167 167
Nov 28 09:52:49 np0005538515.localdomain systemd[1]: libpod-43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95.scope: Deactivated successfully.
Nov 28 09:52:49 np0005538515.localdomain podman[291610]: 2025-11-28 09:52:49.734164355 +0000 UTC m=+0.168820134 container died 43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_mahavira, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:52:49 np0005538515.localdomain podman[291630]: 2025-11-28 09:52:49.825566948 +0000 UTC m=+0.080332853 container remove 43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_mahavira, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7)
Nov 28 09:52:49 np0005538515.localdomain systemd[1]: libpod-conmon-43fefb26711867a1d278c7efa6e7f0da448b05c281fce655805ccb0d2997dc95.scope: Deactivated successfully.
Nov 28 09:52:49 np0005538515.localdomain sudo[291575]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:50 np0005538515.localdomain sudo[291647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:50 np0005538515.localdomain sudo[291647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:50 np0005538515.localdomain sudo[291647]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:50 np0005538515.localdomain sudo[291665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:52:50 np0005538515.localdomain sudo[291665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4be03bd20a4323a9ca325359fcb32a3eb720231157e3fc56c0f99faba10a4ce0-merged.mount: Deactivated successfully.
Nov 28 09:52:50 np0005538515.localdomain sudo[291665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:52:50.835 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:52:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:52:50.837 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:52:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:52:50.837 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:52:51 np0005538515.localdomain ceph-mon[287604]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:51 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/438518273' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:52:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:52:51 np0005538515.localdomain systemd[1]: tmp-crun.gmsfX9.mount: Deactivated successfully.
Nov 28 09:52:52 np0005538515.localdomain podman[291715]: 2025-11-28 09:52:51.999975505 +0000 UTC m=+0.101197674 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:52:52 np0005538515.localdomain podman[291715]: 2025-11-28 09:52:52.036723115 +0000 UTC m=+0.137945244 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:52:52 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:52:52 np0005538515.localdomain sudo[291734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:52 np0005538515.localdomain sudo[291734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538515.localdomain sudo[291734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:52 np0005538515.localdomain sudo[291752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:52 np0005538515.localdomain sudo[291752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538515.localdomain sudo[291752]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:52 np0005538515.localdomain sudo[291770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:52 np0005538515.localdomain sudo[291770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:52 np0005538515.localdomain sudo[291770]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:53 np0005538515.localdomain sudo[291788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291788]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:53 np0005538515.localdomain sudo[291806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291806]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:53 np0005538515.localdomain sudo[291840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291840]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:53 np0005538515.localdomain sudo[291858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538515.localdomain sudo[291876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291876]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:53 np0005538515.localdomain sudo[291894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291894]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:53 np0005538515.localdomain sudo[291912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:53 np0005538515.localdomain sudo[291930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291930]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:53 np0005538515.localdomain sudo[291948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291948]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain sudo[291966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:53 np0005538515.localdomain sudo[291966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[291966]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: from='client.26865 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: Reconfig service osd.default_drive_group
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:53 np0005538515.localdomain sudo[292000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:53 np0005538515.localdomain sudo[292000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:53 np0005538515.localdomain sudo[292000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538515.localdomain sudo[292018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:54 np0005538515.localdomain sudo[292018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538515.localdomain sudo[292018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538515.localdomain sudo[292036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:54 np0005538515.localdomain sudo[292036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538515.localdomain sudo[292036]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e87 e87: 6 total, 6 up, 6 in
Nov 28 09:52:54 np0005538515.localdomain sshd[288388]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:52:54 np0005538515.localdomain systemd[1]: session-64.scope: Deactivated successfully.
Nov 28 09:52:54 np0005538515.localdomain systemd[1]: session-64.scope: Consumed 20.319s CPU time.
Nov 28 09:52:54 np0005538515.localdomain systemd-logind[763]: Session 64 logged out. Waiting for processes to exit.
Nov 28 09:52:54 np0005538515.localdomain systemd-logind[763]: Removed session 64.
Nov 28 09:52:54 np0005538515.localdomain sshd[292054]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:52:54 np0005538515.localdomain sshd[292054]: Accepted publickey for ceph-admin from 192.168.122.107 port 56446 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:52:54 np0005538515.localdomain systemd-logind[763]: New session 65 of user ceph-admin.
Nov 28 09:52:54 np0005538515.localdomain systemd[1]: Started Session 65 of User ceph-admin.
Nov 28 09:52:54 np0005538515.localdomain sshd[292054]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:52:54 np0005538515.localdomain sudo[292058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:54 np0005538515.localdomain sudo[292058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538515.localdomain sudo[292058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:54 np0005538515.localdomain sudo[292076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:52:54 np0005538515.localdomain sudo[292076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/1216930330' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: Activating manager daemon np0005538514.djozup
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: osdmap e87: 6 total, 6 up, 6 in
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/1216930330' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: mgrmap e20: np0005538514.djozup(active, starting, since 0.0638751s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: Manager daemon np0005538514.djozup is now available
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: removing stray HostCache host record np0005538510.localdomain.devices.0
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"}]': finished
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"}]': finished
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:52:54 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:52:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:52:55 np0005538515.localdomain podman[292163]: 2025-11-28 09:52:55.706786247 +0000 UTC m=+0.090587037 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, release=553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Nov 28 09:52:55 np0005538515.localdomain podman[292163]: 2025-11-28 09:52:55.812229642 +0000 UTC m=+0.196030412 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:52:56 np0005538515.localdomain ceph-mon[287604]: mgrmap e21: np0005538514.djozup(active, since 1.14261s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:56 np0005538515.localdomain ceph-mon[287604]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:56 np0005538515.localdomain sudo[292076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:56 np0005538515.localdomain sudo[292279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:56 np0005538515.localdomain sudo[292279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:56 np0005538515.localdomain sudo[292279]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:56 np0005538515.localdomain sudo[292297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:52:56 np0005538515.localdomain sudo[292297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:57 np0005538515.localdomain sudo[292297]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:52:55] ENGINE Bus STARTING
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:52:56] ENGINE Serving on http://172.18.0.107:8765
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:52:56] ENGINE Serving on https://172.18.0.107:7150
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:52:56] ENGINE Bus STARTED
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:52:56] ENGINE Client ('172.18.0.107', 40776) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:57 np0005538515.localdomain sudo[292348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:52:57 np0005538515.localdomain sudo[292348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:57 np0005538515.localdomain sudo[292348]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:57 np0005538515.localdomain sudo[292366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:52:57 np0005538515.localdomain sudo[292366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:52:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:52:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:52:57 np0005538515.localdomain sudo[292366]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:58 np0005538515.localdomain sudo[292402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292402]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:58 np0005538515.localdomain sudo[292420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292420]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538515.localdomain sudo[292438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: mgrmap e22: np0005538514.djozup(active, since 3s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:52:58 np0005538515.localdomain sudo[292456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:58 np0005538515.localdomain sudo[292456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292456]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538515.localdomain sudo[292474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292474]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538515.localdomain sudo[292508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292508]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:52:58 np0005538515.localdomain sudo[292526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292526]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:52:58 np0005538515.localdomain sudo[292544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292544]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:58 np0005538515.localdomain sudo[292562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:52:58 np0005538515.localdomain sudo[292580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292580]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain sudo[292598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:58 np0005538515.localdomain sudo[292598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292598]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:52:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:52:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:52:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:52:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:52:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19161 "" "Go-http-client/1.1"
Nov 28 09:52:58 np0005538515.localdomain sudo[292616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:58 np0005538515.localdomain sudo[292616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:58 np0005538515.localdomain sudo[292616]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292634]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292668]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292686]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: mgrmap e23: np0005538514.djozup(active, since 4s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx
Nov 28 09:52:59 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538512.zyhkxs started
Nov 28 09:52:59 np0005538515.localdomain sudo[292704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:52:59 np0005538515.localdomain sudo[292704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292704]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:52:59 np0005538515.localdomain sudo[292722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292722]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:52:59 np0005538515.localdomain sudo[292740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292740]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292758]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:52:59 np0005538515.localdomain sudo[292776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292776]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292794]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292828]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:52:59 np0005538515.localdomain sudo[292846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:52:59 np0005538515.localdomain sudo[292864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:52:59 np0005538515.localdomain sudo[292864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:52:59 np0005538515.localdomain sudo[292864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain sudo[292882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:00 np0005538515.localdomain sudo[292882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[292882]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain sudo[292900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:00 np0005538515.localdomain sudo[292900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[292900]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain sudo[292918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538515.localdomain sudo[292918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[292918]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain sudo[292936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:00 np0005538515.localdomain sudo[292936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[292936]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: mgrmap e24: np0005538514.djozup(active, since 6s), standbys: np0005538515.yfkzhl, np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:53:00 np0005538515.localdomain sudo[292954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538515.localdomain sudo[292954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[292954]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain sudo[292988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538515.localdomain sudo[292988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[292988]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:00 np0005538515.localdomain sudo[293006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:00 np0005538515.localdomain sudo[293006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[293006]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:00 np0005538515.localdomain sudo[293024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:00 np0005538515.localdomain sudo[293024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:00 np0005538515.localdomain sudo[293024]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 558 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 28 09:53:01 np0005538515.localdomain ceph-mon[287604]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 28 09:53:01 np0005538515.localdomain sudo[293042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:01 np0005538515.localdomain sudo[293042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:01 np0005538515.localdomain sudo[293042]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:02 np0005538515.localdomain ceph-mon[287604]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 0 B/s wr, 19 op/s
Nov 28 09:53:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:03 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:53:03 np0005538515.localdomain systemd[1]: tmp-crun.GoUW4O.mount: Deactivated successfully.
Nov 28 09:53:04 np0005538515.localdomain podman[293060]: 2025-11-28 09:53:04.003796041 +0000 UTC m=+0.099153781 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 28 09:53:04 np0005538515.localdomain podman[293060]: 2025-11-28 09:53:04.023745574 +0000 UTC m=+0.119103384 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 09:53:04 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:04 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:53:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: from='client.44134 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:07 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:53:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='client.44144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: Saving service mon spec with placement label:mon
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:09 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='client.26905 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538513", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:11 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/430380774' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e9 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3388432170' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e9 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3388432170' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/3388432170' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/3388432170' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.487662) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593487701, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2879, "num_deletes": 254, "total_data_size": 9409171, "memory_usage": 9832624, "flush_reason": "Manual Compaction"}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593536521, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5653023, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12990, "largest_seqno": 15864, "table_properties": {"data_size": 5640920, "index_size": 7584, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3461, "raw_key_size": 31389, "raw_average_key_size": 22, "raw_value_size": 5614350, "raw_average_value_size": 4086, "num_data_blocks": 327, "num_entries": 1374, "num_filter_entries": 1374, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323535, "oldest_key_time": 1764323535, "file_creation_time": 1764323593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 48916 microseconds, and 11224 cpu microseconds.
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.536574) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5653023 bytes OK
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.536598) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.541594) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.541645) EVENT_LOG_v1 {"time_micros": 1764323593541635, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.541667) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9395148, prev total WAL file size 9395148, number of live WAL files 2.
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.543645) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5520KB)], [18(10MB)]
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593543730, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 16844281, "oldest_snapshot_seqno": -1}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10132 keys, 14814666 bytes, temperature: kUnknown
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593658638, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14814666, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14756604, "index_size": 31657, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 269423, "raw_average_key_size": 26, "raw_value_size": 14583304, "raw_average_value_size": 1439, "num_data_blocks": 1215, "num_entries": 10132, "num_filter_entries": 10132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.659206) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14814666 bytes
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.661105) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 146.4 rd, 128.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.4, 10.7 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.6) write-amplify(2.6) OK, records in: 10681, records dropped: 549 output_compression: NoCompression
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.661138) EVENT_LOG_v1 {"time_micros": 1764323593661125, "job": 8, "event": "compaction_finished", "compaction_time_micros": 115024, "compaction_time_cpu_micros": 41806, "output_level": 6, "num_output_files": 1, "total_output_size": 14814666, "num_input_records": 10681, "num_output_records": 10132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593662159, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593663897, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.543304) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.664043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.664051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.664055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.664058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:13 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:13.664061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:53:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:53:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:53:13 np0005538515.localdomain podman[293081]: 2025-11-28 09:53:13.980226094 +0000 UTC m=+0.082539200 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 09:53:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:53:14 np0005538515.localdomain podman[293080]: 2025-11-28 09:53:14.03240382 +0000 UTC m=+0.136730968 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:53:14 np0005538515.localdomain podman[293081]: 2025-11-28 09:53:14.043388268 +0000 UTC m=+0.145701344 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:53:14 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:53:14 np0005538515.localdomain podman[293080]: 2025-11-28 09:53:14.094189501 +0000 UTC m=+0.198516639 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 28 09:53:14 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:53:14 np0005538515.localdomain podman[293082]: 2025-11-28 09:53:14.189688369 +0000 UTC m=+0.288921740 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:53:14 np0005538515.localdomain podman[293082]: 2025-11-28 09:53:14.219598509 +0000 UTC m=+0.318831850 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:53:14 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:53:14 np0005538515.localdomain podman[293121]: 2025-11-28 09:53:14.23850964 +0000 UTC m=+0.232448162 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:53:14 np0005538515.localdomain podman[293121]: 2025-11-28 09:53:14.252510051 +0000 UTC m=+0.246448533 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:53:14 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/3919583814' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:17 np0005538515.localdomain sudo[293165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:17 np0005538515.localdomain sudo[293165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:17 np0005538515.localdomain sudo[293165]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:17 np0005538515.localdomain sudo[293183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:17 np0005538515.localdomain sudo[293183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e88 e88: 6 total, 6 up, 6 in
Nov 28 09:53:17 np0005538515.localdomain ceph-mgr[286188]: mgr handle_mgr_map Activating!
Nov 28 09:53:17 np0005538515.localdomain ceph-mgr[286188]: mgr handle_mgr_map I am now activating
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: balancer
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_09:53:18
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Nov 28 09:53:18 np0005538515.localdomain sshd[292054]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:53:18 np0005538515.localdomain systemd-logind[763]: Session 65 logged out. Waiting for processes to exit.
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: cephadm
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: crash
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: devicehealth
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: iostat
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [devicehealth INFO root] Starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: nfs
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: orchestrator
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: pg_autoscaler
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: progress
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Loading...
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f321b495790>, <progress.module.GhostEvent object at 0x7f321b4957c0>, <progress.module.GhostEvent object at 0x7f321b4957f0>, <progress.module.GhostEvent object at 0x7f321b495820>, <progress.module.GhostEvent object at 0x7f321b495850>, <progress.module.GhostEvent object at 0x7f321b495880>, <progress.module.GhostEvent object at 0x7f321b4958b0>, <progress.module.GhostEvent object at 0x7f321b4958e0>, <progress.module.GhostEvent object at 0x7f321b495910>, <progress.module.GhostEvent object at 0x7f321b495940>, <progress.module.GhostEvent object at 0x7f321b495970>, <progress.module.GhostEvent object at 0x7f321b4959a0>, <progress.module.GhostEvent object at 0x7f321b4959d0>, <progress.module.GhostEvent object at 0x7f321b495a00>, <progress.module.GhostEvent object at 0x7f321b495a30>, <progress.module.GhostEvent object at 0x7f321b495a60>, <progress.module.GhostEvent object at 0x7f321b495a90>, <progress.module.GhostEvent object at 0x7f321b495ac0>, <progress.module.GhostEvent object at 0x7f321b495af0>, <progress.module.GhostEvent object at 0x7f321b495b20>, <progress.module.GhostEvent object at 0x7f321b495b50>, <progress.module.GhostEvent object at 0x7f321b495b80>, <progress.module.GhostEvent object at 0x7f321b495bb0>, <progress.module.GhostEvent object at 0x7f321b495be0>, <progress.module.GhostEvent object at 0x7f321b495c10>, <progress.module.GhostEvent object at 0x7f321b495c40>, <progress.module.GhostEvent object at 0x7f321b495c70>, <progress.module.GhostEvent object at 0x7f321b495ca0>, <progress.module.GhostEvent object at 0x7f321b495cd0>, <progress.module.GhostEvent object at 0x7f321b495d00>, <progress.module.GhostEvent object at 0x7f321b495d30>, <progress.module.GhostEvent object at 0x7f321b495d60>, <progress.module.GhostEvent object at 0x7f321b495d90>, <progress.module.GhostEvent object at 0x7f321b495dc0>, <progress.module.GhostEvent object at 0x7f321b495df0>, <progress.module.GhostEvent object at 0x7f321b495e20>, <progress.module.GhostEvent object at 0x7f321b495e50>, <progress.module.GhostEvent object at 0x7f321b495e80>, <progress.module.GhostEvent object at 0x7f321b495eb0>, <progress.module.GhostEvent object at 0x7f321b495ee0>, <progress.module.GhostEvent object at 0x7f321b495f10>, <progress.module.GhostEvent object at 0x7f321b495f40>, <progress.module.GhostEvent object at 0x7f321b495f70>, <progress.module.GhostEvent object at 0x7f321b495fa0>, <progress.module.GhostEvent object at 0x7f321b495fd0>, <progress.module.GhostEvent object at 0x7f321a430040>, <progress.module.GhostEvent object at 0x7f321a430070>, <progress.module.GhostEvent object at 0x7f321a4300a0>, <progress.module.GhostEvent object at 0x7f321a4300d0>, <progress.module.GhostEvent object at 0x7f321a430100>] historic events
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Loaded OSDMap, ready.
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] recovery thread starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] starting setup
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: rbd_support
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: restful
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: status
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [restful INFO root] server_addr: :: server_port: 8003
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [restful WARNING root] server not running: no certificate configured
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: telemetry
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] PerfHandler: starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: vms, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: volumes, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: images, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: backups, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TaskHandler: starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] setup complete
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: volumes
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.260+0000 7f3200b77640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.260+0000 7f3200b77640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.260+0000 7f3200b77640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.260+0000 7f3200b77640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.260+0000 7f3200b77640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.261+0000 7f3204b7f640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.261+0000 7f3204b7f640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.261+0000 7f3204b7f640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.261+0000 7f3204b7f640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:53:18.261+0000 7f3204b7f640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 2025-11-28 09:53:18.304220935 +0000 UTC m=+0.072048668 container create d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_chaplygin, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: Started libpod-conmon-d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52.scope.
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 2025-11-28 09:53:18.282566838 +0000 UTC m=+0.050394661 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:18 np0005538515.localdomain sshd[293386]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: tmp-crun.cvoGQF.mount: Deactivated successfully.
Nov 28 09:53:18 np0005538515.localdomain podman[293369]: 2025-11-28 09:53:18.39635359 +0000 UTC m=+0.068893521 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:53:18 np0005538515.localdomain podman[293369]: 2025-11-28 09:53:18.405596254 +0000 UTC m=+0.078136185 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 2025-11-28 09:53:18.422467033 +0000 UTC m=+0.190294776 container init d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_chaplygin, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 2025-11-28 09:53:18.434512143 +0000 UTC m=+0.202339886 container start d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_chaplygin, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, RELEASE=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, distribution-scope=public, release=553, architecture=x86_64)
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 2025-11-28 09:53:18.43473174 +0000 UTC m=+0.202559503 container attach d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_chaplygin, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=)
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: libpod-d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52.scope: Deactivated successfully.
Nov 28 09:53:18 np0005538515.localdomain exciting_chaplygin[293379]: 167 167
Nov 28 09:53:18 np0005538515.localdomain podman[293324]: 2025-11-28 09:53:18.439635431 +0000 UTC m=+0.207463194 container died d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_chaplygin, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Nov 28 09:53:18 np0005538515.localdomain sshd[293386]: Accepted publickey for ceph-admin from 192.168.122.108 port 56492 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:53:18 np0005538515.localdomain systemd-logind[763]: New session 66 of user ceph-admin.
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: Started Session 66 of User ceph-admin.
Nov 28 09:53:18 np0005538515.localdomain sshd[293386]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:53:18 np0005538515.localdomain podman[293401]: 2025-11-28 09:53:18.517395363 +0000 UTC m=+0.070998735 container remove d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_chaplygin, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: libpod-conmon-d9255291562c6a222a500ed68e4a06693d1234b5a66802f27ea28d27fc845f52.scope: Deactivated successfully.
Nov 28 09:53:18 np0005538515.localdomain sudo[293416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:18 np0005538515.localdomain sudo[293416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:18 np0005538515.localdomain sudo[293416]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:18 np0005538515.localdomain sudo[293183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Nov 28 09:53:18 np0005538515.localdomain systemd[1]: session-65.scope: Consumed 6.588s CPU time.
Nov 28 09:53:18 np0005538515.localdomain systemd-logind[763]: Removed session 65.
Nov 28 09:53:18 np0005538515.localdomain sudo[293438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:53:18 np0005538515.localdomain sudo[293438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/1038640921' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: Activating manager daemon np0005538515.yfkzhl
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/1038640921' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: mgrmap e25: np0005538515.yfkzhl(active, starting, since 0.0424963s), standbys: np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: Manager daemon np0005538515.yfkzhl is now available
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:53:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-50a76ba9299418af860d305dcb8580e6d6bdc14f92426abdd892d36b9e2159d5-merged.mount: Deactivated successfully.
Nov 28 09:53:19 np0005538515.localdomain systemd[1]: tmp-crun.mGVAsw.mount: Deactivated successfully.
Nov 28 09:53:19 np0005538515.localdomain podman[293533]: 2025-11-28 09:53:19.535125534 +0000 UTC m=+0.109167219 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=)
Nov 28 09:53:19 np0005538515.localdomain podman[293533]: 2025-11-28 09:53:19.639489525 +0000 UTC m=+0.213531220 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public)
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:53:19] ENGINE Bus STARTING
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:53:19] ENGINE Bus STARTING
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:53:19] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:53:19] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:53:19] ENGINE Client ('172.18.0.108', 34804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:53:19] ENGINE Client ('172.18.0.108', 34804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:53:19] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:53:19] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:53:19] ENGINE Bus STARTED
Nov 28 09:53:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:53:19] ENGINE Bus STARTED
Nov 28 09:53:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:20 np0005538515.localdomain ceph-mon[287604]: mgrmap e26: np0005538515.yfkzhl(active, since 1.07766s), standbys: np0005538510.nzitwz, np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:20 np0005538515.localdomain ceph-mon[287604]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:20 np0005538515.localdomain ceph-mon[287604]: mgrmap e27: np0005538515.yfkzhl(active, since 1.58689s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:20 np0005538515.localdomain sudo[293438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:20 np0005538515.localdomain ceph-mgr[286188]: [devicehealth INFO root] Check health
Nov 28 09:53:20 np0005538515.localdomain sudo[293688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:20 np0005538515.localdomain sudo[293688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:20 np0005538515.localdomain sudo[293688]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:20 np0005538515.localdomain sudo[293706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:53:20 np0005538515.localdomain sudo[293706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:53:19] ENGINE Bus STARTING
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:53:19] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:53:19] ENGINE Client ('172.18.0.108', 34804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:53:19] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:53:19] ENGINE Bus STARTED
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: Cluster is now healthy
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:21 np0005538515.localdomain sudo[293706]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538515.localdomain sudo[293756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:21 np0005538515.localdomain sudo[293756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538515.localdomain sudo[293756]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538515.localdomain sudo[293774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:53:21 np0005538515.localdomain sudo[293774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:21 np0005538515.localdomain sudo[293774]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:21 np0005538515.localdomain sudo[293810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:53:21 np0005538515.localdomain sudo[293810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538515.localdomain sudo[293810]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:21 np0005538515.localdomain sudo[293828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:53:21 np0005538515.localdomain sudo[293828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:21 np0005538515.localdomain sudo[293828]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[293846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538515.localdomain sudo[293846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:22 np0005538515.localdomain sudo[293846]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[293864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:22 np0005538515.localdomain sudo[293864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:53:22 np0005538515.localdomain sudo[293864]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[293888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538515.localdomain sudo[293888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[293888]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain podman[293881]: 2025-11-28 09:53:22.213376283 +0000 UTC m=+0.084708857 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 09:53:22 np0005538515.localdomain podman[293881]: 2025-11-28 09:53:22.230476459 +0000 UTC m=+0.101809033 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 28 09:53:22 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:53:22 np0005538515.localdomain sudo[293935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538515.localdomain sudo[293935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[293935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: mgrmap e28: np0005538515.yfkzhl(active, since 3s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538512.zyhkxs
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain sudo[293953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:22 np0005538515.localdomain sudo[293953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[293953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain sudo[293971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain sudo[293971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[293971]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:22 np0005538515.localdomain sudo[293989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:22 np0005538515.localdomain sudo[293989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[293989]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[294007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:22 np0005538515.localdomain sudo[294007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[294007]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[294025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:22 np0005538515.localdomain sudo[294025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[294025]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[294043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:22 np0005538515.localdomain sudo[294043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[294043]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain sudo[294061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:22 np0005538515.localdomain sudo[294061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:22 np0005538515.localdomain sudo[294061]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:23 np0005538515.localdomain sudo[294095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294095]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mgr.np0005538514.djozup 172.18.0.107:0/1408760265; not ready for session (expect reconnect)
Nov 28 09:53:23 np0005538515.localdomain sudo[294113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:23 np0005538515.localdomain sudo[294113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294113]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain sudo[294131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain sudo[294131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:53:23 np0005538515.localdomain sudo[294149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain sudo[294167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:53:23 np0005538515.localdomain sudo[294167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294167]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:23 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538514.djozup started
Nov 28 09:53:23 np0005538515.localdomain sudo[294185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538515.localdomain sudo[294185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294185]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain sudo[294203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:23 np0005538515.localdomain sudo[294203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294203]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain sudo[294221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538515.localdomain sudo[294221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294221]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538515.localdomain sudo[294255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294255]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:53:23 np0005538515.localdomain sudo[294273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294273]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain sudo[294291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294291]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:23 np0005538515.localdomain sudo[294309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:23 np0005538515.localdomain sudo[294309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:23 np0005538515.localdomain sudo[294309]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain sudo[294327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:24 np0005538515.localdomain sudo[294327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain sudo[294327]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:24 np0005538515.localdomain sudo[294345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538515.localdomain sudo[294345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain sudo[294345]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain sudo[294363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:24 np0005538515.localdomain sudo[294363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain sudo[294363]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain sudo[294381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538515.localdomain sudo[294381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain sudo[294381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain sudo[294415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538515.localdomain sudo[294415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: mgrmap e29: np0005538515.yfkzhl(active, since 5s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:24 np0005538515.localdomain sudo[294415]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain sudo[294433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:53:24 np0005538515.localdomain sudo[294433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain sudo[294433]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.527546) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604527671, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 973, "num_deletes": 276, "total_data_size": 5014026, "memory_usage": 5158144, "flush_reason": "Manual Compaction"}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604550573, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3201627, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15869, "largest_seqno": 16837, "table_properties": {"data_size": 3196696, "index_size": 2334, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11990, "raw_average_key_size": 20, "raw_value_size": 3186115, "raw_average_value_size": 5372, "num_data_blocks": 96, "num_entries": 593, "num_filter_entries": 593, "num_deletions": 275, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323593, "oldest_key_time": 1764323593, "file_creation_time": 1764323604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 23086 microseconds, and 8622 cpu microseconds.
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550654) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3201627 bytes OK
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550691) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552173) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552200) EVENT_LOG_v1 {"time_micros": 1764323604552192, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552235) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5008595, prev total WAL file size 5008595, number of live WAL files 2.
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.553605) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303231' seq:72057594037927935, type:22 .. '6B760031323936' seq:0, type:0; will stop at (end)
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(3126KB)], [21(14MB)]
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604553662, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 18016293, "oldest_snapshot_seqno": -1}
Nov 28 09:53:24 np0005538515.localdomain sudo[294451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:24 np0005538515.localdomain sudo[294451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:24 np0005538515.localdomain sudo[294451]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10142 keys, 16854249 bytes, temperature: kUnknown
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604692988, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 16854249, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16796604, "index_size": 31176, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25413, "raw_key_size": 271822, "raw_average_key_size": 26, "raw_value_size": 16623397, "raw_average_value_size": 1639, "num_data_blocks": 1176, "num_entries": 10142, "num_filter_entries": 10142, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.693624) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 16854249 bytes
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.695245) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.0 rd, 120.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 14.1 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(10.9) write-amplify(5.3) OK, records in: 10725, records dropped: 583 output_compression: NoCompression
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.695279) EVENT_LOG_v1 {"time_micros": 1764323604695264, "job": 10, "event": "compaction_finished", "compaction_time_micros": 139707, "compaction_time_cpu_micros": 47360, "output_level": 6, "num_output_files": 1, "total_output_size": 16854249, "num_input_records": 10725, "num_output_records": 10142, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604695871, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604698469, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.553494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.698558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.698566) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.698569) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.698572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:24.698574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:24 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 3d74b371-4d29-468f-84f1-3b2a0094d080 (Updating node-proxy deployment (+5 -> 5))
Nov 28 09:53:24 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 3d74b371-4d29-468f-84f1-3b2a0094d080 (Updating node-proxy deployment (+5 -> 5))
Nov 28 09:53:24 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 3d74b371-4d29-468f-84f1-3b2a0094d080 (Updating node-proxy deployment (+5 -> 5)) in 0 seconds
Nov 28 09:53:25 np0005538515.localdomain sudo[294469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:25 np0005538515.localdomain sudo[294469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:25 np0005538515.localdomain sudo[294469]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:25 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:53:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:53:25 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:53:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:26 np0005538515.localdomain sudo[294487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:27 np0005538515.localdomain sudo[294487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:27 np0005538515.localdomain sudo[294487]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:27 np0005538515.localdomain sudo[294505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:27 np0005538515.localdomain sudo[294505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:53:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 2025-11-28 09:53:27.501055592 +0000 UTC m=+0.076046450 container create 3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_gates, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=)
Nov 28 09:53:27 np0005538515.localdomain systemd[1]: Started libpod-conmon-3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f.scope.
Nov 28 09:53:27 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 2025-11-28 09:53:27.469986196 +0000 UTC m=+0.044977114 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 2025-11-28 09:53:27.570090556 +0000 UTC m=+0.145081444 container init 3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_gates, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph)
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 2025-11-28 09:53:27.583359924 +0000 UTC m=+0.158350802 container start 3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_gates, release=553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, ceph=True, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc.)
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 2025-11-28 09:53:27.583667173 +0000 UTC m=+0.158658051 container attach 3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_gates, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, version=7, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553)
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:27 np0005538515.localdomain condescending_gates[294555]: 167 167
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:53:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:53:27 np0005538515.localdomain systemd[1]: libpod-3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f.scope: Deactivated successfully.
Nov 28 09:53:27 np0005538515.localdomain podman[294539]: 2025-11-28 09:53:27.593562988 +0000 UTC m=+0.168553916 container died 3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_gates, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:53:27 np0005538515.localdomain podman[294560]: 2025-11-28 09:53:27.688124508 +0000 UTC m=+0.089318599 container remove 3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_gates, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=553, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, architecture=x86_64)
Nov 28 09:53:27 np0005538515.localdomain systemd[1]: libpod-conmon-3154c5605efc9d661938a904f14de1f2d5ef44c3f4363417b521ecc15b0a569f.scope: Deactivated successfully.
Nov 28 09:53:27 np0005538515.localdomain sudo[294505]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:27 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:53:28 np0005538515.localdomain sudo[294585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:28 np0005538515.localdomain sudo[294585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:28 np0005538515.localdomain sudo[294585]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:28 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:53:28 np0005538515.localdomain sudo[294603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:28 np0005538515.localdomain sudo[294603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:28.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2d96f11932743d2290c86ea0c88a5761137714a02bc99806d2167da30e3e694d-merged.mount: Deactivated successfully.
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 2025-11-28 09:53:28.632730679 +0000 UTC m=+0.078635050 container create 5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_mclaren, release=553, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:53:28 np0005538515.localdomain systemd[1]: Started libpod-conmon-5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835.scope.
Nov 28 09:53:28 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 2025-11-28 09:53:28.603819259 +0000 UTC m=+0.049723710 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 2025-11-28 09:53:28.706481698 +0000 UTC m=+0.152386079 container init 5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_mclaren, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, release=553, name=rhceph)
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 2025-11-28 09:53:28.721102847 +0000 UTC m=+0.167007218 container start 5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_mclaren, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 2025-11-28 09:53:28.721334514 +0000 UTC m=+0.167238905 container attach 5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_mclaren, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55)
Nov 28 09:53:28 np0005538515.localdomain gracious_mclaren[294653]: 167 167
Nov 28 09:53:28 np0005538515.localdomain systemd[1]: libpod-5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835.scope: Deactivated successfully.
Nov 28 09:53:28 np0005538515.localdomain podman[294637]: 2025-11-28 09:53:28.726624567 +0000 UTC m=+0.172528978 container died 5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_mclaren, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64)
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:28 np0005538515.localdomain podman[294658]: 2025-11-28 09:53:28.825239641 +0000 UTC m=+0.089418852 container remove 5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_mclaren, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, GIT_CLEAN=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:53:28 np0005538515.localdomain systemd[1]: libpod-conmon-5d3fc0e73786ac04e99ae2a2cf52f3cdb1ae69807e577cfe5064444338fc8835.scope: Deactivated successfully.
Nov 28 09:53:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:53:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:53:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:53:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:53:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:53:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19182 "" "Go-http-client/1.1"
Nov 28 09:53:29 np0005538515.localdomain sudo[294603]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:29 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:29 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:29 np0005538515.localdomain sudo[294681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:29 np0005538515.localdomain sudo[294681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:29 np0005538515.localdomain sudo[294681]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:29.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:29.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:29.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:53:29 np0005538515.localdomain sudo[294699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:29 np0005538515.localdomain sudo[294699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-4d73b939258e9b68251a3951bc8e1bc9aab1a0e720e35b723a768032b8802c7a-merged.mount: Deactivated successfully.
Nov 28 09:53:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.26686 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 2025-11-28 09:53:29.729098416 +0000 UTC m=+0.083179821 container create 3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_rubin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True)
Nov 28 09:53:29 np0005538515.localdomain systemd[1]: Started libpod-conmon-3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92.scope.
Nov 28 09:53:29 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 2025-11-28 09:53:29.695931363 +0000 UTC m=+0.050012768 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 2025-11-28 09:53:29.795683529 +0000 UTC m=+0.149764934 container init 3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_rubin, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main)
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 2025-11-28 09:53:29.806979074 +0000 UTC m=+0.161060479 container start 3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_rubin, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 2025-11-28 09:53:29.807272563 +0000 UTC m=+0.161353968 container attach 3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_rubin, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553)
Nov 28 09:53:29 np0005538515.localdomain goofy_rubin[294748]: 167 167
Nov 28 09:53:29 np0005538515.localdomain systemd[1]: libpod-3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92.scope: Deactivated successfully.
Nov 28 09:53:29 np0005538515.localdomain podman[294733]: 2025-11-28 09:53:29.811794471 +0000 UTC m=+0.165875896 container died 3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_rubin, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55)
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:29 np0005538515.localdomain podman[294753]: 2025-11-28 09:53:29.913926929 +0000 UTC m=+0.085743269 container remove 3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_rubin, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:29 np0005538515.localdomain systemd[1]: libpod-conmon-3aa78de2ffdb57bff6acf9c771439917cfea82968b421f9d6d450effe4efcb92.scope: Deactivated successfully.
Nov 28 09:53:29 np0005538515.localdomain sudo[294699]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev f82a4739-45e5-48d5-a015-d6c0e4fb423d (Updating node-proxy deployment (+5 -> 5))
Nov 28 09:53:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev f82a4739-45e5-48d5-a015-d6c0e4fb423d (Updating node-proxy deployment (+5 -> 5))
Nov 28 09:53:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event f82a4739-45e5-48d5-a015-d6c0e4fb423d (Updating node-proxy deployment (+5 -> 5)) in 0 seconds
Nov 28 09:53:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:30.235 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-19ee31ce6adbd2aa96b1c98f5a2ea2693bb00d872a460e3313d25c2af0f56d2b-merged.mount: Deactivated successfully.
Nov 28 09:53:30 np0005538515.localdomain sudo[294770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:30 np0005538515.localdomain sudo[294770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:30 np0005538515.localdomain sudo[294770]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='client.26686 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:31.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.44214 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538511", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: from='client.44214 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538511", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.264 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.265 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.287 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.287 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.288 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.288 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.289 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/479351308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.703 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.26706 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538511"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Remove daemons mon.np0005538511
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005538511
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005538511: new quorum should be ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'])
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005538511: new quorum should be ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'])
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005538511 from monmap...
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removing monitor np0005538511 from monmap...
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005538511 from np0005538511.localdomain -- ports []
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005538511 from np0005538511.localdomain -- ports []
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: client.34353 ms_handle_reset on v2:172.18.0.107:3300/0
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@2(peon) e10  my rank is now 1 (was 2)
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: client.34382 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:53:33 np0005538515.localdomain ceph-mgr[286188]: client.34353 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.908 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.910 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11995MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.910 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:53:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:33.911 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: paxos.1).electionLogic(38) init, last seen epoch 38
Nov 28 09:53:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:34.003 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:53:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:34.003 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:53:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:34.041 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:53:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:53:34 np0005538515.localdomain podman[294821]: 2025-11-28 09:53:34.992184122 +0000 UTC m=+0.091516645 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 09:53:35 np0005538515.localdomain podman[294821]: 2025-11-28 09:53:35.031632936 +0000 UTC m=+0.130965499 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 09:53:35 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/479351308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: Remove daemons mon.np0005538511
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: Safe to remove mon.np0005538511: new quorum should be ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'])
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: Removing monitor np0005538511 from monmap...
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: Removing daemon mon.np0005538511 from np0005538511.localdomain -- ports []
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3)
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: monmap epoch 10
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:53:33.884066+0000
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005538514
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: mgrmap e29: np0005538515.yfkzhl(active, since 17s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:53:35 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:53:36 np0005538515.localdomain sudo[294840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:53:36 np0005538515.localdomain sudo[294840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294840]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:36 np0005538515.localdomain sudo[294858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:53:36 np0005538515.localdomain sudo[294858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[294876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538515.localdomain sudo[294876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294876]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[294894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:36 np0005538515.localdomain sudo[294894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294894]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[294912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538515.localdomain sudo[294912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[294946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538515.localdomain sudo[294946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[294964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:53:36 np0005538515.localdomain sudo[294964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain sudo[294982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain sudo[294982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[294982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain sudo[295000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:36 np0005538515.localdomain sudo[295000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain sudo[295000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:36 np0005538515.localdomain sudo[295018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:53:36 np0005538515.localdomain sudo[295018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[295018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[295036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:36 np0005538515.localdomain sudo[295036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[295036]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:36 np0005538515.localdomain sudo[295054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:36 np0005538515.localdomain sudo[295054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:36 np0005538515.localdomain sudo[295054]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/2870239069' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/2640763287' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:37 np0005538515.localdomain sudo[295072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:37 np0005538515.localdomain sudo[295072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538515.localdomain sudo[295072]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538515.localdomain sudo[295106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:37 np0005538515.localdomain sudo[295106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538515.localdomain sudo[295106]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538515.localdomain sudo[295124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.26988 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:37 np0005538515.localdomain sudo[295124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538515.localdomain sudo[295124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Removed label mon from host np0005538511.localdomain
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removed label mon from host np0005538511.localdomain
Nov 28 09:53:37 np0005538515.localdomain sudo[295151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:37 np0005538515.localdomain sudo[295151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538515.localdomain sudo[295151]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 26e86fa6-16ca-4f04-a584-7b02c5af7818 (Updating node-proxy deployment (+5 -> 5))
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 26e86fa6-16ca-4f04-a584-7b02c5af7818 (Updating node-proxy deployment (+5 -> 5))
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 26e86fa6-16ca-4f04-a584-7b02c5af7818 (Updating node-proxy deployment (+5 -> 5)) in 0 seconds
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:53:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3363422915' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:37.523 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:53:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:37.530 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:53:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:37.548 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:53:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:37.550 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:53:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:37.550 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:53:37 np0005538515.localdomain sudo[295171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:37 np0005538515.localdomain sudo[295171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:37 np0005538515.localdomain sudo[295171]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:53:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:53:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:38 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='client.26988 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/953514327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: Removed label mon from host np0005538511.localdomain
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/3573920266' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/3363422915' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:38.547 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:38.548 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:53:38.548 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:53:38 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:53:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:53:38 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:53:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:39 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:39 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:40 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:40 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:53:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:53:40 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:53:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.27004 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Removed label mgr from host np0005538511.localdomain
Nov 28 09:53:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005538511.localdomain
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='client.27004 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Removed label mgr from host np0005538511.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.44263 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Removed label _admin from host np0005538511.localdomain
Nov 28 09:53:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005538511.localdomain
Nov 28 09:53:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 09:53:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 09:53:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:53:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: from='client.44263 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538511.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: Removed label _admin from host np0005538511.localdomain
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:43 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:53:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:44 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 09:53:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 09:53:44 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:53:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.527906) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624527954, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1079, "num_deletes": 257, "total_data_size": 1714968, "memory_usage": 1736800, "flush_reason": "Manual Compaction"}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624540015, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 995848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16842, "largest_seqno": 17916, "table_properties": {"data_size": 990602, "index_size": 2589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13536, "raw_average_key_size": 21, "raw_value_size": 979270, "raw_average_value_size": 1561, "num_data_blocks": 108, "num_entries": 627, "num_filter_entries": 627, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323604, "oldest_key_time": 1764323604, "file_creation_time": 1764323624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 12218 microseconds, and 4806 cpu microseconds.
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.540123) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 995848 bytes OK
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.540148) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.541880) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.541904) EVENT_LOG_v1 {"time_micros": 1764323624541897, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.541926) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1709186, prev total WAL file size 1709510, number of live WAL files 2.
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.542595) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353138' seq:72057594037927935, type:22 .. '6C6F676D0033373731' seq:0, type:0; will stop at (end)
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(972KB)], [24(16MB)]
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624542643, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 17850097, "oldest_snapshot_seqno": -1}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10222 keys, 17704597 bytes, temperature: kUnknown
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624700382, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17704597, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17644868, "index_size": 33068, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 275203, "raw_average_key_size": 26, "raw_value_size": 17468689, "raw_average_value_size": 1708, "num_data_blocks": 1256, "num_entries": 10222, "num_filter_entries": 10222, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.700719) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17704597 bytes
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.702978) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 113.1 rd, 112.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.1 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(35.7) write-amplify(17.8) OK, records in: 10769, records dropped: 547 output_compression: NoCompression
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.703012) EVENT_LOG_v1 {"time_micros": 1764323624702998, "job": 12, "event": "compaction_finished", "compaction_time_micros": 157865, "compaction_time_cpu_micros": 43908, "output_level": 6, "num_output_files": 1, "total_output_size": 17704597, "num_input_records": 10769, "num_output_records": 10222, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624703430, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624705936, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.542498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.706030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.706037) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.706042) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.706045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:53:44.706050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:53:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:53:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:53:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:53:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:53:44 np0005538515.localdomain podman[295197]: 2025-11-28 09:53:44.986837599 +0000 UTC m=+0.073967340 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:53:44 np0005538515.localdomain podman[295197]: 2025-11-28 09:53:44.991566833 +0000 UTC m=+0.078696584 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:53:45 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:53:45 np0005538515.localdomain podman[295189]: 2025-11-28 09:53:45.030932234 +0000 UTC m=+0.128893076 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 09:53:45 np0005538515.localdomain podman[295190]: 2025-11-28 09:53:45.046567762 +0000 UTC m=+0.139588693 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:53:45 np0005538515.localdomain podman[295191]: 2025-11-28 09:53:45.089193143 +0000 UTC m=+0.180099979 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:53:45 np0005538515.localdomain podman[295189]: 2025-11-28 09:53:45.112651509 +0000 UTC m=+0.210612391 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 09:53:45 np0005538515.localdomain podman[295191]: 2025-11-28 09:53:45.123602334 +0000 UTC m=+0.214509220 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:53:45 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:53:45 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:53:45 np0005538515.localdomain podman[295190]: 2025-11-28 09:53:45.175119697 +0000 UTC m=+0.268140598 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:53:45 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:53:45 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:53:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:53:45 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:53:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:53:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:46 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:53:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:53:46 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:53:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:47 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:53:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:53:47 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:53:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:53:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:48 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:53:48 np0005538515.localdomain podman[295275]: 2025-11-28 09:53:48.978663912 +0000 UTC m=+0.083250622 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:53:48 np0005538515.localdomain podman[295275]: 2025-11-28 09:53:48.988246065 +0000 UTC m=+0.092832825 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:53:49 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:53:49 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 28 09:53:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 28 09:53:49 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:53:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:53:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:50 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 28 09:53:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 28 09:53:50 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:53:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:53:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:53:50.836 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:53:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:53:50.836 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:53:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:53:50.836 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:53:51 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:53:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:53:51 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:53:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:53:52 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:53:52 np0005538515.localdomain podman[295297]: 2025-11-28 09:53:52.977209932 +0000 UTC m=+0.086146211 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:53:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:53:53 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:53:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:53:53 np0005538515.localdomain podman[295297]: 2025-11-28 09:53:53.012812909 +0000 UTC m=+0.121749168 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:53:53 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:53:53 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:53:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:53:53 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:53:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:53:53 np0005538515.localdomain sudo[295316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:53 np0005538515.localdomain sudo[295316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:53 np0005538515.localdomain sudo[295316]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538515.localdomain sudo[295334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:53:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:53 np0005538515.localdomain sudo[295334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 2025-11-28 09:53:54.432245915 +0000 UTC m=+0.075719163 container create bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mclean, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:53:54 np0005538515.localdomain systemd[1]: Started libpod-conmon-bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977.scope.
Nov 28 09:53:54 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 2025-11-28 09:53:54.400847247 +0000 UTC m=+0.044320505 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 2025-11-28 09:53:54.506875464 +0000 UTC m=+0.150348712 container init bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mclean, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:53:54 np0005538515.localdomain systemd[1]: tmp-crun.cjrOXk.mount: Deactivated successfully.
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 2025-11-28 09:53:54.518743386 +0000 UTC m=+0.162216644 container start bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mclean, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, name=rhceph, description=Red Hat Ceph Storage 7, ceph=True, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55)
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 2025-11-28 09:53:54.519057086 +0000 UTC m=+0.162530374 container attach bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mclean, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7)
Nov 28 09:53:54 np0005538515.localdomain clever_mclean[295383]: 167 167
Nov 28 09:53:54 np0005538515.localdomain systemd[1]: libpod-bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977.scope: Deactivated successfully.
Nov 28 09:53:54 np0005538515.localdomain podman[295368]: 2025-11-28 09:53:54.523921304 +0000 UTC m=+0.167394582 container died bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mclean, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git)
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.34406 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538511.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Added label _no_schedule to host np0005538511.localdomain
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005538511.localdomain
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538511.localdomain
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538511.localdomain
Nov 28 09:53:54 np0005538515.localdomain podman[295388]: 2025-11-28 09:53:54.62010281 +0000 UTC m=+0.081555330 container remove bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mclean, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main)
Nov 28 09:53:54 np0005538515.localdomain systemd[1]: libpod-conmon-bbc0f5fc8e016680b340a7fb7f767057686d2f53a4bbd11aa24d3013d050d977.scope: Deactivated successfully.
Nov 28 09:53:54 np0005538515.localdomain sudo[295334]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:54 np0005538515.localdomain sudo[295405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:54 np0005538515.localdomain sudo[295405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:54 np0005538515.localdomain sudo[295405]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:54 np0005538515.localdomain sudo[295423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:54 np0005538515.localdomain sudo[295423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 2025-11-28 09:53:55.319473923 +0000 UTC m=+0.078980722 container create 299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_margulis, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True)
Nov 28 09:53:55 np0005538515.localdomain systemd[1]: Started libpod-conmon-299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be.scope.
Nov 28 09:53:55 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 2025-11-28 09:53:55.287895629 +0000 UTC m=+0.047402458 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 2025-11-28 09:53:55.389890013 +0000 UTC m=+0.149396822 container init 299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_margulis, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 2025-11-28 09:53:55.399502896 +0000 UTC m=+0.159009695 container start 299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_margulis, RELEASE=main, vcs-type=git, build-date=2025-09-24T08:57:55, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, release=553, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=)
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 2025-11-28 09:53:55.399764155 +0000 UTC m=+0.159271004 container attach 299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_margulis, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, version=7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, ceph=True, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 28 09:53:55 np0005538515.localdomain vibrant_margulis[295473]: 167 167
Nov 28 09:53:55 np0005538515.localdomain systemd[1]: libpod-299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be.scope: Deactivated successfully.
Nov 28 09:53:55 np0005538515.localdomain podman[295457]: 2025-11-28 09:53:55.40322575 +0000 UTC m=+0.162732579 container died 299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_margulis, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, build-date=2025-09-24T08:57:55, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-845026d1e22dbb3000217c3565c8162f2b9c0611bf0287605636b758bb1c7f64-merged.mount: Deactivated successfully.
Nov 28 09:53:55 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-555eec11d45b1d65bc67a897d6d8d0cf128587c4e8c678984188d52936a1ca1e-merged.mount: Deactivated successfully.
Nov 28 09:53:55 np0005538515.localdomain podman[295478]: 2025-11-28 09:53:55.505158802 +0000 UTC m=+0.089588756 container remove 299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_margulis, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=)
Nov 28 09:53:55 np0005538515.localdomain systemd[1]: libpod-conmon-299525f745800ad238445a9b92a87bf9cbf5e6bdb9cebcdbe445a8c0cea443be.scope: Deactivated successfully.
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='client.34406 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538511.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: Added label _no_schedule to host np0005538511.localdomain
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538511.localdomain
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:53:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:55 np0005538515.localdomain sudo[295423]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:55 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 28 09:53:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 28 09:53:55 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:55 np0005538515.localdomain sudo[295501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:55 np0005538515.localdomain sudo[295501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:55 np0005538515.localdomain sudo[295501]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:55 np0005538515.localdomain sudo[295519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:55 np0005538515.localdomain sudo[295519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538511.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 2025-11-28 09:53:56.388890024 +0000 UTC m=+0.092973761 container create 82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_heisenberg, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, release=553, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:53:56 np0005538515.localdomain systemd[1]: Started libpod-conmon-82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff.scope.
Nov 28 09:53:56 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 2025-11-28 09:53:56.345300072 +0000 UTC m=+0.049383859 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 2025-11-28 09:53:56.452564507 +0000 UTC m=+0.156648254 container init 82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_heisenberg, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:56 np0005538515.localdomain systemd[1]: tmp-crun.bj4qVR.mount: Deactivated successfully.
Nov 28 09:53:56 np0005538515.localdomain affectionate_heisenberg[295569]: 167 167
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 2025-11-28 09:53:56.468989649 +0000 UTC m=+0.173073386 container start 82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_heisenberg, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, release=553, version=7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 2025-11-28 09:53:56.472217317 +0000 UTC m=+0.176301054 container attach 82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_heisenberg, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.buildah.version=1.33.12, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, io.openshift.tags=rhceph ceph, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=)
Nov 28 09:53:56 np0005538515.localdomain systemd[1]: libpod-82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff.scope: Deactivated successfully.
Nov 28 09:53:56 np0005538515.localdomain podman[295554]: 2025-11-28 09:53:56.476183099 +0000 UTC m=+0.180266846 container died 82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_heisenberg, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_CLEAN=True, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:53:56 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:53:56 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:53:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:53:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:56 np0005538515.localdomain podman[295575]: 2025-11-28 09:53:56.580108982 +0000 UTC m=+0.094175597 container remove 82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_heisenberg, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:53:56 np0005538515.localdomain systemd[1]: libpod-conmon-82c453d50cd775b9a7d1f84ad7f12079d1decccb836e85feed7879b444d5a8ff.scope: Deactivated successfully.
Nov 28 09:53:56 np0005538515.localdomain sudo[295519]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:56 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:53:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:53:56 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:53:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:53:56 np0005538515.localdomain sudo[295597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:56 np0005538515.localdomain sudo[295597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:56 np0005538515.localdomain sudo[295597]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:56 np0005538515.localdomain sudo[295615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:56 np0005538515.localdomain sudo[295615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.27016 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538511.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:57 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-52f8975212decd8be6dc0b515937d3e942ee1fb747319e206520e5abfa566202-merged.mount: Deactivated successfully.
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 2025-11-28 09:53:57.461127679 +0000 UTC m=+0.081458838 container create 6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_galois, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Removed host np0005538511.localdomain
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removed host np0005538511.localdomain
Nov 28 09:53:57 np0005538515.localdomain systemd[1]: Started libpod-conmon-6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33.scope.
Nov 28 09:53:57 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 2025-11-28 09:53:57.428615757 +0000 UTC m=+0.048946966 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 2025-11-28 09:53:57.535333595 +0000 UTC m=+0.155664764 container init 6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_galois, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, version=7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 2025-11-28 09:53:57.544137114 +0000 UTC m=+0.164468283 container start 6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_galois, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=553, architecture=x86_64, ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 2025-11-28 09:53:57.544354921 +0000 UTC m=+0.164686080 container attach 6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_galois, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, release=553, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main)
Nov 28 09:53:57 np0005538515.localdomain friendly_galois[295665]: 167 167
Nov 28 09:53:57 np0005538515.localdomain systemd[1]: libpod-6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33.scope: Deactivated successfully.
Nov 28 09:53:57 np0005538515.localdomain podman[295651]: 2025-11-28 09:53:57.547028022 +0000 UTC m=+0.167359211 container died 6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_galois, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12)
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='client.44275 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538511.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"} : dispatch
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"} : dispatch
Nov 28 09:53:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"}]': finished
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:53:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:53:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:53:57 np0005538515.localdomain podman[295670]: 2025-11-28 09:53:57.659190197 +0000 UTC m=+0.098458247 container remove 6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_galois, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Nov 28 09:53:57 np0005538515.localdomain systemd[1]: libpod-conmon-6da8e88072a59f66abdd6475679524c010da64c988328a48c6931baa52e0ab33.scope: Deactivated successfully.
Nov 28 09:53:57 np0005538515.localdomain sudo[295615]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:53:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:53:57 np0005538515.localdomain sudo[295686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:57 np0005538515.localdomain sudo[295686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:57 np0005538515.localdomain sudo[295686]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:57 np0005538515.localdomain sudo[295704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:57 np0005538515.localdomain sudo[295704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 2025-11-28 09:53:58.417960633 +0000 UTC m=+0.101037516 container create 80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_liskov, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=553, ceph=True, io.openshift.expose-services=, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:53:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-e3a1fe58ba7564743a15137bff4bfdb2920f385846a277197347c506c906651c-merged.mount: Deactivated successfully.
Nov 28 09:53:58 np0005538515.localdomain systemd[1]: Started libpod-conmon-80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81.scope.
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 2025-11-28 09:53:58.360198109 +0000 UTC m=+0.043274972 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:58 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 2025-11-28 09:53:58.47977851 +0000 UTC m=+0.162855433 container init 80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_liskov, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 2025-11-28 09:53:58.488573098 +0000 UTC m=+0.171649981 container start 80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_liskov, distribution-scope=public, version=7, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 2025-11-28 09:53:58.488784455 +0000 UTC m=+0.171861368 container attach 80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_liskov, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:53:58 np0005538515.localdomain determined_liskov[295754]: 167 167
Nov 28 09:53:58 np0005538515.localdomain systemd[1]: libpod-80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81.scope: Deactivated successfully.
Nov 28 09:53:58 np0005538515.localdomain podman[295739]: 2025-11-28 09:53:58.49158816 +0000 UTC m=+0.174665043 container died 80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_liskov, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55)
Nov 28 09:53:58 np0005538515.localdomain podman[295759]: 2025-11-28 09:53:58.583966881 +0000 UTC m=+0.082168840 container remove 80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_liskov, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, name=rhceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12)
Nov 28 09:53:58 np0005538515.localdomain systemd[1]: libpod-conmon-80095e06da07e006eeb4b26d1f0c428930efe3584acc7c2469bd7ca8d9451f81.scope: Deactivated successfully.
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='client.27016 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538511.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: Removed host np0005538511.localdomain
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:58 np0005538515.localdomain ceph-mon[287604]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:53:58 np0005538515.localdomain sudo[295704]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:58 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:58 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:58 np0005538515.localdomain sudo[295775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:53:58 np0005538515.localdomain sudo[295775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:58 np0005538515.localdomain sudo[295775]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:58 np0005538515.localdomain sudo[295793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:53:58 np0005538515.localdomain sudo[295793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:53:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:53:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:53:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:53:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:53:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19184 "" "Go-http-client/1.1"
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 2025-11-28 09:53:59.297812305 +0000 UTC m=+0.071459832 container create 878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_shaw, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:53:59 np0005538515.localdomain systemd[1]: Started libpod-conmon-878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f.scope.
Nov 28 09:53:59 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 2025-11-28 09:53:59.357785456 +0000 UTC m=+0.131433003 container init 878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_shaw, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, release=553, architecture=x86_64, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 2025-11-28 09:53:59.367546885 +0000 UTC m=+0.141194432 container start 878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_shaw, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, version=7, description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12)
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 2025-11-28 09:53:59.368011379 +0000 UTC m=+0.141658926 container attach 878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_shaw, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:53:59 np0005538515.localdomain sharp_shaw[295842]: 167 167
Nov 28 09:53:59 np0005538515.localdomain systemd[1]: libpod-878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f.scope: Deactivated successfully.
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 2025-11-28 09:53:59.272035969 +0000 UTC m=+0.045683586 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:53:59 np0005538515.localdomain podman[295827]: 2025-11-28 09:53:59.371777383 +0000 UTC m=+0.145424960 container died 878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_shaw, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, release=553, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:53:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cd1f7e3d49597e9f104c7e3900ac85ef7c088cbdae7a111ba7e0174b9b127e95-merged.mount: Deactivated successfully.
Nov 28 09:53:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-290bfb5861a484f53ee709498fbf8ec350d102dd052678ad92fc7c121f1c7740-merged.mount: Deactivated successfully.
Nov 28 09:53:59 np0005538515.localdomain podman[295847]: 2025-11-28 09:53:59.471594801 +0000 UTC m=+0.088005887 container remove 878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_shaw, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=)
Nov 28 09:53:59 np0005538515.localdomain systemd[1]: libpod-conmon-878bcbed3afc22c5199174dda88beb1dbdd0e9e14bb4bed886d2daa7ec33bd3f.scope: Deactivated successfully.
Nov 28 09:53:59 np0005538515.localdomain sudo[295793]: pam_unix(sudo:session): session closed for user root
Nov 28 09:53:59 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 24d3134e-2595-4e0c-aeac-80c1e796ab9d (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:53:59 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 24d3134e-2595-4e0c-aeac-80c1e796ab9d (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:53:59 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 24d3134e-2595-4e0c-aeac-80c1e796ab9d (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:53:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:53:59 np0005538515.localdomain sudo[295861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:53:59 np0005538515.localdomain sudo[295861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:53:59 np0005538515.localdomain sudo[295861]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:54:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:54:00 np0005538515.localdomain ceph-mon[287604]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:03 np0005538515.localdomain ceph-mon[287604]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:03 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:54:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:05 np0005538515.localdomain ceph-mon[287604]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:05 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:54:05 np0005538515.localdomain podman[295879]: 2025-11-28 09:54:05.988822418 +0000 UTC m=+0.091529246 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal)
Nov 28 09:54:06 np0005538515.localdomain podman[295879]: 2025-11-28 09:54:06.005328601 +0000 UTC m=+0.108035419 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:54:06 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.27028 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev cd2a798e-179c-47da-9819-8e78128c7f2d (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev cd2a798e-179c-47da-9819-8e78128c7f2d (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:54:06 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event cd2a798e-179c-47da-9819-8e78128c7f2d (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:54:06 np0005538515.localdomain sudo[295899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:54:06 np0005538515.localdomain sudo[295899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:06 np0005538515.localdomain sudo[295899]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: from='client.27028 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: Saving service mon spec with placement label:mon
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:07 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.27036 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:54:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.44295 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538514"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Remove daemons mon.np0005538514
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005538514
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005538514: new quorum should be ['np0005538512', 'np0005538515', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538513'])
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005538514: new quorum should be ['np0005538512', 'np0005538515', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538513'])
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005538514 from monmap...
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removing monitor np0005538514 from monmap...
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005538514 from np0005538514.localdomain -- ports []
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005538514 from np0005538514.localdomain -- ports []
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: client.34353 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: paxos.1).electionLogic(40) init, last seen epoch 40
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538512"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538512"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain sudo[295917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:54:09 np0005538515.localdomain sudo[295917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[295917]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[295935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:54:09 np0005538515.localdomain sudo[295935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[295935]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:54:09 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:54:09 np0005538515.localdomain sudo[295953]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538515.localdomain sudo[295953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[295953]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[295971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:09 np0005538515.localdomain sudo[295971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[295971]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[295989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538515.localdomain sudo[295989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[295989]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[296023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538515.localdomain sudo[296023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296023]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[296041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:54:09 np0005538515.localdomain sudo[296041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296041]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[296059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain sudo[296059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296059]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain sudo[296077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:54:09 np0005538515.localdomain sudo[296077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296077]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:09 np0005538515.localdomain sudo[296095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:54:09 np0005538515.localdomain sudo[296095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296095]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[296113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:09 np0005538515.localdomain sudo[296113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296113]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:09 np0005538515.localdomain sudo[296131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:09 np0005538515.localdomain sudo[296131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:09 np0005538515.localdomain sudo[296131]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538515.localdomain sudo[296149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:10 np0005538515.localdomain sudo[296149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538515.localdomain sudo[296149]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='client.44295 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538514"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Remove daemons mon.np0005538514
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Safe to remove mon.np0005538514: new quorum should be ['np0005538512', 'np0005538515', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538513'])
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Removing monitor np0005538514 from monmap...
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Removing daemon mon.np0005538514 from np0005538514.localdomain -- ports []
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538513 in quorum (ranks 0,1,2)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: monmap epoch 11
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:54:09.028617+0000
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mgrmap e29: np0005538515.yfkzhl(active, since 51s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:10 np0005538515.localdomain sudo[296183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:10 np0005538515.localdomain sudo[296183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538515.localdomain sudo[296183]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538515.localdomain sudo[296201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:54:10 np0005538515.localdomain sudo[296201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538515.localdomain sudo[296201]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538515.localdomain sudo[296219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:10 np0005538515.localdomain sudo[296219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:10 np0005538515.localdomain sudo[296219]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 455a3ce3-e7f5-4e5b-a411-77ecee04021a (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:54:10 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 455a3ce3-e7f5-4e5b-a411-77ecee04021a (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:54:10 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 455a3ce3-e7f5-4e5b-a411-77ecee04021a (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:54:10 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:11 np0005538515.localdomain sudo[296237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:54:11 np0005538515.localdomain sudo[296237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:11 np0005538515.localdomain sudo[296237]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:11 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:54:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:54:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:11 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:54:12 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:54:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:12 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:54:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:12 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:54:13 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:54:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:54:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/921452423' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/921452423' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/921452423' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:54:13 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/921452423' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:14 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 09:54:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:14 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:54:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:54:14 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.570506) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654570608, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1325, "num_deletes": 252, "total_data_size": 2258387, "memory_usage": 2295248, "flush_reason": "Manual Compaction"}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654580581, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1301650, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17921, "largest_seqno": 19241, "table_properties": {"data_size": 1295703, "index_size": 3097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15968, "raw_average_key_size": 22, "raw_value_size": 1282660, "raw_average_value_size": 1798, "num_data_blocks": 132, "num_entries": 713, "num_filter_entries": 713, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323624, "oldest_key_time": 1764323624, "file_creation_time": 1764323654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 10118 microseconds, and 3685 cpu microseconds.
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.580647) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1301650 bytes OK
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.580677) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.582746) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.582770) EVENT_LOG_v1 {"time_micros": 1764323654582764, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.582792) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2251512, prev total WAL file size 2251512, number of live WAL files 2.
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.583710) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1271KB)], [27(16MB)]
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654583755, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 19006247, "oldest_snapshot_seqno": -1}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 10399 keys, 15806710 bytes, temperature: kUnknown
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654688042, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15806710, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15747263, "index_size": 32338, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 280251, "raw_average_key_size": 26, "raw_value_size": 15569414, "raw_average_value_size": 1497, "num_data_blocks": 1224, "num_entries": 10399, "num_filter_entries": 10399, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.688391) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15806710 bytes
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.690194) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.0 rd, 151.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.9 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(26.7) write-amplify(12.1) OK, records in: 10935, records dropped: 536 output_compression: NoCompression
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.690225) EVENT_LOG_v1 {"time_micros": 1764323654690210, "job": 14, "event": "compaction_finished", "compaction_time_micros": 104414, "compaction_time_cpu_micros": 42673, "output_level": 6, "num_output_files": 1, "total_output_size": 15806710, "num_input_records": 10935, "num_output_records": 10399, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654690598, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654693234, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.583628) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.693340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.693348) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.693351) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.693353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:14 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:54:14.693356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:15 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 09:54:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:15 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:54:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:54:15 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:54:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:54:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:54:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:54:15 np0005538515.localdomain systemd[1]: tmp-crun.t6jZTz.mount: Deactivated successfully.
Nov 28 09:54:16 np0005538515.localdomain podman[296255]: 2025-11-28 09:54:16.001249416 +0000 UTC m=+0.106516113 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 28 09:54:16 np0005538515.localdomain podman[296255]: 2025-11-28 09:54:16.03740375 +0000 UTC m=+0.142670437 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 28 09:54:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:16 np0005538515.localdomain podman[296257]: 2025-11-28 09:54:16.050943263 +0000 UTC m=+0.147482123 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:54:16 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:54:16 np0005538515.localdomain podman[296257]: 2025-11-28 09:54:16.062518726 +0000 UTC m=+0.159057646 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 09:54:16 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:16 np0005538515.localdomain podman[296256]: 2025-11-28 09:54:16.13665603 +0000 UTC m=+0.239808883 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:54:16 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:54:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:16 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:54:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:54:16 np0005538515.localdomain podman[296258]: 2025-11-28 09:54:16.115776012 +0000 UTC m=+0.209092634 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:54:16 np0005538515.localdomain podman[296256]: 2025-11-28 09:54:16.179288731 +0000 UTC m=+0.282441584 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 09:54:16 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:54:16 np0005538515.localdomain podman[296258]: 2025-11-28 09:54:16.198598831 +0000 UTC m=+0.291915413 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:54:16 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:16 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:17 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:54:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_09:54:18
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_metadata', 'images', 'vms', '.mgr', 'backups', 'volumes', 'manila_data']
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.1810441094360693e-06 of space, bias 4.0, pg target 0.001741927228736274 quantized to 16 (current 16)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:54:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:54:18 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:19 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:19 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 28 09:54:19 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Nov 28 09:54:19 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:54:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 28 09:54:19 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:19 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:19 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:54:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:54:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:54:19 np0005538515.localdomain podman[296340]: 2025-11-28 09:54:19.98026432 +0000 UTC m=+0.081146469 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:54:19 np0005538515.localdomain podman[296340]: 2025-11-28 09:54:19.99535837 +0000 UTC m=+0.096240499 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:54:20 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:54:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:20 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:54:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:20 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:20 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:54:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:54:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:21 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:21 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:54:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.44312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538514.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='client.44312 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538514.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: Deploying daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:22 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:54:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:54:22 np0005538515.localdomain sudo[296363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:22 np0005538515.localdomain sudo[296363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:22 np0005538515.localdomain sudo[296363]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:22 np0005538515.localdomain sudo[296381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:22 np0005538515.localdomain sudo[296381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: tmp-crun.Nbu9Wm.mount: Deactivated successfully.
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 
Nov 28 09:54:23 np0005538515.localdomain podman[296415]: 2025-11-28 09:54:23.457207695 +0000 UTC m=+0.100031996 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 2025-11-28 09:54:23.474200633 +0000 UTC m=+0.095446425 container create f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_gagarin, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhceph, version=7, vendor=Red Hat, Inc., ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Nov 28 09:54:23 np0005538515.localdomain podman[296415]: 2025-11-28 09:54:23.496511484 +0000 UTC m=+0.139335665 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: Started libpod-conmon-f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943.scope.
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 2025-11-28 09:54:23.434678307 +0000 UTC m=+0.055924139 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 2025-11-28 09:54:23.544879911 +0000 UTC m=+0.166125683 container init f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_gagarin, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 2025-11-28 09:54:23.553536256 +0000 UTC m=+0.174782028 container start f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_gagarin, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, vendor=Red Hat, Inc., release=553, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 2025-11-28 09:54:23.553745012 +0000 UTC m=+0.174990814 container attach f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_gagarin, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:54:23 np0005538515.localdomain hungry_gagarin[296451]: 167 167
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: libpod-f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943.scope: Deactivated successfully.
Nov 28 09:54:23 np0005538515.localdomain podman[296423]: 2025-11-28 09:54:23.557452315 +0000 UTC m=+0.178698087 container died f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_gagarin, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, version=7, GIT_CLEAN=True, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:54:23 np0005538515.localdomain podman[296456]: 2025-11-28 09:54:23.649341851 +0000 UTC m=+0.077679613 container remove f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_gagarin, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:54:23 np0005538515.localdomain systemd[1]: libpod-conmon-f12aa6ba1fa8cd1c5125ef7d5bcf0f25712f23b87a3e9b2f4c7150de76e39943.scope: Deactivated successfully.
Nov 28 09:54:23 np0005538515.localdomain sudo[296381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Nov 28 09:54:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:23 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:54:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:23 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:23 np0005538515.localdomain sudo[296473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:23 np0005538515.localdomain sudo[296473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:23 np0005538515.localdomain sudo[296473]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:23 np0005538515.localdomain sudo[296491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:23 np0005538515.localdomain sudo[296491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 2025-11-28 09:54:24.38431638 +0000 UTC m=+0.081712916 container create 6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bartik, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, version=7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:54:24 np0005538515.localdomain systemd[1]: Started libpod-conmon-6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6.scope.
Nov 28 09:54:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-22e2b03ed48c04aec873e7d78813026a2a95a8b0b1fba7b042fd31c7dae6d406-merged.mount: Deactivated successfully.
Nov 28 09:54:24 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 2025-11-28 09:54:24.350267821 +0000 UTC m=+0.047664387 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 2025-11-28 09:54:24.464061545 +0000 UTC m=+0.161458091 container init 6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bartik, version=7, io.buildah.version=1.33.12, RELEASE=main, ceph=True, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, release=553, vcs-type=git)
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 2025-11-28 09:54:24.474542985 +0000 UTC m=+0.171939531 container start 6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bartik, ceph=True, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64)
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 2025-11-28 09:54:24.474905376 +0000 UTC m=+0.172301902 container attach 6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bartik, vcs-type=git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:54:24 np0005538515.localdomain affectionate_bartik[296541]: 167 167
Nov 28 09:54:24 np0005538515.localdomain systemd[1]: libpod-6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6.scope: Deactivated successfully.
Nov 28 09:54:24 np0005538515.localdomain podman[296526]: 2025-11-28 09:54:24.479608849 +0000 UTC m=+0.177005385 container died 6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bartik, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7)
Nov 28 09:54:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b41c8d286bc4294e55d036e66a43a0b04e5f58aa7bce2e7314b27c4c94baa87e-merged.mount: Deactivated successfully.
Nov 28 09:54:24 np0005538515.localdomain podman[296546]: 2025-11-28 09:54:24.576215039 +0000 UTC m=+0.087808362 container remove 6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_bartik, GIT_BRANCH=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.33.12, name=rhceph, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Nov 28 09:54:24 np0005538515.localdomain systemd[1]: libpod-conmon-6c7e296a84f0eb4a836cbfa0efeec4b137ab97a70e78ced26645e03f820167f6.scope: Deactivated successfully.
Nov 28 09:54:24 np0005538515.localdomain sudo[296491]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:24 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 28 09:54:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:24 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:54:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:54:24 np0005538515.localdomain sudo[296569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:24 np0005538515.localdomain sudo[296569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:24 np0005538515.localdomain sudo[296569]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:24 np0005538515.localdomain sudo[296587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:24 np0005538515.localdomain sudo[296587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:24 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 2025-11-28 09:54:25.380728762 +0000 UTC m=+0.077100546 container create d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_dijkstra, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public)
Nov 28 09:54:25 np0005538515.localdomain systemd[1]: Started libpod-conmon-d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b.scope.
Nov 28 09:54:25 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 2025-11-28 09:54:25.447517901 +0000 UTC m=+0.143889685 container init d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_dijkstra, io.buildah.version=1.33.12, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True)
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 2025-11-28 09:54:25.349484668 +0000 UTC m=+0.045856482 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 2025-11-28 09:54:25.459020532 +0000 UTC m=+0.155392346 container start d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_dijkstra, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, name=rhceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:54:25 np0005538515.localdomain musing_dijkstra[296637]: 167 167
Nov 28 09:54:25 np0005538515.localdomain systemd[1]: libpod-d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b.scope: Deactivated successfully.
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 2025-11-28 09:54:25.459418474 +0000 UTC m=+0.155790298 container attach d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_dijkstra, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, ceph=True, RELEASE=main, version=7, vcs-type=git)
Nov 28 09:54:25 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:25 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:25 np0005538515.localdomain podman[296622]: 2025-11-28 09:54:25.468671616 +0000 UTC m=+0.165043440 container died d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_dijkstra, release=553, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.33.12, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container)
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7f64d01fc17ec90b253bb7d538d09ea35e21e9c87aa8704788c622ea90f303ce-merged.mount: Deactivated successfully.
Nov 28 09:54:25 np0005538515.localdomain podman[296642]: 2025-11-28 09:54:25.568914727 +0000 UTC m=+0.088106471 container remove d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_dijkstra, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:54:25 np0005538515.localdomain systemd[1]: libpod-conmon-d7779ce290bade0419fb5a1edcec7662eebb0fdca25be09ccb42d525f5d3a37b.scope: Deactivated successfully.
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:25 np0005538515.localdomain sudo[296587]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:25 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:54:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:25 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:25 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:54:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:54:25 np0005538515.localdomain sudo[296665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:25 np0005538515.localdomain sudo[296665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:25 np0005538515.localdomain sudo[296665]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:25 np0005538515.localdomain sudo[296683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:25 np0005538515.localdomain sudo[296683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 2025-11-28 09:54:26.416567257 +0000 UTC m=+0.075599389 container create 8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_austin, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main)
Nov 28 09:54:26 np0005538515.localdomain systemd[1]: Started libpod-conmon-8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a.scope.
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:26 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 2025-11-28 09:54:26.480599171 +0000 UTC m=+0.139631313 container init 8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_austin, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc., release=553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 2025-11-28 09:54:26.385425395 +0000 UTC m=+0.044457557 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 2025-11-28 09:54:26.490632958 +0000 UTC m=+0.149665100 container start 8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_austin, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main)
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 2025-11-28 09:54:26.490823684 +0000 UTC m=+0.149855826 container attach 8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_austin, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:54:26 np0005538515.localdomain pedantic_austin[296734]: 167 167
Nov 28 09:54:26 np0005538515.localdomain systemd[1]: libpod-8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a.scope: Deactivated successfully.
Nov 28 09:54:26 np0005538515.localdomain podman[296719]: 2025-11-28 09:54:26.492553897 +0000 UTC m=+0.151586059 container died 8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_austin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, GIT_CLEAN=True, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:26 np0005538515.localdomain podman[296739]: 2025-11-28 09:54:26.59093141 +0000 UTC m=+0.090180164 container remove 8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_austin, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph)
Nov 28 09:54:26 np0005538515.localdomain systemd[1]: libpod-conmon-8683940fd5b947aa3511d1c5d65e9e61773580381c0ec7c84b3ca6e5e9e4115a.scope: Deactivated successfully.
Nov 28 09:54:26 np0005538515.localdomain sudo[296683]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:26 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:54:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:54:26 np0005538515.localdomain sudo[296756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:26 np0005538515.localdomain sudo[296756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:26 np0005538515.localdomain sudo[296756]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:26 np0005538515.localdomain sudo[296774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:26 np0005538515.localdomain sudo[296774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 2025-11-28 09:54:27.288872799 +0000 UTC m=+0.077091125 container create 59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_germain, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:54:27 np0005538515.localdomain systemd[1]: Started libpod-conmon-59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0.scope.
Nov 28 09:54:27 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 2025-11-28 09:54:27.349372725 +0000 UTC m=+0.137591051 container init 59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_germain, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=553, name=rhceph, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph)
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 2025-11-28 09:54:27.257890453 +0000 UTC m=+0.046108829 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 2025-11-28 09:54:27.358728712 +0000 UTC m=+0.146947028 container start 59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_germain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, release=553, version=7, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=)
Nov 28 09:54:27 np0005538515.localdomain nervous_germain[296823]: 167 167
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 2025-11-28 09:54:27.358969109 +0000 UTC m=+0.147187425 container attach 59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_germain, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=553, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:54:27 np0005538515.localdomain systemd[1]: libpod-59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0.scope: Deactivated successfully.
Nov 28 09:54:27 np0005538515.localdomain podman[296808]: 2025-11-28 09:54:27.360236477 +0000 UTC m=+0.148454863 container died 59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_germain, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, build-date=2025-09-24T08:57:55, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, version=7)
Nov 28 09:54:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-556a39532d0aa6de0e8a898bd2980e40c73158ebf1ec18de63bc439213223fc9-merged.mount: Deactivated successfully.
Nov 28 09:54:27 np0005538515.localdomain podman[296828]: 2025-11-28 09:54:27.450654989 +0000 UTC m=+0.081898213 container remove 59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_germain, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, build-date=2025-09-24T08:57:55, vcs-type=git, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Nov 28 09:54:27 np0005538515.localdomain systemd[1]: libpod-conmon-59b563431c1c8a424ad9d3cdabcd3143e86abd92cef35eb883af1e0a3a1d7da0.scope: Deactivated successfully.
Nov 28 09:54:27 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:27 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:27 np0005538515.localdomain sudo[296774]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:54:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:27 np0005538515.localdomain sudo[296844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:54:27 np0005538515.localdomain sudo[296844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:27 np0005538515.localdomain sudo[296844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:27 np0005538515.localdomain sudo[296862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:54:27 np0005538515.localdomain sudo[296862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:28 np0005538515.localdomain sudo[296862]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:28 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:28 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:28 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:28 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:54:28 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:54:28 np0005538515.localdomain ceph-mon[287604]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:28 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:54:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:54:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:54:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:54:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:54:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19176 "" "Go-http-client/1.1"
Nov 28 09:54:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:29.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:29.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:54:29 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:29 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:29 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:29 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:29 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:29 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:29 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:29 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:54:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 39e4d999-07d1-4255-93ab-c5fe5c86b525 (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:54:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 39e4d999-07d1-4255-93ab-c5fe5c86b525 (Updating node-proxy deployment (+4 -> 4))
Nov 28 09:54:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 39e4d999-07d1-4255-93ab-c5fe5c86b525 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:30.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:30.241 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:30 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:30 np0005538515.localdomain sudo[296912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:54:30 np0005538515.localdomain sudo[296912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:54:30 np0005538515.localdomain sudo[296912]: pam_unix(sudo:session): session closed for user root
Nov 28 09:54:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:31 np0005538515.localdomain ceph-mon[287604]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:31 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:31 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:31 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:31 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:31 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:32 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:32 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:32 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:32 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:33.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:33.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:54:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:33.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:54:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:33.365 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:54:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:33.366 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:33 np0005538515.localdomain ceph-mon[287604]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:33 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:33 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:33 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:33 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.272 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.273 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.273 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.273 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.274 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:54:34 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:34 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:34 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:54:34 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2905538660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.704 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.909 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.911 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11961MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.912 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:54:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:34.912 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.010 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.011 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.026 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/2905538660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4208687966' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:35 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:35 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.483 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.491 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.511 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.514 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:54:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:35.514 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:54:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/738597505' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/4208687966' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/738597505' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/11160684' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:54:36 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:36 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:54:36 np0005538515.localdomain systemd[1]: tmp-crun.tIhUv7.mount: Deactivated successfully.
Nov 28 09:54:36 np0005538515.localdomain podman[296974]: 2025-11-28 09:54:36.992703866 +0000 UTC m=+0.099909822 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git)
Nov 28 09:54:37 np0005538515.localdomain podman[296974]: 2025-11-28 09:54:37.008405804 +0000 UTC m=+0.115611770 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 28 09:54:37 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:54:37 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (2) No such file or directory
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/1285419659' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/4278207600' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Nov 28 09:54:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:37.510 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:37.510 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538512"} v 0)
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:37 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (22) Invalid argument
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: paxos.1).electionLogic(42) init, last seen epoch 42
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:54:38.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:54:38 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:38 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (22) Invalid argument
Nov 28 09:54:39 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:39 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (22) Invalid argument
Nov 28 09:54:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:40 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:40 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:40 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (22) Invalid argument
Nov 28 09:54:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:41 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:41 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:41 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:41 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:41 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (22) Invalid argument
Nov 28 09:54:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_auth_request failed to assign global_id
Nov 28 09:54:42 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mgr[286188]: mgr finish mon failed to return metadata for mon.np0005538514: (22) Invalid argument
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 calling monitor election
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: monmap epoch 12
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:54:37.617923+0000
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538512
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: osdmap e88: 6 total, 6 up, 6 in
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mgrmap e29: np0005538515.yfkzhl(active, since 84s), standbys: np0005538511.fvuybw, np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:54:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.44342 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:42 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Reconfig service osd.default_drive_group
Nov 28 09:54:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mon.np0005538514 172.18.0.107:0/2340047348; not ready for session (expect reconnect)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='client.44342 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: Reconfig service osd.default_drive_group
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:54:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 e89: 6 total, 6 up, 6 in
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr handle_mgr_map I was active but no longer am
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  e: '/usr/bin/ceph-mgr'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  0: '/usr/bin/ceph-mgr'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  1: '-n'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  2: 'mgr.np0005538515.yfkzhl'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  3: '-f'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  4: '--setuser'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  5: 'ceph'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  6: '--setgroup'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  7: 'ceph'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  8: '--default-log-to-file=false'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  9: '--default-log-to-journald=true'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  10: '--default-log-to-stderr=false'
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Nov 28 09:54:44 np0005538515.localdomain ceph-mgr[286188]: mgr respawn  exe_path /proc/self/exe
Nov 28 09:54:44 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:44.916+0000 7f328e5e9640 -1 mgr handle_mgr_map I was active but no longer am
Nov 28 09:54:44 np0005538515.localdomain sshd[293386]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:54:44 np0005538515.localdomain systemd[1]: session-66.scope: Deactivated successfully.
Nov 28 09:54:44 np0005538515.localdomain systemd[1]: session-66.scope: Consumed 21.108s CPU time.
Nov 28 09:54:44 np0005538515.localdomain systemd-logind[763]: Session 66 logged out. Waiting for processes to exit.
Nov 28 09:54:44 np0005538515.localdomain systemd-logind[763]: Removed session 66.
Nov 28 09:54:44 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: ignoring --setuser ceph since I am not root
Nov 28 09:54:44 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: ignoring --setgroup ceph since I am not root
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: pidfile_write: ignore empty --pid-file
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/3652519406' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/1122335753' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: Activating manager daemon np0005538511.fvuybw
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: osdmap e89: 6 total, 6 up, 6 in
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/1122335753' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: mgrmap e30: np0005538511.fvuybw(active, starting, since 0.0525983s), standbys: np0005538513.dsfdlx, np0005538514.djozup, np0005538512.zyhkxs
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'alerts'
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'balancer'
Nov 28 09:54:45 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:45.108+0000 7fcd196e9140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'cephadm'
Nov 28 09:54:45 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:45.178+0000 7fcd196e9140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 28 09:54:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'crash'
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:54:45 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'dashboard'
Nov 28 09:54:45 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:45.848+0000 7fcd196e9140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'devicehealth'
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'diskprediction_local'
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:46.430+0000 7fcd196e9140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]:   from numpy import show_config as show_numpy_config
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:46.559+0000 7fcd196e9140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'influx'
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'insights'
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:46.617+0000 7fcd196e9140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'iostat'
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'k8sevents'
Nov 28 09:54:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:46.733+0000 7fcd196e9140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 28 09:54:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:54:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:54:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:54:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:54:47 np0005538515.localdomain systemd[1]: tmp-crun.USb96H.mount: Deactivated successfully.
Nov 28 09:54:47 np0005538515.localdomain podman[297024]: 2025-11-28 09:54:47.015407077 +0000 UTC m=+0.076464596 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'localpool'
Nov 28 09:54:47 np0005538515.localdomain podman[297024]: 2025-11-28 09:54:47.052357395 +0000 UTC m=+0.113414904 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:54:47 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'mds_autoscaler'
Nov 28 09:54:47 np0005538515.localdomain podman[297023]: 2025-11-28 09:54:47.064221817 +0000 UTC m=+0.125519763 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 09:54:47 np0005538515.localdomain podman[297025]: 2025-11-28 09:54:47.117453332 +0000 UTC m=+0.211109336 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 09:54:47 np0005538515.localdomain podman[297027]: 2025-11-28 09:54:47.174775102 +0000 UTC m=+0.225878867 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:54:47 np0005538515.localdomain podman[297027]: 2025-11-28 09:54:47.185877251 +0000 UTC m=+0.236981056 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:54:47 np0005538515.localdomain podman[297023]: 2025-11-28 09:54:47.197884988 +0000 UTC m=+0.259182954 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:54:47 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'mirroring'
Nov 28 09:54:47 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:54:47 np0005538515.localdomain podman[297025]: 2025-11-28 09:54:47.253243918 +0000 UTC m=+0.346899952 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:54:47 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'nfs'
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'orchestrator'
Nov 28 09:54:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:47.458+0000 7fcd196e9140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'osd_perf_query'
Nov 28 09:54:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:47.603+0000 7fcd196e9140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'osd_support'
Nov 28 09:54:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:47.667+0000 7fcd196e9140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'pg_autoscaler'
Nov 28 09:54:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:47.723+0000 7fcd196e9140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'progress'
Nov 28 09:54:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:47.791+0000 7fcd196e9140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:54:47 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'prometheus'
Nov 28 09:54:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:47.848+0000 7fcd196e9140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'rbd_support'
Nov 28 09:54:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:48.145+0000 7fcd196e9140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'restful'
Nov 28 09:54:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:48.225+0000 7fcd196e9140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'rgw'
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'rook'
Nov 28 09:54:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:48.554+0000 7fcd196e9140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:54:48 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'selftest'
Nov 28 09:54:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:48.972+0000 7fcd196e9140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'snap_schedule'
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.032+0000 7fcd196e9140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'stats'
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'status'
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'telegraf'
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.220+0000 7fcd196e9140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'telemetry'
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.279+0000 7fcd196e9140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'test_orchestrator'
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.410+0000 7fcd196e9140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'volumes'
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.552+0000 7fcd196e9140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Loading python module 'zabbix'
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.738+0000 7fcd196e9140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:54:49.794+0000 7fcd196e9140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 09:54:49 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5575ffdcf600 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Nov 28 09:54:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:50 np0005538515.localdomain ceph-mon[287604]: Standby manager daemon np0005538515.yfkzhl started
Nov 28 09:54:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:54:50.837 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:54:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:54:50.837 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:54:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:54:50.837 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:54:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:54:50 np0005538515.localdomain podman[297106]: 2025-11-28 09:54:50.976822103 +0000 UTC m=+0.079072386 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:54:50 np0005538515.localdomain podman[297106]: 2025-11-28 09:54:50.984977822 +0000 UTC m=+0.087228095 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:54:50 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:54:51 np0005538515.localdomain ceph-mon[287604]: mgrmap e31: np0005538511.fvuybw(active, starting, since 5s), standbys: np0005538513.dsfdlx, np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:54:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:54:53 np0005538515.localdomain podman[297130]: 2025-11-28 09:54:53.971762151 +0000 UTC m=+0.079154028 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 09:54:53 np0005538515.localdomain podman[297130]: 2025-11-28 09:54:53.986430009 +0000 UTC m=+0.093821866 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:54:53 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 1002...
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Activating special unit Exit the Session...
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Removed slice User Background Tasks Slice.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped target Main User Target.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped target Basic System.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped target Paths.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped target Sockets.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped target Timers.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Closed D-Bus User Message Bus Socket.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Removed slice User Application Slice.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Reached target Shutdown.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Finished Exit the Session.
Nov 28 09:54:55 np0005538515.localdomain systemd[26783]: Reached target Exit the Session.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: user@1002.service: Deactivated successfully.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 1002.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: user@1002.service: Consumed 12.875s CPU time, read 0B from disk, written 7.0K to disk.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1002...
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: run-user-1002.mount: Deactivated successfully.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1002.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 1002.
Nov 28 09:54:55 np0005538515.localdomain systemd[1]: user-1002.slice: Consumed 4min 35.182s CPU time.
Nov 28 09:54:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:54:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:54:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:54:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:54:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:54:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:54:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:54:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:54:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19180 "" "Go-http-client/1.1"
Nov 28 09:55:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:55:07 np0005538515.localdomain podman[297151]: 2025-11-28 09:55:07.969653179 +0000 UTC m=+0.076292731 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Nov 28 09:55:08 np0005538515.localdomain podman[297151]: 2025-11-28 09:55:08.011690322 +0000 UTC m=+0.118329874 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 28 09:55:08 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:55:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:13 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:55:13 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:55:15 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:55:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:55:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:55:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:55:17 np0005538515.localdomain podman[297171]: 2025-11-28 09:55:17.986319885 +0000 UTC m=+0.091954478 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:55:17 np0005538515.localdomain podman[297171]: 2025-11-28 09:55:17.995244078 +0000 UTC m=+0.100878701 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 09:55:18 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:55:18 np0005538515.localdomain podman[297173]: 2025-11-28 09:55:18.047405561 +0000 UTC m=+0.143703949 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 28 09:55:18 np0005538515.localdomain podman[297173]: 2025-11-28 09:55:18.052844897 +0000 UTC m=+0.149143245 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:55:18 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:55:18 np0005538515.localdomain podman[297178]: 2025-11-28 09:55:18.103229725 +0000 UTC m=+0.196755758 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:55:18 np0005538515.localdomain podman[297178]: 2025-11-28 09:55:18.11091873 +0000 UTC m=+0.204444813 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:55:18 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:55:18 np0005538515.localdomain podman[297172]: 2025-11-28 09:55:18.158324537 +0000 UTC m=+0.254787900 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 09:55:18 np0005538515.localdomain podman[297172]: 2025-11-28 09:55:18.254176253 +0000 UTC m=+0.350639606 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 09:55:18 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e90 e90: 6 total, 6 up, 6 in
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: Activating manager daemon np0005538513.dsfdlx
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: Manager daemon np0005538511.fvuybw is unresponsive, replacing it with standby daemon np0005538513.dsfdlx
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: mgrmap e32: np0005538513.dsfdlx(active, starting, since 0.0452346s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:55:19 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain sshd[297251]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:55:20 np0005538515.localdomain sshd[297251]: Accepted publickey for ceph-admin from 192.168.122.106 port 37562 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:55:20 np0005538515.localdomain systemd-logind[763]: New session 67 of user ceph-admin.
Nov 28 09:55:20 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 1002.
Nov 28 09:55:20 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 28 09:55:20 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 28 09:55:20 np0005538515.localdomain systemd[1]: Starting User Manager for UID 1002...
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Queued start job for default target Main User Target.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Created slice User Application Slice.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Reached target Paths.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Reached target Timers.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Starting D-Bus User Message Bus Socket...
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Starting Create User's Volatile Files and Directories...
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Finished Create User's Volatile Files and Directories.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Reached target Sockets.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Reached target Basic System.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Reached target Main User Target.
Nov 28 09:55:20 np0005538515.localdomain systemd[297255]: Startup finished in 170ms.
Nov 28 09:55:20 np0005538515.localdomain systemd[1]: Started User Manager for UID 1002.
Nov 28 09:55:20 np0005538515.localdomain systemd[1]: Started Session 67 of User ceph-admin.
Nov 28 09:55:20 np0005538515.localdomain sshd[297251]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:55:20 np0005538515.localdomain sudo[297271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:20 np0005538515.localdomain sudo[297271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:20 np0005538515.localdomain sudo[297271]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:20 np0005538515.localdomain sudo[297289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:55:20 np0005538515.localdomain sudo[297289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: Manager daemon np0005538513.dsfdlx is now available
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"}]': finished
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"}]': finished
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538513.dsfdlx/mirror_snapshot_schedule"} : dispatch
Nov 28 09:55:20 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538513.dsfdlx/trash_purge_schedule"} : dispatch
Nov 28 09:55:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:55:21 np0005538515.localdomain podman[297352]: 2025-11-28 09:55:21.328128684 +0000 UTC m=+0.083272794 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:55:21 np0005538515.localdomain podman[297352]: 2025-11-28 09:55:21.338386197 +0000 UTC m=+0.093530257 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:55:21 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:55:21 np0005538515.localdomain podman[297401]: 2025-11-28 09:55:21.609190875 +0000 UTC m=+0.096401394 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:55:21 np0005538515.localdomain podman[297401]: 2025-11-28 09:55:21.729542939 +0000 UTC m=+0.216753458 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, RELEASE=main, name=rhceph, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64)
Nov 28 09:55:21 np0005538515.localdomain ceph-mon[287604]: mgrmap e33: np0005538513.dsfdlx(active, since 1.13065s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:21 np0005538515.localdomain ceph-mon[287604]: from='client.44366 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:21 np0005538515.localdomain ceph-mon[287604]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 09:55:21 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 09:55:22 np0005538515.localdomain sudo[297289]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:22 np0005538515.localdomain sudo[297520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:22 np0005538515.localdomain sudo[297520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:22 np0005538515.localdomain sudo[297520]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:22 np0005538515.localdomain sudo[297538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:55:22 np0005538515.localdomain sudo[297538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:22 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:23 np0005538515.localdomain sudo[297538]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:23 np0005538515.localdomain sudo[297587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:23 np0005538515.localdomain sudo[297587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538515.localdomain sudo[297587]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:23 np0005538515.localdomain sudo[297605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:55:23 np0005538515.localdomain sudo[297605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538515.localdomain sudo[297605]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:23 np0005538515.localdomain sudo[297643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:55:23 np0005538515.localdomain sudo[297643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:23 np0005538515.localdomain sudo[297643]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:55:24 np0005538515.localdomain sudo[297661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:55:24 np0005538515.localdomain sudo[297661]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538515.localdomain sudo[297685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297685]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain podman[297678]: 2025-11-28 09:55:24.135021161 +0000 UTC m=+0.098521089 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:55:24 np0005538515.localdomain podman[297678]: 2025-11-28 09:55:24.17233183 +0000 UTC m=+0.135831788 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 09:55:24 np0005538515.localdomain sudo[297711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:24 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:55:24 np0005538515.localdomain sudo[297711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297711]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538515.localdomain sudo[297734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297734]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: mgrmap e34: np0005538513.dsfdlx(active, since 3s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Saving service mon spec with placement label:mon
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:24 np0005538515.localdomain sudo[297768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538515.localdomain sudo[297768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297768]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:24 np0005538515.localdomain sudo[297786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297786]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:55:24 np0005538515.localdomain sudo[297804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297804]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:24 np0005538515.localdomain sudo[297822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297822]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:24 np0005538515.localdomain sudo[297840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297840]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:24 np0005538515.localdomain sudo[297858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297858]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:24 np0005538515.localdomain sudo[297876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297876]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:24 np0005538515.localdomain sudo[297894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:24 np0005538515.localdomain sudo[297894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:24 np0005538515.localdomain sudo[297894]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[297928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:25 np0005538515.localdomain sudo[297928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[297928]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[297946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:25 np0005538515.localdomain sudo[297946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[297946]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[297964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538515.localdomain sudo[297964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[297964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[297982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:55:25 np0005538515.localdomain sudo[297982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[297982]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[298000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:55:25 np0005538515.localdomain sudo[298000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298000]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[298018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538515.localdomain sudo[298018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298018]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:25 np0005538515.localdomain sudo[298036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:25 np0005538515.localdomain sudo[298036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298036]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: from='client.44434 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538515.localdomain sudo[298054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538515.localdomain sudo[298054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298054]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[298088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538515.localdomain sudo[298088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298088]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[298106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:55:25 np0005538515.localdomain sudo[298106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298106]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:25 np0005538515.localdomain sudo[298124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:25 np0005538515.localdomain sudo[298124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:25 np0005538515.localdomain sudo[298124]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:26 np0005538515.localdomain sudo[298142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298142]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:26 np0005538515.localdomain sudo[298160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298160]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538515.localdomain sudo[298178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298178]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:26 np0005538515.localdomain sudo[298196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298196]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538515.localdomain sudo[298214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298214]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538515.localdomain sudo[298248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298248]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain sudo[298266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:55:26 np0005538515.localdomain sudo[298266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298266]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:26 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/363124451' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:55:26 np0005538515.localdomain sudo[298284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:26 np0005538515.localdomain sudo[298284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:26 np0005538515.localdomain sudo[298284]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:27 np0005538515.localdomain sudo[298302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:55:27 np0005538515.localdomain sudo[298302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:27 np0005538515.localdomain sudo[298302]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:55:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:55:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:27 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:28.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:28.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 09:55:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:28.378 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 09:55:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:55:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:55:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:55:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:55:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:55:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19184 "" "Go-http-client/1.1"
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:29 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:30.378 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:30.378 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:30 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:55:31 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:31.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:31.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.200:0/577138193' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:55:32 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:33 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:33.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:33.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:55:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:33.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:55:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:33.335 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:55:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:34.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:34.281 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:34 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5575ffdcef20 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@1(peon) e13  my rank is now 0 (was 1)
Nov 28 09:55:35 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:55:35 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Nov 28 09:55:35 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x55760951e000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: paxos.0).electionLogic(46) init, last seen epoch 46
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : mon.np0005538515 is new leader, mons np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : monmap epoch 13
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:55:34.993934+0000
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 15s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: from='client.27180 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538512"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Remove daemons mon.np0005538512
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514'])
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Removing monitor np0005538512 from monmap...
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports []
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 calling monitor election
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538514 calling monitor election
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538513 calling monitor election
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515 is new leader, mons np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: monmap epoch 13
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: last_changed 2025-11-28T09:55:34.993934+0000
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: min_mon_release 18 (reef)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: election_strategy: 1
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005538515
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mgrmap e34: np0005538513.dsfdlx(active, since 15s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: overall HEALTH_OK
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:35 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.241 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.241 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.272 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.273 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.273 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.273 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.274 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/36973857' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.716 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/758709823' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/36973857' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.953 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.955 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12019MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.955 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:55:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:36.956 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.238 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.238 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.292 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.348 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.349 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.365 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.391 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.419 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/705674516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:55:37 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3157638842' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.930 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.935 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.955 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.956 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:55:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:37.957 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:55:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:38.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:38.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:38.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:38.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.108:0/3157638842' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.107:0/486094489' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='client.? 172.18.0.106:0/4058543900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:38 np0005538515.localdomain podman[298364]: 2025-11-28 09:55:38.988326996 +0000 UTC m=+0.091258507 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:55:38 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:39 np0005538515.localdomain podman[298364]: 2025-11-28 09:55:39.001401406 +0000 UTC m=+0.104332917 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 09:55:39 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:55:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:55:39.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: Removed label mon from host np0005538512.localdomain
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:39 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:40 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:41 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:41 np0005538515.localdomain sudo[298384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:41 np0005538515.localdomain sudo[298384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:41 np0005538515.localdomain sudo[298384]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:41 np0005538515.localdomain sudo[298402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:41 np0005538515.localdomain sudo[298402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 2025-11-28 09:55:42.28838688 +0000 UTC m=+0.076314491 container create 44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_driscoll, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, release=553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:55:42 np0005538515.localdomain systemd[1]: Started libpod-conmon-44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc.scope.
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 2025-11-28 09:55:42.256673792 +0000 UTC m=+0.044601443 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:42 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 2025-11-28 09:55:42.378221652 +0000 UTC m=+0.166149283 container init 44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_driscoll, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, version=7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 2025-11-28 09:55:42.390251919 +0000 UTC m=+0.178179530 container start 44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_driscoll, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, com.redhat.component=rhceph-container, release=553, name=rhceph, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 2025-11-28 09:55:42.390998773 +0000 UTC m=+0.178926384 container attach 44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_driscoll, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7)
Nov 28 09:55:42 np0005538515.localdomain boring_driscoll[298452]: 167 167
Nov 28 09:55:42 np0005538515.localdomain systemd[1]: libpod-44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc.scope: Deactivated successfully.
Nov 28 09:55:42 np0005538515.localdomain podman[298437]: 2025-11-28 09:55:42.39517402 +0000 UTC m=+0.183101731 container died 44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_driscoll, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Nov 28 09:55:42 np0005538515.localdomain podman[298457]: 2025-11-28 09:55:42.488956864 +0000 UTC m=+0.085553464 container remove 44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_driscoll, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, release=553, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:55:42 np0005538515.localdomain systemd[1]: libpod-conmon-44e2ec4883636c5d3917054ce2eb1bb43265bd75b4b1c876de4ee5bbc3af64cc.scope: Deactivated successfully.
Nov 28 09:55:42 np0005538515.localdomain sudo[298402]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:42 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:42 np0005538515.localdomain sudo[298475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:42 np0005538515.localdomain sudo[298475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:42 np0005538515.localdomain sudo[298475]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:42 np0005538515.localdomain sudo[298493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:42 np0005538515.localdomain sudo[298493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain podman[298527]: 2025-11-28 09:55:43.217105695 +0000 UTC m=+0.081228301 container create 4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_lumiere, release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7)
Nov 28 09:55:43 np0005538515.localdomain systemd[1]: Started libpod-conmon-4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb.scope.
Nov 28 09:55:43 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:43 np0005538515.localdomain podman[298527]: 2025-11-28 09:55:43.281249313 +0000 UTC m=+0.145371919 container init 4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_lumiere, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 28 09:55:43 np0005538515.localdomain podman[298527]: 2025-11-28 09:55:43.186737767 +0000 UTC m=+0.050860383 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:43 np0005538515.localdomain podman[298527]: 2025-11-28 09:55:43.29198476 +0000 UTC m=+0.156107366 container start 4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_lumiere, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, release=553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main)
Nov 28 09:55:43 np0005538515.localdomain podman[298527]: 2025-11-28 09:55:43.292209447 +0000 UTC m=+0.156332053 container attach 4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_lumiere, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, name=rhceph, release=553, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7)
Nov 28 09:55:43 np0005538515.localdomain magical_lumiere[298541]: 167 167
Nov 28 09:55:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-02efe27f3da46599feb6ac6679f8631ae01615f76ef6933d3ad577363ed5c130-merged.mount: Deactivated successfully.
Nov 28 09:55:43 np0005538515.localdomain systemd[1]: libpod-4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb.scope: Deactivated successfully.
Nov 28 09:55:43 np0005538515.localdomain podman[298527]: 2025-11-28 09:55:43.301308566 +0000 UTC m=+0.165431172 container died 4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_lumiere, vcs-type=git, RELEASE=main, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:55:43 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-1c35d031be3c1f3ce2e41b0882ee9864fa3d2667397f4dc81097bbe536f56f13-merged.mount: Deactivated successfully.
Nov 28 09:55:43 np0005538515.localdomain podman[298546]: 2025-11-28 09:55:43.402699191 +0000 UTC m=+0.092864366 container remove 4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_lumiere, release=553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, architecture=x86_64, name=rhceph, build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main)
Nov 28 09:55:43 np0005538515.localdomain systemd[1]: libpod-conmon-4cfdfc371ff56dc8b0761658dea6bfed389b9b3576e9e8e90f62fd124a7d44eb.scope: Deactivated successfully.
Nov 28 09:55:43 np0005538515.localdomain sudo[298493]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:43 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:43 np0005538515.localdomain sudo[298570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:43 np0005538515.localdomain sudo[298570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:43 np0005538515.localdomain sudo[298570]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:43 np0005538515.localdomain sudo[298588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:43 np0005538515.localdomain sudo[298588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: Removed label mgr from host np0005538512.localdomain
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:44 np0005538515.localdomain podman[298622]: 2025-11-28 09:55:44.233020691 +0000 UTC m=+0.077366513 container create 59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_khayyam, release=553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True)
Nov 28 09:55:44 np0005538515.localdomain systemd[1]: Started libpod-conmon-59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1.scope.
Nov 28 09:55:44 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:44 np0005538515.localdomain podman[298622]: 2025-11-28 09:55:44.290899978 +0000 UTC m=+0.135245810 container init 59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_khayyam, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.component=rhceph-container, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:55:44 np0005538515.localdomain podman[298622]: 2025-11-28 09:55:44.200650613 +0000 UTC m=+0.044996475 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:44 np0005538515.localdomain systemd[1]: tmp-crun.jfq0cQ.mount: Deactivated successfully.
Nov 28 09:55:44 np0005538515.localdomain podman[298622]: 2025-11-28 09:55:44.307391342 +0000 UTC m=+0.151737174 container start 59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_khayyam, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:55:44 np0005538515.localdomain podman[298622]: 2025-11-28 09:55:44.307852616 +0000 UTC m=+0.152198448 container attach 59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_khayyam, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, release=553, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Nov 28 09:55:44 np0005538515.localdomain pedantic_khayyam[298637]: 167 167
Nov 28 09:55:44 np0005538515.localdomain systemd[1]: libpod-59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1.scope: Deactivated successfully.
Nov 28 09:55:44 np0005538515.localdomain podman[298622]: 2025-11-28 09:55:44.311368343 +0000 UTC m=+0.155714175 container died 59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_khayyam, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, ceph=True)
Nov 28 09:55:44 np0005538515.localdomain podman[298642]: 2025-11-28 09:55:44.400391101 +0000 UTC m=+0.082250632 container remove 59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_khayyam, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64)
Nov 28 09:55:44 np0005538515.localdomain systemd[1]: libpod-conmon-59f186d56261c765b19a7c01f132fec690d912ab7a450b130871dba8d939d2c1.scope: Deactivated successfully.
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain sudo[298588]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:44 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:44 np0005538515.localdomain sudo[298667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:44 np0005538515.localdomain sudo[298667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:44 np0005538515.localdomain sudo[298667]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:44 np0005538515.localdomain sudo[298685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:44 np0005538515.localdomain sudo[298685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 2025-11-28 09:55:45.192045521 +0000 UTC m=+0.075587698 container create ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_mendel, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, release=553, GIT_BRANCH=main, vcs-type=git)
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: Started libpod-conmon-ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33.scope.
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 2025-11-28 09:55:45.243485302 +0000 UTC m=+0.127027499 container init ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_mendel, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 2025-11-28 09:55:45.253055373 +0000 UTC m=+0.136597570 container start ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_mendel, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, vcs-type=git, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph)
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 2025-11-28 09:55:45.253358342 +0000 UTC m=+0.136900649 container attach ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_mendel, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Nov 28 09:55:45 np0005538515.localdomain focused_mendel[298734]: 167 167
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: libpod-ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33.scope: Deactivated successfully.
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 2025-11-28 09:55:45.256047895 +0000 UTC m=+0.139590122 container died ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_mendel, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True)
Nov 28 09:55:45 np0005538515.localdomain podman[298719]: 2025-11-28 09:55:45.162637303 +0000 UTC m=+0.046179530 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: tmp-crun.Gn7A0M.mount: Deactivated successfully.
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-356bf8abb1d168d3a6c292063641cc54d7ee0bbfdf795d9250c3691fc3519770-merged.mount: Deactivated successfully.
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ae7bd569944352693d0d16bf51fef07105e9129baf4902bdc9ac56802231dfdf-merged.mount: Deactivated successfully.
Nov 28 09:55:45 np0005538515.localdomain podman[298740]: 2025-11-28 09:55:45.359447342 +0000 UTC m=+0.089561336 container remove ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_mendel, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:55:45 np0005538515.localdomain systemd[1]: libpod-conmon-ddc82d327631c3a0e384271adc277944cdd22a26b3a89fb2d4ef2baa11f19c33.scope: Deactivated successfully.
Nov 28 09:55:45 np0005538515.localdomain sudo[298685]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:45 np0005538515.localdomain sudo[298756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:45 np0005538515.localdomain sudo[298756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:45 np0005538515.localdomain sudo[298756]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:45 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:45 np0005538515.localdomain sudo[298774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:45 np0005538515.localdomain sudo[298774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 2025-11-28 09:55:46.030868421 +0000 UTC m=+0.078925541 container create 1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_dirac, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: Started libpod-conmon-1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f.scope.
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 2025-11-28 09:55:46.096202905 +0000 UTC m=+0.144260025 container init 1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_dirac, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, ceph=True)
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 2025-11-28 09:55:46.000253096 +0000 UTC m=+0.048310236 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 2025-11-28 09:55:46.105194611 +0000 UTC m=+0.153251731 container start 1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_dirac, name=rhceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container)
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 2025-11-28 09:55:46.105468189 +0000 UTC m=+0.153525339 container attach 1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_dirac, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, io.buildah.version=1.33.12, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:55:46 np0005538515.localdomain vigilant_dirac[298823]: 167 167
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: libpod-1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f.scope: Deactivated successfully.
Nov 28 09:55:46 np0005538515.localdomain podman[298808]: 2025-11-28 09:55:46.109801451 +0000 UTC m=+0.157858571 container died 1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_dirac, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, distribution-scope=public)
Nov 28 09:55:46 np0005538515.localdomain podman[298828]: 2025-11-28 09:55:46.21032371 +0000 UTC m=+0.087850203 container remove 1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_dirac, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, architecture=x86_64, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: libpod-conmon-1dba93a69a2672784a118699c4166db7c44dc4b27b71f05c9c4806d65303e21f.scope: Deactivated successfully.
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: Removed label _admin from host np0005538512.localdomain
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:46 np0005538515.localdomain sudo[298774]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9f891d5a2999601500f8e6128c05dd91a0551ae2d99d65e83703de4ab238969c-merged.mount: Deactivated successfully.
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:46 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:46 np0005538515.localdomain sudo[298845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:55:46 np0005538515.localdomain sudo[298845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:46 np0005538515.localdomain sudo[298845]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:46 np0005538515.localdomain sudo[298863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:46 np0005538515.localdomain sudo[298863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 2025-11-28 09:55:46.893430776 +0000 UTC m=+0.075687682 container create 5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_lamport, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, release=553, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: Started libpod-conmon-5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7.scope.
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 2025-11-28 09:55:46.951685165 +0000 UTC m=+0.133942071 container init 5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_lamport, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64)
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 2025-11-28 09:55:46.960861634 +0000 UTC m=+0.143118540 container start 5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_lamport, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, distribution-scope=public)
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 2025-11-28 09:55:46.961310078 +0000 UTC m=+0.143566994 container attach 5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_lamport, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, version=7, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:55:46 np0005538515.localdomain nifty_lamport[298915]: 167 167
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 2025-11-28 09:55:46.864526114 +0000 UTC m=+0.046783040 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:55:46 np0005538515.localdomain systemd[1]: libpod-5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7.scope: Deactivated successfully.
Nov 28 09:55:46 np0005538515.localdomain podman[298899]: 2025-11-28 09:55:46.965505407 +0000 UTC m=+0.147762353 container died 5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_lamport, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, release=553, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 09:55:47 np0005538515.localdomain podman[298920]: 2025-11-28 09:55:47.061148897 +0000 UTC m=+0.085138491 container remove 5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_lamport, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, version=7, GIT_CLEAN=True, name=rhceph, release=553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 09:55:47 np0005538515.localdomain systemd[1]: libpod-conmon-5b30e265da750d685f4166e88773bce32d3929567f421217fad6248a512bc1a7.scope: Deactivated successfully.
Nov 28 09:55:47 np0005538515.localdomain sudo[298863]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-56d6c5a9ba1509de57ef1c1ad10602d79fadf4f08d4bcfd86ec49c81770adc01-merged.mount: Deactivated successfully.
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:48 np0005538515.localdomain sudo[298937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:55:48 np0005538515.localdomain sudo[298937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:55:48 np0005538515.localdomain sudo[298937]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:55:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:55:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:55:48 np0005538515.localdomain sudo[298964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:55:48 np0005538515.localdomain sudo[298964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538515.localdomain sudo[298964]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538515.localdomain podman[298955]: 2025-11-28 09:55:48.798507459 +0000 UTC m=+0.092886046 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 09:55:48 np0005538515.localdomain podman[298955]: 2025-11-28 09:55:48.812492066 +0000 UTC m=+0.106870663 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:55:48 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:55:48 np0005538515.localdomain sudo[299020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:48 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:48 np0005538515.localdomain sudo[299020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538515.localdomain sudo[299020]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538515.localdomain podman[298956]: 2025-11-28 09:55:48.903312189 +0000 UTC m=+0.197302995 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 09:55:48 np0005538515.localdomain podman[298956]: 2025-11-28 09:55:48.943400363 +0000 UTC m=+0.237391219 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 09:55:48 np0005538515.localdomain sudo[299046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:48 np0005538515.localdomain sudo[299046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:48 np0005538515.localdomain sudo[299046]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:48 np0005538515.localdomain podman[298957]: 2025-11-28 09:55:48.955934595 +0000 UTC m=+0.246006632 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 09:55:48 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:55:49 np0005538515.localdomain podman[298958]: 2025-11-28 09:55:49.010269005 +0000 UTC m=+0.298334550 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:55:49 np0005538515.localdomain podman[298958]: 2025-11-28 09:55:49.019350992 +0000 UTC m=+0.307416547 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:55:49 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:55:49 np0005538515.localdomain sudo[299081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain sudo[299081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain podman[298957]: 2025-11-28 09:55:49.037807175 +0000 UTC m=+0.327879232 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent)
Nov 28 09:55:49 np0005538515.localdomain sudo[299081]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:55:49 np0005538515.localdomain sudo[299126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain sudo[299126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299126]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain sudo[299144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299144]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538515.localdomain sudo[299162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:55:49 np0005538515.localdomain sudo[299162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299162]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:49 np0005538515.localdomain sudo[299180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299180]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:55:49 np0005538515.localdomain sudo[299198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299198]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain sudo[299216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299216]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:55:49 np0005538515.localdomain sudo[299234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299234]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain sudo[299252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299252]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain sudo[299286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299286]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain sudo[299304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:55:49 np0005538515.localdomain sudo[299304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299304]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:55:49 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:49 np0005538515.localdomain sudo[299322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:49 np0005538515.localdomain sudo[299322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:49 np0005538515.localdomain sudo[299322]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.028808) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750028897, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2742, "num_deletes": 255, "total_data_size": 7857638, "memory_usage": 8325856, "flush_reason": "Manual Compaction"}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750059933, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 5010030, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19246, "largest_seqno": 21983, "table_properties": {"data_size": 4998674, "index_size": 7029, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 29004, "raw_average_key_size": 22, "raw_value_size": 4973877, "raw_average_value_size": 3861, "num_data_blocks": 305, "num_entries": 1288, "num_filter_entries": 1288, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323655, "oldest_key_time": 1764323655, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 31199 microseconds, and 11858 cpu microseconds.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.060019) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 5010030 bytes OK
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.060051) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.062024) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.062052) EVENT_LOG_v1 {"time_micros": 1764323750062044, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.062118) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 7844585, prev total WAL file size 7876723, number of live WAL files 2.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.064103) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4892KB)], [30(15MB)]
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750064164, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 20816740, "oldest_snapshot_seqno": -1}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11136 keys, 17521930 bytes, temperature: kUnknown
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750216310, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17521930, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17457349, "index_size": 35680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297892, "raw_average_key_size": 26, "raw_value_size": 17266374, "raw_average_value_size": 1550, "num_data_blocks": 1369, "num_entries": 11136, "num_filter_entries": 11136, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.216611) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17521930 bytes
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.218794) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.7 rd, 115.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 15.1 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 11687, records dropped: 551 output_compression: NoCompression
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.218834) EVENT_LOG_v1 {"time_micros": 1764323750218822, "job": 16, "event": "compaction_finished", "compaction_time_micros": 152249, "compaction_time_cpu_micros": 47102, "output_level": 6, "num_output_files": 1, "total_output_size": 17521930, "num_input_records": 11687, "num_output_records": 11136, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750219690, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750221277, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.063969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.221327) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.221335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.221338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.221341) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.221344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.570987) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750571018, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 316, "num_deletes": 253, "total_data_size": 112396, "memory_usage": 120600, "flush_reason": "Manual Compaction"}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750574160, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 112571, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21984, "largest_seqno": 22299, "table_properties": {"data_size": 110437, "index_size": 309, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4933, "raw_average_key_size": 16, "raw_value_size": 106110, "raw_average_value_size": 359, "num_data_blocks": 11, "num_entries": 295, "num_filter_entries": 295, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323750, "oldest_key_time": 1764323750, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 3231 microseconds, and 1170 cpu microseconds.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.574212) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 112571 bytes OK
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.574235) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575961) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575990) EVENT_LOG_v1 {"time_micros": 1764323750575981, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576011) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 110099, prev total WAL file size 110099, number of live WAL files 2.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576476) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323935' seq:72057594037927935, type:22 .. '6B760031353439' seq:0, type:0; will stop at (end)
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(109KB)], [33(16MB)]
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750576514, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 17634501, "oldest_snapshot_seqno": -1}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 10908 keys, 16607435 bytes, temperature: kUnknown
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750684841, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 16607435, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16545608, "index_size": 33438, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 294665, "raw_average_key_size": 27, "raw_value_size": 16359786, "raw_average_value_size": 1499, "num_data_blocks": 1257, "num_entries": 10908, "num_filter_entries": 10908, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323465, "oldest_key_time": 0, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5fedd929-5f7c-4f1d-86e7-c95af9bc6d32", "db_session_id": "18KD68ISQNH5R0YWI96C", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.685472) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 16607435 bytes
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.687315) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.5 rd, 153.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 16.7 +0.0 blob) out(15.8 +0.0 blob), read-write-amplify(304.2) write-amplify(147.5) OK, records in: 11431, records dropped: 523 output_compression: NoCompression
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.687351) EVENT_LOG_v1 {"time_micros": 1764323750687333, "job": 18, "event": "compaction_finished", "compaction_time_micros": 108491, "compaction_time_cpu_micros": 46361, "output_level": 6, "num_output_files": 1, "total_output_size": 16607435, "num_input_records": 11431, "num_output_records": 10908, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750687545, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750690350, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.690451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.690460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.690463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.690466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ceph-mon[287604]: rocksdb: (Original Log Time 2025/11/28-09:55:50.690468) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:55:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:55:50.838 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:55:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:55:50.838 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:55:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:55:50.838 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:55:51 np0005538515.localdomain ceph-mon[287604]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:51 np0005538515.localdomain ceph-mon[287604]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:55:51 np0005538515.localdomain ceph-mon[287604]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:51 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:55:51 np0005538515.localdomain podman[299340]: 2025-11-28 09:55:51.978743634 +0000 UTC m=+0.084434478 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:55:52 np0005538515.localdomain podman[299340]: 2025-11-28 09:55:52.018477097 +0000 UTC m=+0.124167931 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:55:52 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765]
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"} v 0)
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"} : dispatch
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"}]': finished
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:55:52 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:52 np0005538515.localdomain sudo[299363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:55:52 np0005538515.localdomain sudo[299363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:52 np0005538515.localdomain sudo[299363]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: Removing key for mgr.np0005538512.zyhkxs
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"} : dispatch
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"}]': finished
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:53 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:54 np0005538515.localdomain sudo[299381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:55:54 np0005538515.localdomain sudo[299381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:55:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:55:54 np0005538515.localdomain sudo[299381]: pam_unix(sudo:session): session closed for user root
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:54 np0005538515.localdomain podman[299399]: 2025-11-28 09:55:54.656650264 +0000 UTC m=+0.080647904 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 28 09:55:54 np0005538515.localdomain podman[299399]: 2025-11-28 09:55:54.696597613 +0000 UTC m=+0.120595273 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 09:55:54 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:55:54 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0)
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0)
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:55 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:56 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:57 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:55:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:55:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: from='client.34545 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538512.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: Added label _no_schedule to host np0005538512.localdomain
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:58 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:55:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:55:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:55:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:55:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:55:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19180 "" "Go-http-client/1.1"
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"} v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"} : dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"}]': finished
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538512.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"} : dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"}]': finished
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:55:59 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538512.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: Removed host np0005538512.localdomain
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:00 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.624 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:56:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:01 np0005538515.localdomain sshd[299419]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:56:01 np0005538515.localdomain sshd[299419]: Accepted publickey for tripleo-admin from 192.168.122.11 port 58352 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 09:56:01 np0005538515.localdomain systemd-logind[763]: New session 69 of user tripleo-admin.
Nov 28 09:56:01 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 1003.
Nov 28 09:56:01 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 28 09:56:01 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 28 09:56:01 np0005538515.localdomain systemd[1]: Starting User Manager for UID 1003...
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:56:01 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Queued start job for default target Main User Target.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Created slice User Application Slice.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Reached target Paths.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Reached target Timers.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Starting D-Bus User Message Bus Socket...
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Starting Create User's Volatile Files and Directories...
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Listening on D-Bus User Message Bus Socket.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Reached target Sockets.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Finished Create User's Volatile Files and Directories.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Reached target Basic System.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Reached target Main User Target.
Nov 28 09:56:01 np0005538515.localdomain systemd[299423]: Startup finished in 162ms.
Nov 28 09:56:01 np0005538515.localdomain systemd[1]: Started User Manager for UID 1003.
Nov 28 09:56:01 np0005538515.localdomain systemd[1]: Started Session 69 of User tripleo-admin.
Nov 28 09:56:01 np0005538515.localdomain sshd[299419]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Nov 28 09:56:01 np0005538515.localdomain sudo[299440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:01 np0005538515.localdomain sudo[299440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:01 np0005538515.localdomain sudo[299440]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:02 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:02 np0005538515.localdomain sudo[299582]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahbcxyvajooripgvbuutcjlqckfiyruh ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323761.884055-63149-89602299437877/AnsiballZ_lineinfile.py
Nov 28 09:56:02 np0005538515.localdomain sudo[299582]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:56:02 np0005538515.localdomain python3[299584]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.105/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 09:56:02 np0005538515.localdomain sudo[299582]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:03 np0005538515.localdomain sudo[299728]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhqktyzvwxgriregaczxszexgtnogovy ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323762.7114203-63166-160785021466859/AnsiballZ_command.py
Nov 28 09:56:03 np0005538515.localdomain sudo[299728]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:56:03 np0005538515.localdomain ceph-mon[287604]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:03 np0005538515.localdomain python3[299730]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.105/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:56:03 np0005538515.localdomain sudo[299728]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:03 np0005538515.localdomain sudo[299873]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyvpdicjklyrjmmapvqvismfbiqfmzbu ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764323763.5432606-63177-206973118604983/AnsiballZ_command.py
Nov 28 09:56:03 np0005538515.localdomain sudo[299873]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Nov 28 09:56:04 np0005538515.localdomain python3[299875]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.105 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 09:56:04 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:56:05 np0005538515.localdomain ceph-mon[287604]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:05 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:05 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:06 np0005538515.localdomain sudo[299873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:06 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:06 np0005538515.localdomain ceph-mon[287604]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:56:07 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:07 np0005538515.localdomain sudo[299894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:07 np0005538515.localdomain sudo[299894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:07 np0005538515.localdomain sudo[299894]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: Saving service mon spec with placement label:mon
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:08 np0005538515.localdomain ceph-mon[287604]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:09 np0005538515.localdomain ceph-mon[287604]: from='client.44500 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:56:09 np0005538515.localdomain systemd[1]: tmp-crun.zqJkjG.mount: Deactivated successfully.
Nov 28 09:56:09 np0005538515.localdomain podman[299912]: 2025-11-28 09:56:09.992369907 +0000 UTC m=+0.096436815 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 28 09:56:10 np0005538515.localdomain podman[299912]: 2025-11-28 09:56:10.036451113 +0000 UTC m=+0.140518071 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7)
Nov 28 09:56:10 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e13 handle_command mon_command({"prefix": "mon rm", "name": "np0005538515"} v 0)
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon rm", "name": "np0005538515"} : dispatch
Nov 28 09:56:10 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5575ffdcef20 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Nov 28 09:56:10 np0005538515.localdomain ceph-mon[287604]: mon.np0005538515@0(leader) e14  removed from monmap, suicide.
Nov 28 09:56:10 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 28 09:56:10 np0005538515.localdomain ceph-mgr[286188]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 28 09:56:10 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5575ffdcf600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Nov 28 09:56:10 np0005538515.localdomain sudo[299933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:10 np0005538515.localdomain sudo[299933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:10 np0005538515.localdomain sudo[299933]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:10 np0005538515.localdomain podman[299947]: 2025-11-28 09:56:10.774332712 +0000 UTC m=+0.055189956 container died a20fbb4af4b220d896878368414f00f458b36bd01f689cea18d6929c00ea38cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553)
Nov 28 09:56:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-04c7a1767f0a2009eee63539253f3f9f0fbc337fc72bb39a68aab86969f5acee-merged.mount: Deactivated successfully.
Nov 28 09:56:10 np0005538515.localdomain podman[299947]: 2025-11-28 09:56:10.81061089 +0000 UTC m=+0.091468104 container remove a20fbb4af4b220d896878368414f00f458b36bd01f689cea18d6929c00ea38cf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, distribution-scope=public, name=rhceph, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True)
Nov 28 09:56:10 np0005538515.localdomain sudo[299965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1 --name mon.np0005538515 --force
Nov 28 09:56:10 np0005538515.localdomain sudo[299965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:10 np0005538515.localdomain ceph-osd[32393]: --2- [v2:172.18.0.108:6800/2860773178,v1:172.18.0.108:6801/2860773178] >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55ab8edee400 0x55ab8d898000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 28 09:56:10 np0005538515.localdomain sudo[299998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:10 np0005538515.localdomain sudo[299998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:10 np0005538515.localdomain sudo[299998]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:11 np0005538515.localdomain sudo[300016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300016]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538515.localdomain sudo[300042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300042]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:11 np0005538515.localdomain sudo[300084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300084]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538515.localdomain sudo[300109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300109]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538515.localdomain sudo[300168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300168]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:11 np0005538515.localdomain sudo[300189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300189]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:56:11 np0005538515.localdomain sudo[300219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300219]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain sudo[300238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:11 np0005538515.localdomain sudo[300238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300238]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1@mon.np0005538515.service: Deactivated successfully.
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: Stopped Ceph mon.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1@mon.np0005538515.service: Consumed 12.041s CPU time.
Nov 28 09:56:11 np0005538515.localdomain sudo[300256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:11 np0005538515.localdomain sudo[300256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:11 np0005538515.localdomain sudo[300256]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:56:11 np0005538515.localdomain sudo[300278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:11 np0005538515.localdomain systemd-rc-local-generator[300319]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:56:11 np0005538515.localdomain systemd-sysv-generator[300324]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:11 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:12 np0005538515.localdomain sudo[300278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300278]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[299965]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[300332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:12 np0005538515.localdomain sudo[300332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300332]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[300350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:12 np0005538515.localdomain sudo[300350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300350]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[300384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:12 np0005538515.localdomain sudo[300384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300384]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[300402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:12 np0005538515.localdomain sudo[300402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300402]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[300420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:12 np0005538515.localdomain sudo[300420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300420]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:12 np0005538515.localdomain sudo[300438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:12 np0005538515.localdomain sudo[300438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:12 np0005538515.localdomain sudo[300438]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:56:18 np0005538515.localdomain systemd[1]: tmp-crun.NOdvD3.mount: Deactivated successfully.
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:56:19 np0005538515.localdomain podman[300456]: 2025-11-28 09:56:19.007139766 +0000 UTC m=+0.106860683 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:56:19 np0005538515.localdomain podman[300472]: 2025-11-28 09:56:19.080398704 +0000 UTC m=+0.070491204 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:56:19 np0005538515.localdomain podman[300456]: 2025-11-28 09:56:19.098032862 +0000 UTC m=+0.197753779 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:56:19 np0005538515.localdomain podman[300472]: 2025-11-28 09:56:19.122484778 +0000 UTC m=+0.112577288 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:56:19 np0005538515.localdomain podman[300497]: 2025-11-28 09:56:19.180819539 +0000 UTC m=+0.072907087 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:56:19 np0005538515.localdomain podman[300495]: 2025-11-28 09:56:19.250936409 +0000 UTC m=+0.143250074 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 09:56:19 np0005538515.localdomain podman[300495]: 2025-11-28 09:56:19.259393878 +0000 UTC m=+0.151707513 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:56:19 np0005538515.localdomain podman[300497]: 2025-11-28 09:56:19.316221763 +0000 UTC m=+0.208309341 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:56:19 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:56:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:20.785 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:22 np0005538515.localdomain sudo[300542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:22 np0005538515.localdomain sudo[300542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:56:22 np0005538515.localdomain sudo[300542]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:22 np0005538515.localdomain sudo[300566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:22 np0005538515.localdomain sudo[300566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:22 np0005538515.localdomain systemd[1]: tmp-crun.7VGf4I.mount: Deactivated successfully.
Nov 28 09:56:22 np0005538515.localdomain podman[300560]: 2025-11-28 09:56:22.820497071 +0000 UTC m=+0.089554835 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:56:22 np0005538515.localdomain podman[300560]: 2025-11-28 09:56:22.832497528 +0000 UTC m=+0.101555262 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:56:22 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 2025-11-28 09:56:23.276947877 +0000 UTC m=+0.081620993 container create e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_kirch, GIT_BRANCH=main, distribution-scope=public, version=7, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:56:23 np0005538515.localdomain systemd[1]: Started libpod-conmon-e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd.scope.
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 2025-11-28 09:56:23.24527117 +0000 UTC m=+0.049944326 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:23 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 2025-11-28 09:56:23.364625464 +0000 UTC m=+0.169298590 container init e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_kirch, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, release=553, vendor=Red Hat, Inc., GIT_CLEAN=True)
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 2025-11-28 09:56:23.376547438 +0000 UTC m=+0.181220564 container start e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_kirch, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-09-24T08:57:55, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 2025-11-28 09:56:23.377257169 +0000 UTC m=+0.181930285 container attach e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_kirch, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=)
Nov 28 09:56:23 np0005538515.localdomain romantic_kirch[300630]: 167 167
Nov 28 09:56:23 np0005538515.localdomain systemd[1]: libpod-e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd.scope: Deactivated successfully.
Nov 28 09:56:23 np0005538515.localdomain podman[300615]: 2025-11-28 09:56:23.381893141 +0000 UTC m=+0.186566277 container died e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_kirch, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, io.openshift.expose-services=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 28 09:56:23 np0005538515.localdomain podman[300635]: 2025-11-28 09:56:23.48537212 +0000 UTC m=+0.088193184 container remove e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_kirch, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, release=553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Nov 28 09:56:23 np0005538515.localdomain systemd[1]: libpod-conmon-e42102336fc1acf5bf48ef4e3d2fcf4221a4f50af373c5c14c0bc253f90e4ccd.scope: Deactivated successfully.
Nov 28 09:56:23 np0005538515.localdomain sudo[300566]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:23 np0005538515.localdomain sudo[300651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:23 np0005538515.localdomain sudo[300651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:23 np0005538515.localdomain sudo[300651]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:23 np0005538515.localdomain sudo[300655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:23 np0005538515.localdomain sudo[300655]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:23 np0005538515.localdomain sudo[300655]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:23 np0005538515.localdomain sudo[300687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:23 np0005538515.localdomain sudo[300687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:23 np0005538515.localdomain sudo[300694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:23 np0005538515.localdomain sudo[300694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-23505fcdc6081ff9353caafebf0151945c6c8b473f91d1a30963d8e96ed8a8ff-merged.mount: Deactivated successfully.
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 2025-11-28 09:56:24.197556073 +0000 UTC m=+0.075506326 container create eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_moser, build-date=2025-09-24T08:57:55, ceph=True, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: Started libpod-conmon-eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea.scope.
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 2025-11-28 09:56:24.251646726 +0000 UTC m=+0.129596979 container init eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_moser, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 2025-11-28 09:56:24.2596749 +0000 UTC m=+0.137625133 container start eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_moser, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7)
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 2025-11-28 09:56:24.260043272 +0000 UTC m=+0.137993595 container attach eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_moser, release=553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., version=7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Nov 28 09:56:24 np0005538515.localdomain vigilant_moser[300768]: 167 167
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: libpod-eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea.scope: Deactivated successfully.
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 2025-11-28 09:56:24.263409735 +0000 UTC m=+0.141360098 container died eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_moser, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, GIT_BRANCH=main, name=rhceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc.)
Nov 28 09:56:24 np0005538515.localdomain podman[300753]: 2025-11-28 09:56:24.166425983 +0000 UTC m=+0.044376366 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:24 np0005538515.localdomain podman[300773]: 2025-11-28 09:56:24.333972889 +0000 UTC m=+0.067023488 container remove eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_moser, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_BRANCH=main, vcs-type=git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True)
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: libpod-conmon-eb3b79bfca3fe0dd2f5617abe87251038f6b3fc75f15452003b4a70bfb9d1fea.scope: Deactivated successfully.
Nov 28 09:56:24 np0005538515.localdomain sudo[300694]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:24 np0005538515.localdomain sudo[300800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:24 np0005538515.localdomain sudo[300800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:24 np0005538515.localdomain sudo[300800]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:24 np0005538515.localdomain sudo[300832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:24 np0005538515.localdomain sudo[300832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 2025-11-28 09:56:24.795989935 +0000 UTC m=+0.077744505 container create e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_williams, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: tmp-crun.ppFrcD.mount: Deactivated successfully.
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7c93fe0f4b18fc1eb31bd126ceaf6e89675e8948f2a834fc31751de85703b260-merged.mount: Deactivated successfully.
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: Started libpod-conmon-e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1.scope.
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 2025-11-28 09:56:24.763983037 +0000 UTC m=+0.045737597 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 2025-11-28 09:56:24.887790658 +0000 UTC m=+0.169545218 container init e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_williams, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.33.12)
Nov 28 09:56:24 np0005538515.localdomain naughty_williams[300883]: 167 167
Nov 28 09:56:24 np0005538515.localdomain systemd[1]: libpod-e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1.scope: Deactivated successfully.
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 2025-11-28 09:56:24.902115395 +0000 UTC m=+0.183869965 container start e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_williams, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, release=553)
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 2025-11-28 09:56:24.903112515 +0000 UTC m=+0.184867125 container attach e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_williams, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, name=rhceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7)
Nov 28 09:56:24 np0005538515.localdomain podman[300862]: 2025-11-28 09:56:24.905860589 +0000 UTC m=+0.187615179 container died e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_williams, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, release=553, build-date=2025-09-24T08:57:55, vcs-type=git, architecture=x86_64)
Nov 28 09:56:24 np0005538515.localdomain podman[300877]: 2025-11-28 09:56:24.984751007 +0000 UTC m=+0.139618833 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 09:56:25 np0005538515.localdomain podman[300892]: 2025-11-28 09:56:25.042498141 +0000 UTC m=+0.130882227 container remove e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_williams, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-09-24T08:57:55, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: libpod-conmon-e3833cc75ac3e5cdbca0624e2fd30fa9a165398b3d430dfb3ab1f260ea7241e1.scope: Deactivated successfully.
Nov 28 09:56:25 np0005538515.localdomain podman[300877]: 2025-11-28 09:56:25.073800746 +0000 UTC m=+0.228668572 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 2025-11-28 09:56:25.149692853 +0000 UTC m=+0.067124690 container create 27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_nobel, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, ceph=True)
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: Started libpod-conmon-27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e.scope.
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:25 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62f7d916c15565b1c0e2f8063d0e3902e0ff4155098303eb460f7197bc598bec/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:25 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62f7d916c15565b1c0e2f8063d0e3902e0ff4155098303eb460f7197bc598bec/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:25 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62f7d916c15565b1c0e2f8063d0e3902e0ff4155098303eb460f7197bc598bec/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:25 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62f7d916c15565b1c0e2f8063d0e3902e0ff4155098303eb460f7197bc598bec/merged/var/lib/ceph/mon/ceph-np0005538515 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 2025-11-28 09:56:25.206909721 +0000 UTC m=+0.124341548 container init 27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_nobel, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 2025-11-28 09:56:25.215620487 +0000 UTC m=+0.133052334 container start 27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_nobel, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc.)
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 2025-11-28 09:56:25.216093611 +0000 UTC m=+0.133525448 container attach 27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_nobel, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True)
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 2025-11-28 09:56:25.125599267 +0000 UTC m=+0.043031144 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: libpod-27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e.scope: Deactivated successfully.
Nov 28 09:56:25 np0005538515.localdomain podman[300928]: 2025-11-28 09:56:25.311976058 +0000 UTC m=+0.229408175 container died 27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_nobel, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:25 np0005538515.localdomain podman[300972]: 2025-11-28 09:56:25.40670705 +0000 UTC m=+0.084119909 container remove 27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_nobel, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.33.12, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: libpod-conmon-27c18f369328566a4b4030b15f216752f36c58ddfce2edccbef4823adad7530e.scope: Deactivated successfully.
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:56:25 np0005538515.localdomain systemd-rc-local-generator[301009]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:56:25 np0005538515.localdomain systemd-sysv-generator[301016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-15aa363c80d8a8a19ee62ad5a41cdb499331b3bc5f00fd34ba725e34fa5cee77-merged.mount: Deactivated successfully.
Nov 28 09:56:25 np0005538515.localdomain systemd[1]: Reloading.
Nov 28 09:56:25 np0005538515.localdomain systemd-rc-local-generator[301052]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 09:56:25 np0005538515.localdomain systemd-sysv-generator[301058]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: Starting Ceph mon.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 09:56:26 np0005538515.localdomain podman[301116]: 
Nov 28 09:56:26 np0005538515.localdomain podman[301116]: 2025-11-28 09:56:26.560366013 +0000 UTC m=+0.069132192 container create 9d25083944a0821d6aa6b270a71605ed33c6aecefc4f4532b654f92a63e98682 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7)
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: tmp-crun.KHG8rT.mount: Deactivated successfully.
Nov 28 09:56:26 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e10033ff9163dac6e3ff16d3e55af3f313132156d2843a6fbb3e3458282cfc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:26 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e10033ff9163dac6e3ff16d3e55af3f313132156d2843a6fbb3e3458282cfc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:26 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e10033ff9163dac6e3ff16d3e55af3f313132156d2843a6fbb3e3458282cfc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:26 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e10033ff9163dac6e3ff16d3e55af3f313132156d2843a6fbb3e3458282cfc/merged/var/lib/ceph/mon/ceph-np0005538515 supports timestamps until 2038 (0x7fffffff)
Nov 28 09:56:26 np0005538515.localdomain podman[301116]: 2025-11-28 09:56:26.618968382 +0000 UTC m=+0.127734591 container init 9d25083944a0821d6aa6b270a71605ed33c6aecefc4f4532b654f92a63e98682 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:26 np0005538515.localdomain podman[301116]: 2025-11-28 09:56:26.630993629 +0000 UTC m=+0.139759848 container start 9d25083944a0821d6aa6b270a71605ed33c6aecefc4f4532b654f92a63e98682 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538515, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:56:26 np0005538515.localdomain podman[301116]: 2025-11-28 09:56:26.534177993 +0000 UTC m=+0.042944262 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:26 np0005538515.localdomain bash[301116]: 9d25083944a0821d6aa6b270a71605ed33c6aecefc4f4532b654f92a63e98682
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: Started Ceph mon.np0005538515 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: pidfile_write: ignore empty --pid-file
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: load: jerasure load: lrc 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: RocksDB version: 7.9.2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Git sha 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: DB SUMMARY
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: DB Session ID:  7KM5GJAJPD54H6HSLJHG
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: CURRENT file:  CURRENT
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: IDENTITY file:  IDENTITY
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005538515/store.db dir, Total Num: 0, files: 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005538515/store.db: 000004.log size: 636 ; 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                         Options.error_if_exists: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.create_if_missing: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                         Options.paranoid_checks: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                                     Options.env: 0x561ade3479e0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                                Options.info_log: 0x561ae070ad20
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.max_file_opening_threads: 16
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                              Options.statistics: (nil)
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                               Options.use_fsync: 0
Nov 28 09:56:26 np0005538515.localdomain sudo[300687]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.max_log_file_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                         Options.allow_fallocate: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                        Options.use_direct_reads: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.create_missing_column_families: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                              Options.db_log_dir: 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                                 Options.wal_dir: 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.advise_random_on_open: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                    Options.write_buffer_manager: 0x561ae071b540
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                            Options.rate_limiter: (nil)
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.unordered_write: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                               Options.row_cache: None
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                              Options.wal_filter: None
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.allow_ingest_behind: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.two_write_queues: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.manual_wal_flush: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.wal_compression: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.atomic_flush: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.log_readahead_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.allow_data_in_errors: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.db_host_id: __hostname__
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.max_background_jobs: 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.max_background_compactions: -1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.max_subcompactions: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.max_total_wal_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                          Options.max_open_files: -1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                          Options.bytes_per_sync: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:       Options.compaction_readahead_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.max_background_flushes: -1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Compression algorithms supported:
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kZSTD supported: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kXpressCompression supported: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kBZip2Compression supported: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kLZ4Compression supported: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kZlibCompression supported: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kLZ4HCCompression supported: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         kSnappyCompression supported: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005538515/store.db/MANIFEST-000005
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:           Options.merge_operator: 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:        Options.compaction_filter: None
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:        Options.compaction_filter_factory: None
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:  Options.sst_partitioner_factory: None
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561ae070a980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x561ae0707350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:        Options.write_buffer_size: 33554432
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:  Options.max_write_buffer_number: 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.compression: NoCompression
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:       Options.prefix_extractor: nullptr
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.num_levels: 7
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.compression_opts.level: 32767
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:               Options.compression_opts.strategy: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                  Options.compression_opts.enabled: false
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                        Options.arena_block_size: 1048576
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.disable_auto_compactions: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.table_properties_collectors: 
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.inplace_update_support: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                           Options.bloom_locality: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                    Options.max_successive_merges: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.paranoid_file_checks: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.force_consistency_checks: 1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.report_bg_io_stats: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                               Options.ttl: 2592000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                       Options.enable_blob_files: false
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                           Options.min_blob_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                          Options.blob_file_size: 268435456
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb:                Options.blob_file_starting_level: 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005538515/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 75e61b0e-4f73-4b03-b096-8587ecbe7a9f
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323786679350, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323786681587, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323786681701, "job": 1, "event": "recovery_finished"}
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x561ae072ee00
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: DB pointer 0x561ae0824000
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515 does not exist in monmap, will attempt to join an existing cluster
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.72 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.72 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x561ae0707350#2 capacity: 512.00 MB usage: 0.98 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.5e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1,0.77 KB,0.000146031%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: using public_addr v2:172.18.0.105:0/0 -> [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0]
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: starting mon.np0005538515 rank -1 at public addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] at bind addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005538515 fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(???) e0 preinit fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing) e14 sync_obtain_latest_monmap
Nov 28 09:56:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing) e14 sync_obtain_latest_monmap obtained monmap e14
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 2025-11-28 09:56:26.797849223 +0000 UTC m=+0.068827672 container create 7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_panini, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, release=553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: tmp-crun.03VsBs.mount: Deactivated successfully.
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: Started libpod-conmon-7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8.scope.
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 2025-11-28 09:56:26.764049651 +0000 UTC m=+0.035028130 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 2025-11-28 09:56:26.870550452 +0000 UTC m=+0.141528901 container init 7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_panini, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, release=553, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7)
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 2025-11-28 09:56:26.880453985 +0000 UTC m=+0.151432444 container start 7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_panini, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, version=7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, vendor=Red Hat, Inc.)
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 2025-11-28 09:56:26.881016372 +0000 UTC m=+0.151994831 container attach 7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_panini, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Nov 28 09:56:26 np0005538515.localdomain quizzical_panini[301192]: 167 167
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: libpod-7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8.scope: Deactivated successfully.
Nov 28 09:56:26 np0005538515.localdomain podman[301177]: 2025-11-28 09:56:26.886630614 +0000 UTC m=+0.157609083 container died 7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_panini, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vcs-type=git, release=553, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55)
Nov 28 09:56:26 np0005538515.localdomain podman[301197]: 2025-11-28 09:56:26.990732282 +0000 UTC m=+0.091199215 container remove 7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_panini, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 28 09:56:26 np0005538515.localdomain systemd[1]: libpod-conmon-7294beee033c4cb7de008fb12a21fe87cf238a20fe3c578110b2ae5c579af9e8.scope: Deactivated successfully.
Nov 28 09:56:27 np0005538515.localdomain sudo[300832]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).mds e17 new map
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        15
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-11-28T08:07:30.958224+0000
                                                           modified        2025-11-28T09:49:53.259185+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        83
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26449}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26449 members: 26449
                                                           [mds.mds.np0005538514.umgtoy{0:26449} state up:active seq 12 addr [v2:172.18.0.107:6808/1969410151,v1:172.18.0.107:6809/1969410151] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005538513.yljthc{-1:16968} state up:standby seq 1 addr [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005538515.anvatb{-1:26446} state up:standby seq 1 addr [v2:172.18.0.108:6808/2640180,v1:172.18.0.108:6809/2640180] compat {c=[1],r=[1],i=[17ff]}]
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).osd e90 crush map has features 3314933000852226048, adjusting msgr requires
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).osd e90 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).osd e90 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).osd e90 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/758709823' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/36973857' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/705674516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3157638842' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/486094489' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4058543900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removed label mon from host np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removed label mgr from host np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removed label _admin from host np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765]
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing key for mgr.np0005538512.zyhkxs
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"}]': finished
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.34545 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538512.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Added label _no_schedule to host np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538512.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"}]': finished
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538512.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removed host np0005538512.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Saving service mon spec with placement label:mon
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.44500 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538513 calling monitor election
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538514 calling monitor election
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: monmap epoch 14
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: last_changed 2025-11-28T09:56:10.676143+0000
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: min_mon_release 18 (reef)
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: election_strategy: 1
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mgrmap e34: np0005538513.dsfdlx(active, since 51s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: overall HEALTH_OK
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Remove daemons mon.np0005538515
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514'])
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing monitor np0005538515 from monmap...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports []
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='client.54203 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538515.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Deploying daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(synchronizing).paxosservice(auth 1..40) refresh upgraded, format 0 -> 3
Nov 28 09:56:27 np0005538515.localdomain ceph-mgr[286188]: ms_deliver_dispatch: unhandled message 0x5575ffdcf1e0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Nov 28 09:56:27 np0005538515.localdomain sudo[301221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:27 np0005538515.localdomain sudo[301221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:27 np0005538515.localdomain sudo[301221]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:27 np0005538515.localdomain sudo[301239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:27 np0005538515.localdomain sudo[301239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:56:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:56:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:56:27 np0005538515.localdomain podman[301273]: 2025-11-28 09:56:27.799844365 +0000 UTC m=+0.074192006 container create 3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_feistel, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main, version=7, RELEASE=main, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c1ee95d8c81fd8a8b2a249cffc03d35606a4a336f30e2b686796f40cfeb36a7e-merged.mount: Deactivated successfully.
Nov 28 09:56:27 np0005538515.localdomain systemd[1]: Started libpod-conmon-3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365.scope.
Nov 28 09:56:27 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:27 np0005538515.localdomain podman[301273]: 2025-11-28 09:56:27.76989051 +0000 UTC m=+0.044238181 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:27 np0005538515.localdomain podman[301273]: 2025-11-28 09:56:27.876199736 +0000 UTC m=+0.150547377 container init 3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_feistel, version=7, name=rhceph, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, distribution-scope=public, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 28 09:56:27 np0005538515.localdomain podman[301273]: 2025-11-28 09:56:27.886333025 +0000 UTC m=+0.160680666 container start 3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_feistel, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:56:27 np0005538515.localdomain podman[301273]: 2025-11-28 09:56:27.886585783 +0000 UTC m=+0.160933424 container attach 3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_feistel, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55)
Nov 28 09:56:27 np0005538515.localdomain confident_feistel[301288]: 167 167
Nov 28 09:56:27 np0005538515.localdomain systemd[1]: libpod-3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365.scope: Deactivated successfully.
Nov 28 09:56:27 np0005538515.localdomain podman[301273]: 2025-11-28 09:56:27.888570654 +0000 UTC m=+0.162918315 container died 3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_feistel, release=553, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Nov 28 09:56:27 np0005538515.localdomain podman[301293]: 2025-11-28 09:56:27.981627725 +0000 UTC m=+0.085022677 container remove 3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_feistel, GIT_BRANCH=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:27 np0005538515.localdomain systemd[1]: libpod-conmon-3cd7c2118b50f354408439c87ae622519415375b69afe69cf94fa1f69712b365.scope: Deactivated successfully.
Nov 28 09:56:28 np0005538515.localdomain sudo[301239]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a47b285070e973d1757d9cbe6d6f4eb6a282d7fe9e56692c0c44e8c104d648fa-merged.mount: Deactivated successfully.
Nov 28 09:56:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:56:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:56:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:56:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:56:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:56:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19180 "" "Go-http-client/1.1"
Nov 28 09:56:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@-1(probing) e15  my rank is now 2 (was -1)
Nov 28 09:56:29 np0005538515.localdomain ceph-mon[301134]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:56:29 np0005538515.localdomain ceph-mon[301134]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 28 09:56:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:30.264 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:30.264 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:32.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:32.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:56:32 np0005538515.localdomain sudo[301310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:32 np0005538515.localdomain sudo[301310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:32 np0005538515.localdomain sudo[301310]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:32 np0005538515.localdomain sudo[301328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:32 np0005538515.localdomain sudo[301328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:32 np0005538515.localdomain podman[301362]: 2025-11-28 09:56:32.925560988 +0000 UTC m=+0.076806297 container create 2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_northcutt, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., release=553)
Nov 28 09:56:32 np0005538515.localdomain systemd[1]: Started libpod-conmon-2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f.scope.
Nov 28 09:56:32 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:32 np0005538515.localdomain podman[301362]: 2025-11-28 09:56:32.992542073 +0000 UTC m=+0.143787342 container init 2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_northcutt, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, distribution-scope=public, release=553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55)
Nov 28 09:56:32 np0005538515.localdomain podman[301362]: 2025-11-28 09:56:32.893516729 +0000 UTC m=+0.044762028 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:33 np0005538515.localdomain podman[301362]: 2025-11-28 09:56:33.003044043 +0000 UTC m=+0.154289352 container start 2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_northcutt, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, distribution-scope=public, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:33 np0005538515.localdomain podman[301362]: 2025-11-28 09:56:33.003338243 +0000 UTC m=+0.154583522 container attach 2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_northcutt, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True)
Nov 28 09:56:33 np0005538515.localdomain relaxed_northcutt[301378]: 167 167
Nov 28 09:56:33 np0005538515.localdomain systemd[1]: libpod-2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f.scope: Deactivated successfully.
Nov 28 09:56:33 np0005538515.localdomain podman[301362]: 2025-11-28 09:56:33.005738885 +0000 UTC m=+0.156984164 container died 2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_northcutt, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:56:33 np0005538515.localdomain podman[301383]: 2025-11-28 09:56:33.095043642 +0000 UTC m=+0.077657171 container remove 2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_northcutt, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, version=7, io.openshift.expose-services=, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:56:33 np0005538515.localdomain systemd[1]: libpod-conmon-2c537f4269efc82ea4c3a807e32acc951613aa6563df4a9269cc33ad9240dd0f.scope: Deactivated successfully.
Nov 28 09:56:33 np0005538515.localdomain sudo[301328]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:33.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:33.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:56:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:33.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:56:33 np0005538515.localdomain sudo[301400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:33 np0005538515.localdomain sudo[301400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:33 np0005538515.localdomain sudo[301400]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:33 np0005538515.localdomain sudo[301418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:56:33 np0005538515.localdomain sudo[301418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:33.373 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:56:33 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-ccae6c316a7e0b0fbd5cc83757b41b105132aff1424e7216c8dc088cbe0ee340-merged.mount: Deactivated successfully.
Nov 28 09:56:34 np0005538515.localdomain podman[301507]: 2025-11-28 09:56:34.142649696 +0000 UTC m=+0.082933582 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64)
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: log_channel(cluster) log [INF] : mon.np0005538515 calling monitor election
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538513 calling monitor election
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538514 calling monitor election
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: monmap epoch 15
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: min_mon_release 18 (reef)
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: election_strategy: 1
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mgrmap e34: np0005538513.dsfdlx(active, since 72s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: Health check failed: 1/3 mons down, quorum np0005538513,np0005538514 (MON_DOWN)
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]:     mon.np0005538515 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:34 np0005538515.localdomain podman[301507]: 2025-11-28 09:56:34.273838122 +0000 UTC m=+0.214122008 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 28 09:56:34 np0005538515.localdomain ceph-mon[301134]: mgrc update_daemon_metadata mon.np0005538515 metadata {addrs=[v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005538515.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005538515.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 28 09:56:34 np0005538515.localdomain sudo[301418]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:34 np0005538515.localdomain sudo[301627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:34 np0005538515.localdomain sudo[301627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:34 np0005538515.localdomain sudo[301627]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:35 np0005538515.localdomain sudo[301645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:56:35 np0005538515.localdomain sudo[301645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515 calling monitor election
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538513 calling monitor election
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515 calling monitor election
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538513 is new leader, mons np0005538513,np0005538514,np0005538515 in quorum (ranks 0,1,2)
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538514 calling monitor election
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: monmap epoch 15
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: created 2025-11-28T07:45:36.120469+0000
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: min_mon_release 18 (reef)
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: election_strategy: 1
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: osdmap e90: 6 total, 6 up, 6 in
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: mgrmap e34: np0005538513.dsfdlx(active, since 74s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005538513,np0005538514)
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: Cluster is now healthy
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: overall HEALTH_OK
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:35.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:35 np0005538515.localdomain sudo[301645]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:35 np0005538515.localdomain sudo[301695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:35 np0005538515.localdomain sudo[301695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:35 np0005538515.localdomain sudo[301695]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:35 np0005538515.localdomain sudo[301713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:35 np0005538515.localdomain sudo[301713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:35 np0005538515.localdomain sudo[301713]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301731]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:36 np0005538515.localdomain sudo[301749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301749]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301767]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:36 np0005538515.localdomain sudo[301801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301801]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301819]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:56:36 np0005538515.localdomain sudo[301837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301837]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:36 np0005538515.localdomain sudo[301855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301855]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:36 np0005538515.localdomain sudo[301873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301873]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301891]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:36 np0005538515.localdomain sudo[301909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301909]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301927]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:36 np0005538515.localdomain sudo[301961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:36 np0005538515.localdomain sudo[301961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:36 np0005538515.localdomain sudo[301961]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538515.localdomain sudo[301979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:37 np0005538515.localdomain sudo[301979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538515.localdomain sudo[301979]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538515.localdomain sudo[301997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:37 np0005538515.localdomain sudo[301997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538515.localdomain sudo[301997]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:37 np0005538515.localdomain sudo[302015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:37 np0005538515.localdomain sudo[302015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:37 np0005538515.localdomain sudo[302015]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.445 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.445 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.446 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.446 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.446 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:56:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3870773326' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:37.902 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.105 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.107 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12016MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.108 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.108 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.183 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.184 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.214 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3870773326' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/289694563' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:56:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3004239366' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.670 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.676 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.700 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.703 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:56:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:38.703 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1396630985' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/2594589190' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3004239366' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3912028822' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2861436119' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:56:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:39.700 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:56:40.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.2 (monmap changed)...
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:56:40 np0005538515.localdomain podman[302077]: 2025-11-28 09:56:40.981244405 +0000 UTC m=+0.077190767 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:56:40 np0005538515.localdomain podman[302077]: 2025-11-28 09:56:40.995059887 +0000 UTC m=+0.091006279 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Nov 28 09:56:41 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.5 (monmap changed)...
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='client.44535 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: Reconfig service osd.default_drive_group
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e90 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e90 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 e91: 6 total, 6 up, 6 in
Nov 28 09:56:41 np0005538515.localdomain sshd[297251]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:56:41 np0005538515.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Nov 28 09:56:41 np0005538515.localdomain systemd[1]: session-67.scope: Consumed 23.524s CPU time.
Nov 28 09:56:41 np0005538515.localdomain systemd-logind[763]: Session 67 logged out. Waiting for processes to exit.
Nov 28 09:56:41 np0005538515.localdomain systemd-logind[763]: Removed session 67.
Nov 28 09:56:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1019390646 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:41 np0005538515.localdomain sshd[302096]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:56:41 np0005538515.localdomain sshd[302096]: Accepted publickey for ceph-admin from 192.168.122.107 port 45702 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:56:41 np0005538515.localdomain systemd-logind[763]: New session 71 of user ceph-admin.
Nov 28 09:56:41 np0005538515.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Nov 28 09:56:41 np0005538515.localdomain sshd[302096]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:56:42 np0005538515.localdomain sudo[302100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:42 np0005538515.localdomain sudo[302100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:42 np0005538515.localdomain sudo[302100]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:42 np0005538515.localdomain sudo[302118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:56:42 np0005538515.localdomain sudo[302118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: Activating manager daemon np0005538514.djozup
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: osdmap e91: 6 total, 6 up, 6 in
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: mgrmap e35: np0005538514.djozup(active, starting, since 0.0458243s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: Manager daemon np0005538514.djozup is now available
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: removing stray HostCache host record np0005538512.localdomain.devices.0
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch
Nov 28 09:56:42 np0005538515.localdomain podman[302206]: 2025-11-28 09:56:42.945358221 +0000 UTC m=+0.078463267 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, vcs-type=git, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:56:43 np0005538515.localdomain podman[302206]: 2025-11-28 09:56:43.123746237 +0000 UTC m=+0.256851263 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., release=553)
Nov 28 09:56:43 np0005538515.localdomain ceph-mon[301134]: mgrmap e36: np0005538514.djozup(active, since 1.10238s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:43 np0005538515.localdomain ceph-mon[301134]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:43 np0005538515.localdomain sudo[302118]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:43 np0005538515.localdomain sudo[302328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:43 np0005538515.localdomain sudo[302328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:43 np0005538515.localdomain sudo[302328]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:43 np0005538515.localdomain sudo[302346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:56:43 np0005538515.localdomain sudo[302346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:44 np0005538515.localdomain sudo[302346]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:56:43] ENGINE Bus STARTING
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:56:43] ENGINE Serving on https://172.18.0.107:7150
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:56:43] ENGINE Client ('172.18.0.107', 59370) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:56:43] ENGINE Serving on http://172.18.0.107:8765
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:56:43] ENGINE Bus STARTED
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: mgrmap e37: np0005538514.djozup(active, since 2s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:44 np0005538515.localdomain sudo[302397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:44 np0005538515.localdomain sudo[302397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:44 np0005538515.localdomain sudo[302397]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:44 np0005538515.localdomain sudo[302415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:56:44 np0005538515.localdomain sudo[302415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302415]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:45 np0005538515.localdomain sudo[302452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302452]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:45 np0005538515.localdomain sudo[302470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302470]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538515.localdomain sudo[302488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302488]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:45 np0005538515.localdomain sudo[302506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302506]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538515.localdomain sudo[302524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302524]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538515.localdomain sudo[302558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302558]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain sudo[302576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:56:45 np0005538515.localdomain sudo[302576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:45 np0005538515.localdomain sudo[302576]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:56:45 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:46 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:46 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:56:46 np0005538515.localdomain ceph-mon[301134]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:46 np0005538515.localdomain sudo[302594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:56:46 np0005538515.localdomain sudo[302594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302594]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:46 np0005538515.localdomain sudo[302612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302612]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:46 np0005538515.localdomain sudo[302630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302630]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538515.localdomain sudo[302648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302648]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:46 np0005538515.localdomain sudo[302666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302666]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538515.localdomain sudo[302684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302684]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538515.localdomain sudo[302718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302718]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:56:46 np0005538515.localdomain sudo[302736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302736]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:46 np0005538515.localdomain sudo[302754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302754]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020036203 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:46 np0005538515.localdomain sudo[302772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:56:46 np0005538515.localdomain sudo[302772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:56:46 np0005538515.localdomain sudo[302790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302790]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:46 np0005538515.localdomain sudo[302808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302808]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:46 np0005538515.localdomain sudo[302826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:46 np0005538515.localdomain sudo[302826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:46 np0005538515.localdomain sudo[302826]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[302844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[302878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302878]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[302896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302896]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:47 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:47 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:56:47 np0005538515.localdomain ceph-mon[301134]: mgrmap e38: np0005538514.djozup(active, since 4s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 09:56:47 np0005538515.localdomain ceph-mon[301134]: Standby manager daemon np0005538513.dsfdlx started
Nov 28 09:56:47 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:47 np0005538515.localdomain sudo[302914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:47 np0005538515.localdomain sudo[302914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302914]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:47 np0005538515.localdomain sudo[302932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302932]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:56:47 np0005538515.localdomain sudo[302950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302950]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[302968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302968]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[302986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:47 np0005538515.localdomain sudo[302986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[302986]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[303004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[303004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[303004]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[303038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[303038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[303038]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[303056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:56:47 np0005538515.localdomain sudo[303056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[303056]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:47 np0005538515.localdomain sudo[303074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:47 np0005538515.localdomain sudo[303074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:47 np0005538515.localdomain sudo[303074]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: mgrmap e39: np0005538514.djozup(active, since 5s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs, np0005538513.dsfdlx
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:56:48 np0005538515.localdomain sudo[303092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:56:48 np0005538515.localdomain sudo[303092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:48 np0005538515.localdomain sudo[303092]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:49 np0005538515.localdomain ceph-mon[301134]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Nov 28 09:56:49 np0005538515.localdomain ceph-mon[301134]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 28 09:56:49 np0005538515.localdomain ceph-mon[301134]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 28 09:56:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 09:56:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:49 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 09:56:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:56:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:56:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:56:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:56:49 np0005538515.localdomain podman[303110]: 2025-11-28 09:56:49.993559388 +0000 UTC m=+0.092280118 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 09:56:50 np0005538515.localdomain podman[303110]: 2025-11-28 09:56:50.003247804 +0000 UTC m=+0.101968554 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:56:50 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:56:50 np0005538515.localdomain podman[303111]: 2025-11-28 09:56:50.059240453 +0000 UTC m=+0.157685445 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 09:56:50 np0005538515.localdomain podman[303112]: 2025-11-28 09:56:50.103741672 +0000 UTC m=+0.198708818 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 09:56:50 np0005538515.localdomain podman[303112]: 2025-11-28 09:56:50.113651944 +0000 UTC m=+0.208619050 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 09:56:50 np0005538515.localdomain podman[303111]: 2025-11-28 09:56:50.125005721 +0000 UTC m=+0.223450713 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 09:56:50 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:56:50 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:56:50 np0005538515.localdomain podman[303113]: 2025-11-28 09:56:50.199746673 +0000 UTC m=+0.291722077 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:56:50 np0005538515.localdomain podman[303113]: 2025-11-28 09:56:50.213656448 +0000 UTC m=+0.305631902 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:56:50 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:50 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 09:56:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:56:50.839 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:56:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:56:50.840 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:56:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:56:50.840 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:56:50 np0005538515.localdomain systemd[1]: tmp-crun.skBMxA.mount: Deactivated successfully.
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054204 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 09:56:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:56:52 np0005538515.localdomain podman[303195]: 2025-11-28 09:56:52.983029369 +0000 UTC m=+0.085631876 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:56:52 np0005538515.localdomain podman[303195]: 2025-11-28 09:56:52.994793578 +0000 UTC m=+0.097396075 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:56:53 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:56:53 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.0 (monmap changed)...
Nov 28 09:56:53 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 09:56:53 np0005538515.localdomain ceph-mon[301134]: from='client.44559 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.770448) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814770632, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 12847, "num_deletes": 257, "total_data_size": 23230672, "memory_usage": 24400384, "flush_reason": "Manual Compaction"}
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.3 (monmap changed)...
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814906125, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 18014386, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 12852, "table_properties": {"data_size": 17946891, "index_size": 36669, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 317092, "raw_average_key_size": 26, "raw_value_size": 17744895, "raw_average_value_size": 1492, "num_data_blocks": 1394, "num_entries": 11890, "num_filter_entries": 11890, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 1764323786, "file_creation_time": 1764323814, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 135715 microseconds, and 38000 cpu microseconds.
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.906185) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 18014386 bytes OK
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.906211) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.907563) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.907586) EVENT_LOG_v1 {"time_micros": 1764323814907579, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.907605) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 23143546, prev total WAL file size 23174789, number of live WAL files 2.
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.911323) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(17MB) 8(1762B)]
Nov 28 09:56:54 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814911421, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 18016148, "oldest_snapshot_seqno": -1}
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 11639 keys, 18010811 bytes, temperature: kUnknown
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323815040427, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 18010811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17943987, "index_size": 36643, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 312249, "raw_average_key_size": 26, "raw_value_size": 17745300, "raw_average_value_size": 1524, "num_data_blocks": 1394, "num_entries": 11639, "num_filter_entries": 11639, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764323814, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:55.040943) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 18010811 bytes
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:55.042777) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.5 rd, 139.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(17.2, 0.0 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 11895, records dropped: 256 output_compression: NoCompression
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:55.042812) EVENT_LOG_v1 {"time_micros": 1764323815042797, "job": 4, "event": "compaction_finished", "compaction_time_micros": 129174, "compaction_time_cpu_micros": 50950, "output_level": 6, "num_output_files": 1, "total_output_size": 18010811, "num_input_records": 11895, "num_output_records": 11639, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323815045510, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323815045569, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:56:54.911213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:56:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "mgr services"} : dispatch
Nov 28 09:56:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:55 np0005538515.localdomain podman[303218]: 2025-11-28 09:56:55.974556993 +0000 UTC m=+0.079350604 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:56:55 np0005538515.localdomain podman[303218]: 2025-11-28 09:56:55.983835997 +0000 UTC m=+0.088629588 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 09:56:55 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054722 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: from='client.54277 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: Saving service mon spec with placement label:mon
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:56:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:56:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:56:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:56:57 np0005538515.localdomain sudo[303237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:57 np0005538515.localdomain sudo[303237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:57 np0005538515.localdomain sudo[303237]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:57 np0005538515.localdomain sudo[303255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:57 np0005538515.localdomain sudo[303255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 2025-11-28 09:56:58.180638106 +0000 UTC m=+0.082976385 container create 469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_ritchie, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 09:56:58 np0005538515.localdomain systemd[1]: Started libpod-conmon-469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f.scope.
Nov 28 09:56:58 np0005538515.localdomain systemd[1]: tmp-crun.9OVs9M.mount: Deactivated successfully.
Nov 28 09:56:58 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 2025-11-28 09:56:58.141395448 +0000 UTC m=+0.043733707 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 2025-11-28 09:56:58.25052517 +0000 UTC m=+0.152863399 container init 469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_ritchie, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main)
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 2025-11-28 09:56:58.260960469 +0000 UTC m=+0.163298758 container start 469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_ritchie, version=7, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64)
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 2025-11-28 09:56:58.261266968 +0000 UTC m=+0.163605207 container attach 469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_ritchie, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:56:58 np0005538515.localdomain systemd[1]: libpod-469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f.scope: Deactivated successfully.
Nov 28 09:56:58 np0005538515.localdomain serene_ritchie[303305]: 167 167
Nov 28 09:56:58 np0005538515.localdomain podman[303290]: 2025-11-28 09:56:58.26625399 +0000 UTC m=+0.168592229 container died 469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_ritchie, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, io.buildah.version=1.33.12, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Nov 28 09:56:58 np0005538515.localdomain podman[303310]: 2025-11-28 09:56:58.367888483 +0000 UTC m=+0.087134401 container remove 469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_ritchie, GIT_BRANCH=main, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Nov 28 09:56:58 np0005538515.localdomain systemd[1]: libpod-conmon-469c94809154f654ce97900746bdb27238d67a1707e2a59d99c6dbfd972af64f.scope: Deactivated successfully.
Nov 28 09:56:58 np0005538515.localdomain sudo[303255]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='client.44565 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 09:56:58 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:56:58 np0005538515.localdomain sudo[303326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:58 np0005538515.localdomain sudo[303326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:58 np0005538515.localdomain sudo[303326]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:58 np0005538515.localdomain sudo[303344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:58 np0005538515.localdomain sudo[303344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:56:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:56:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:56:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:56:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:56:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19187 "" "Go-http-client/1.1"
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 2025-11-28 09:56:59.145106912 +0000 UTC m=+0.074068292 container create 318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_elion, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=rhceph-container)
Nov 28 09:56:59 np0005538515.localdomain systemd[1]: Started libpod-conmon-318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa.scope.
Nov 28 09:56:59 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-3a7566c280c0a72a6b2e5d072bb358d2cae274ead210aed0a72f210cb175d1ef-merged.mount: Deactivated successfully.
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 2025-11-28 09:56:59.11392416 +0000 UTC m=+0.042885540 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:56:59 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 2025-11-28 09:56:59.231323264 +0000 UTC m=+0.160284634 container init 318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_elion, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=)
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 2025-11-28 09:56:59.24230493 +0000 UTC m=+0.171266300 container start 318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_elion, ceph=True, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, architecture=x86_64, release=553, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 2025-11-28 09:56:59.242753993 +0000 UTC m=+0.171715363 container attach 318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_elion, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:56:59 np0005538515.localdomain silly_elion[303394]: 167 167
Nov 28 09:56:59 np0005538515.localdomain systemd[1]: libpod-318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa.scope: Deactivated successfully.
Nov 28 09:56:59 np0005538515.localdomain podman[303379]: 2025-11-28 09:56:59.24627421 +0000 UTC m=+0.175235610 container died 318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_elion, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, release=553, RELEASE=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container)
Nov 28 09:56:59 np0005538515.localdomain podman[303399]: 2025-11-28 09:56:59.324015474 +0000 UTC m=+0.070251636 container remove 318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_elion, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.buildah.version=1.33.12, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:56:59 np0005538515.localdomain systemd[1]: libpod-conmon-318493ff4cf94b48366183f04405bf046344f3366211f2290ea49e9953e8f9aa.scope: Deactivated successfully.
Nov 28 09:56:59 np0005538515.localdomain sudo[303344]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:59 np0005538515.localdomain ceph-mon[301134]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:56:59 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.1 (monmap changed)...
Nov 28 09:56:59 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 09:56:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:56:59 np0005538515.localdomain sudo[303423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:56:59 np0005538515.localdomain sudo[303423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:56:59 np0005538515.localdomain sudo[303423]: pam_unix(sudo:session): session closed for user root
Nov 28 09:56:59 np0005538515.localdomain sudo[303441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:56:59 np0005538515.localdomain sudo[303441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 
Nov 28 09:57:00 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-360fbb81d2a782204a10bfef9b27021e137add83d6bf94599a0f17d16f2375ef-merged.mount: Deactivated successfully.
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 2025-11-28 09:57:00.199764472 +0000 UTC m=+0.074567518 container create b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_curie, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:57:00 np0005538515.localdomain systemd[1]: Started libpod-conmon-b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383.scope.
Nov 28 09:57:00 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 2025-11-28 09:57:00.26556436 +0000 UTC m=+0.140367406 container init b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_curie, io.openshift.tags=rhceph ceph, release=553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 2025-11-28 09:57:00.169418535 +0000 UTC m=+0.044221631 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 2025-11-28 09:57:00.274631037 +0000 UTC m=+0.149434093 container start b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_curie, build-date=2025-09-24T08:57:55, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, release=553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git)
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 2025-11-28 09:57:00.275905516 +0000 UTC m=+0.150708532 container attach b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_curie, release=553, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12)
Nov 28 09:57:00 np0005538515.localdomain wizardly_curie[303492]: 167 167
Nov 28 09:57:00 np0005538515.localdomain systemd[1]: libpod-b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383.scope: Deactivated successfully.
Nov 28 09:57:00 np0005538515.localdomain podman[303476]: 2025-11-28 09:57:00.279385462 +0000 UTC m=+0.154188538 container died b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_curie, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55)
Nov 28 09:57:00 np0005538515.localdomain podman[303497]: 2025-11-28 09:57:00.365594224 +0000 UTC m=+0.078544179 container remove b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_curie, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=553, version=7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:57:00 np0005538515.localdomain systemd[1]: libpod-conmon-b9786a901ee1086bf612daca4622bcf62e049248dff8d43507d790d481eb6383.scope: Deactivated successfully.
Nov 28 09:57:00 np0005538515.localdomain sudo[303441]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538515.localdomain ceph-mon[301134]: Reconfiguring osd.4 (monmap changed)...
Nov 28 09:57:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 09:57:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:00 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 09:57:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:00 np0005538515.localdomain sudo[303520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:57:00 np0005538515.localdomain sudo[303520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:00 np0005538515.localdomain sudo[303520]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:00 np0005538515.localdomain sudo[303538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:57:00 np0005538515.localdomain sudo[303538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:01 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0687bdb07c7ae93399a19b7e5f84f8193c66beccbcc7234c7ab80fcf4a24fcb4-merged.mount: Deactivated successfully.
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 2025-11-28 09:57:01.237797483 +0000 UTC m=+0.079182548 container create cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, io.openshift.expose-services=, ceph=True, io.buildah.version=1.33.12, version=7, release=553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:57:01 np0005538515.localdomain systemd[1]: Started libpod-conmon-cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b.scope.
Nov 28 09:57:01 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 2025-11-28 09:57:01.203899919 +0000 UTC m=+0.045285014 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 2025-11-28 09:57:01.311220735 +0000 UTC m=+0.152605790 container init cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, vcs-type=git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 2025-11-28 09:57:01.319459266 +0000 UTC m=+0.160844321 container start cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, release=553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 2025-11-28 09:57:01.319736506 +0000 UTC m=+0.161121561 container attach cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 09:57:01 np0005538515.localdomain magical_gould[303587]: 167 167
Nov 28 09:57:01 np0005538515.localdomain systemd[1]: libpod-cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b.scope: Deactivated successfully.
Nov 28 09:57:01 np0005538515.localdomain podman[303572]: 2025-11-28 09:57:01.322653855 +0000 UTC m=+0.164038930 container died cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 09:57:01 np0005538515.localdomain podman[303592]: 2025-11-28 09:57:01.413455637 +0000 UTC m=+0.082163710 container remove cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, release=553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7)
Nov 28 09:57:01 np0005538515.localdomain systemd[1]: libpod-conmon-cf7c048209a7a2890d9109dbdeb2dd4f41966258875814b3b86f78caed3f384b.scope: Deactivated successfully.
Nov 28 09:57:01 np0005538515.localdomain sudo[303538]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:01 np0005538515.localdomain sudo[303608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:01 np0005538515.localdomain sudo[303608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:01 np0005538515.localdomain sudo[303608]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:01 np0005538515.localdomain sudo[303626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:57:01 np0005538515.localdomain sudo[303626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 2025-11-28 09:57:02.141427742 +0000 UTC m=+0.074508096 container create 5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_banach, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph)
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: Started libpod-conmon-5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc.scope.
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-95062f8b9d46047cfa7542ca93ec336b93848a418b3f12dfdef1ca065f86b3bf-merged.mount: Deactivated successfully.
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 2025-11-28 09:57:02.201742783 +0000 UTC m=+0.134823137 container init 5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_banach, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, distribution-scope=public, io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph)
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 2025-11-28 09:57:02.111462177 +0000 UTC m=+0.044542601 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: tmp-crun.ieDyM4.mount: Deactivated successfully.
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 2025-11-28 09:57:02.214933686 +0000 UTC m=+0.148014040 container start 5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_banach, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 2025-11-28 09:57:02.215466742 +0000 UTC m=+0.148547136 container attach 5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_banach, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 09:57:02 np0005538515.localdomain lucid_banach[303677]: 167 167
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: libpod-5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc.scope: Deactivated successfully.
Nov 28 09:57:02 np0005538515.localdomain podman[303662]: 2025-11-28 09:57:02.218818705 +0000 UTC m=+0.151899079 container died 5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_banach, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64)
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: tmp-crun.nPY5ev.mount: Deactivated successfully.
Nov 28 09:57:02 np0005538515.localdomain podman[303682]: 2025-11-28 09:57:02.318394185 +0000 UTC m=+0.091397722 container remove 5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_banach, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, version=7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 09:57:02 np0005538515.localdomain systemd[1]: libpod-conmon-5e3878dd50e525dff491b0b4fbeb79c7c401669727527736dc5289bf5a881ebc.scope: Deactivated successfully.
Nov 28 09:57:02 np0005538515.localdomain sudo[303626]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:02 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 09:57:02 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 09:57:03 np0005538515.localdomain sudo[303699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:57:03 np0005538515.localdomain sudo[303699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:57:03 np0005538515.localdomain sudo[303699]: pam_unix(sudo:session): session closed for user root
Nov 28 09:57:03 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-13f3cbae1de9f99ee9b8d0449455f9a474d8c3eedd97ff6a9b979ff9e18faeca-merged.mount: Deactivated successfully.
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:57:03 np0005538515.localdomain ceph-mon[301134]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 09:57:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:05 np0005538515.localdomain ceph-mon[301134]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:05 np0005538515.localdomain ceph-mon[301134]: mgrmap e40: np0005538514.djozup(active, since 23s), standbys: np0005538515.yfkzhl, np0005538513.dsfdlx
Nov 28 09:57:05 np0005538515.localdomain sshd[299439]: Received disconnect from 192.168.122.11 port 58352:11: disconnected by user
Nov 28 09:57:05 np0005538515.localdomain sshd[299439]: Disconnected from user tripleo-admin 192.168.122.11 port 58352
Nov 28 09:57:05 np0005538515.localdomain sshd[299419]: pam_unix(sshd:session): session closed for user tripleo-admin
Nov 28 09:57:05 np0005538515.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Nov 28 09:57:05 np0005538515.localdomain systemd[1]: session-69.scope: Consumed 1.713s CPU time.
Nov 28 09:57:05 np0005538515.localdomain systemd-logind[763]: Session 69 logged out. Waiting for processes to exit.
Nov 28 09:57:05 np0005538515.localdomain systemd-logind[763]: Removed session 69.
Nov 28 09:57:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:07 np0005538515.localdomain ceph-mon[301134]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:57:08 np0005538515.localdomain ceph-mon[301134]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:11 np0005538515.localdomain ceph-mon[301134]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:57:11 np0005538515.localdomain systemd[1]: tmp-crun.yr4esi.mount: Deactivated successfully.
Nov 28 09:57:11 np0005538515.localdomain podman[303717]: 2025-11-28 09:57:11.981036584 +0000 UTC m=+0.082198760 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 28 09:57:11 np0005538515.localdomain podman[303717]: 2025-11-28 09:57:11.996574919 +0000 UTC m=+0.097737135 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 28 09:57:12 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:57:12 np0005538515.localdomain ceph-mon[301134]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 09:57:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3967776959' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:57:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 09:57:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3967776959' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:57:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3967776959' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:57:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3967776959' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 1003...
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Activating special unit Exit the Session...
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped target Main User Target.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped target Basic System.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped target Paths.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped target Sockets.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped target Timers.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Closed D-Bus User Message Bus Socket.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Stopped Create User's Volatile Files and Directories.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Removed slice User Application Slice.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Reached target Shutdown.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Finished Exit the Session.
Nov 28 09:57:15 np0005538515.localdomain systemd[299423]: Reached target Exit the Session.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 1003.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Nov 28 09:57:15 np0005538515.localdomain systemd[1]: user-1003.slice: Consumed 2.299s CPU time.
Nov 28 09:57:15 np0005538515.localdomain ceph-mon[301134]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:17 np0005538515.localdomain ceph-mon[301134]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:18 np0005538515.localdomain ceph-mon[301134]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:57:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:57:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:57:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:57:20 np0005538515.localdomain systemd[297255]: Starting Mark boot as successful...
Nov 28 09:57:20 np0005538515.localdomain systemd[1]: tmp-crun.Ppi2e9.mount: Deactivated successfully.
Nov 28 09:57:21 np0005538515.localdomain systemd[297255]: Finished Mark boot as successful.
Nov 28 09:57:21 np0005538515.localdomain podman[303738]: 2025-11-28 09:57:21.006280934 +0000 UTC m=+0.108493274 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Nov 28 09:57:21 np0005538515.localdomain podman[303738]: 2025-11-28 09:57:21.042892851 +0000 UTC m=+0.145105191 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:57:21 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:57:21 np0005538515.localdomain podman[303739]: 2025-11-28 09:57:21.04875251 +0000 UTC m=+0.150636370 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 09:57:21 np0005538515.localdomain podman[303741]: 2025-11-28 09:57:21.154271692 +0000 UTC m=+0.246434865 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 09:57:21 np0005538515.localdomain podman[303741]: 2025-11-28 09:57:21.165491085 +0000 UTC m=+0.257654288 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:57:21 np0005538515.localdomain podman[303740]: 2025-11-28 09:57:21.124908965 +0000 UTC m=+0.219349267 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 09:57:21 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:57:21 np0005538515.localdomain podman[303740]: 2025-11-28 09:57:21.207333862 +0000 UTC m=+0.301774184 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 09:57:21 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:57:21 np0005538515.localdomain podman[303739]: 2025-11-28 09:57:21.228959262 +0000 UTC m=+0.330843172 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:57:21 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:57:21 np0005538515.localdomain ceph-mon[301134]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:23 np0005538515.localdomain ceph-mon[301134]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:57:23 np0005538515.localdomain podman[303821]: 2025-11-28 09:57:23.953737073 +0000 UTC m=+0.060334483 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 09:57:23 np0005538515.localdomain podman[303821]: 2025-11-28 09:57:23.958431306 +0000 UTC m=+0.065028686 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:57:23 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:57:25 np0005538515.localdomain ceph-mon[301134]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:57:26 np0005538515.localdomain podman[303846]: 2025-11-28 09:57:26.972910211 +0000 UTC m=+0.081073626 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:57:26 np0005538515.localdomain podman[303846]: 2025-11-28 09:57:26.986376473 +0000 UTC m=+0.094539888 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 09:57:27 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:57:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:57:27 np0005538515.localdomain ceph-mon[301134]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:28 np0005538515.localdomain ceph-mon[301134]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:57:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:57:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:57:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:57:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:57:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19185 "" "Go-http-client/1.1"
Nov 28 09:57:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:30.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:31 np0005538515.localdomain ceph-mon[301134]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:32.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:32 np0005538515.localdomain ceph-mon[301134]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:33.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:33.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:57:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:34.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:35.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:35.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:57:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:35.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:57:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:35.357 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:57:35 np0005538515.localdomain ceph-mon[301134]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/1459346702' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 28 09:57:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:37.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:37.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:37 np0005538515.localdomain ceph-mon[301134]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3192274012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:38 np0005538515.localdomain ceph-mon[301134]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.262 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.263 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.263 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.263 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.263 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:57:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:57:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1903764416' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.701 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:57:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/510121500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1903764416' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.898 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.899 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12044MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.899 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.899 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.983 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:57:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:39.984 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:57:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:40.020 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:57:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:57:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/279813176' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:40.469 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:57:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:40.475 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:57:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:40.492 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:57:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:40.495 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:57:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:40.495 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:57:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3305388726' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:40 np0005538515.localdomain ceph-mon[301134]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/279813176' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:41 np0005538515.localdomain sshd[303910]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:57:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:41.492 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:57:41.492 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:57:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/436717469' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:57:42 np0005538515.localdomain sshd[303911]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:57:42 np0005538515.localdomain ceph-mon[301134]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:42 np0005538515.localdomain sshd[303911]: error: kex_exchange_identification: read: Connection reset by peer
Nov 28 09:57:42 np0005538515.localdomain sshd[303911]: Connection reset by 45.140.17.97 port 10651
Nov 28 09:57:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:57:42 np0005538515.localdomain podman[303912]: 2025-11-28 09:57:42.975390102 +0000 UTC m=+0.079748905 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, release=1755695350, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Nov 28 09:57:42 np0005538515.localdomain podman[303912]: 2025-11-28 09:57:42.98776404 +0000 UTC m=+0.092122813 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Nov 28 09:57:43 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:57:43 np0005538515.localdomain ceph-mon[301134]: from='client.44598 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:57:45 np0005538515.localdomain ceph-mon[301134]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:47 np0005538515.localdomain ceph-mon[301134]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:48 np0005538515.localdomain ceph-mon[301134]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/3756897191' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Nov 28 09:57:50 np0005538515.localdomain ceph-mon[301134]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:57:50.841 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:57:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:57:50.841 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:57:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:57:50.842 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:57:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:57:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:57:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:57:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:57:51 np0005538515.localdomain podman[303932]: 2025-11-28 09:57:51.970941222 +0000 UTC m=+0.079778315 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:57:51 np0005538515.localdomain podman[303932]: 2025-11-28 09:57:51.982312716 +0000 UTC m=+0.091149819 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 09:57:51 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:57:52 np0005538515.localdomain podman[303937]: 2025-11-28 09:57:51.985817913 +0000 UTC m=+0.084157449 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:57:52 np0005538515.localdomain podman[303933]: 2025-11-28 09:57:52.045284232 +0000 UTC m=+0.145357969 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 09:57:52 np0005538515.localdomain podman[303937]: 2025-11-28 09:57:52.064778912 +0000 UTC m=+0.163118468 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:57:52 np0005538515.localdomain podman[303940]: 2025-11-28 09:57:52.023294877 +0000 UTC m=+0.120929461 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:57:52 np0005538515.localdomain podman[303940]: 2025-11-28 09:57:52.108444223 +0000 UTC m=+0.206078857 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:57:52 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:57:52 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:57:52 np0005538515.localdomain podman[303933]: 2025-11-28 09:57:52.12817234 +0000 UTC m=+0.228246127 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 09:57:52 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:57:53 np0005538515.localdomain ceph-mon[301134]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:57:54 np0005538515.localdomain podman[304015]: 2025-11-28 09:57:54.976487261 +0000 UTC m=+0.083581890 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:57:55 np0005538515.localdomain podman[304015]: 2025-11-28 09:57:55.01245492 +0000 UTC m=+0.119549559 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:57:55 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:57:55 np0005538515.localdomain ceph-mon[301134]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:57:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:57:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:57:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:57:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:57:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:57:57 np0005538515.localdomain ceph-mon[301134]: from='client.44607 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 09:57:57 np0005538515.localdomain ceph-mon[301134]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:57:57 np0005538515.localdomain podman[304039]: 2025-11-28 09:57:57.981280158 +0000 UTC m=+0.089885512 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 09:57:57 np0005538515.localdomain podman[304039]: 2025-11-28 09:57:57.990763434 +0000 UTC m=+0.099368788 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 09:57:58 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:57:58 np0005538515.localdomain ceph-mon[301134]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:57:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:57:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:57:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:57:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:57:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:57:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19188 "" "Go-http-client/1.1"
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 09:58:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 09:58:01 np0005538515.localdomain ceph-mon[301134]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:02 np0005538515.localdomain ceph-mon[301134]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/2455933958' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 09:58:04 np0005538515.localdomain sudo[304058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:04 np0005538515.localdomain sudo[304058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:04 np0005538515.localdomain sudo[304058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:04 np0005538515.localdomain sudo[304076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:58:04 np0005538515.localdomain sudo[304076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:04 np0005538515.localdomain sudo[304076]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:05 np0005538515.localdomain sudo[304125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:58:05 np0005538515.localdomain sudo[304125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:05 np0005538515.localdomain sudo[304125]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 e92: 6 total, 6 up, 6 in
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr handle_mgr_map Activating!
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr handle_mgr_map I am now activating
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).mds e17 all = 0
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).mds e17 all = 0
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).mds e17 all = 0
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mds metadata"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).mds e17 all = 1
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon metadata"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: balancer
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_09:58:05
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Nov 28 09:58:05 np0005538515.localdomain sshd[302096]: pam_unix(sshd:session): session closed for user ceph-admin
Nov 28 09:58:05 np0005538515.localdomain systemd[1]: session-71.scope: Deactivated successfully.
Nov 28 09:58:05 np0005538515.localdomain systemd[1]: session-71.scope: Consumed 10.660s CPU time.
Nov 28 09:58:05 np0005538515.localdomain systemd-logind[763]: Session 71 logged out. Waiting for processes to exit.
Nov 28 09:58:05 np0005538515.localdomain systemd-logind[763]: Removed session 71.
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: cephadm
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: crash
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: devicehealth
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: iostat
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [devicehealth INFO root] Starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: nfs
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: orchestrator
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: pg_autoscaler
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: progress
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Loading...
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7fcc9bcfdeb0>, <progress.module.GhostEvent object at 0x7fcc9bd08130>, <progress.module.GhostEvent object at 0x7fcc9bd08160>, <progress.module.GhostEvent object at 0x7fcc9bd08190>, <progress.module.GhostEvent object at 0x7fcc9bd081c0>, <progress.module.GhostEvent object at 0x7fcc9bd081f0>, <progress.module.GhostEvent object at 0x7fcc9bd08220>, <progress.module.GhostEvent object at 0x7fcc9bd08250>, <progress.module.GhostEvent object at 0x7fcc9bd08280>, <progress.module.GhostEvent object at 0x7fcc9bd082b0>, <progress.module.GhostEvent object at 0x7fcc9bd082e0>, <progress.module.GhostEvent object at 0x7fcc9bd08310>, <progress.module.GhostEvent object at 0x7fcc9bd08340>, <progress.module.GhostEvent object at 0x7fcc9bd08370>, <progress.module.GhostEvent object at 0x7fcc9bd083a0>, <progress.module.GhostEvent object at 0x7fcc9bd083d0>, <progress.module.GhostEvent object at 0x7fcc9bd08400>, <progress.module.GhostEvent object at 0x7fcc9bd08430>, <progress.module.GhostEvent object at 0x7fcc9bd08460>, <progress.module.GhostEvent object at 0x7fcc9bd08490>, <progress.module.GhostEvent object at 0x7fcc9bd084c0>, <progress.module.GhostEvent object at 0x7fcc9bd084f0>, <progress.module.GhostEvent object at 0x7fcc9bd08520>, <progress.module.GhostEvent object at 0x7fcc9bd08550>, <progress.module.GhostEvent object at 0x7fcc9bd08580>, <progress.module.GhostEvent object at 0x7fcc9bd085b0>, <progress.module.GhostEvent object at 0x7fcc9bd085e0>, <progress.module.GhostEvent object at 0x7fcc9bd08610>, <progress.module.GhostEvent object at 0x7fcc9bd08640>, <progress.module.GhostEvent object at 0x7fcc9bd08670>, <progress.module.GhostEvent object at 0x7fcc9bd086a0>, <progress.module.GhostEvent object at 0x7fcc9bd086d0>, <progress.module.GhostEvent object at 0x7fcc9bd08700>, <progress.module.GhostEvent object at 0x7fcc9bd08730>, <progress.module.GhostEvent object at 0x7fcc9bd08760>, <progress.module.GhostEvent object at 0x7fcc9bd08790>, <progress.module.GhostEvent object at 0x7fcc9bd087c0>, <progress.module.GhostEvent object at 0x7fcc9bd087f0>, <progress.module.GhostEvent object at 0x7fcc9bd08820>, <progress.module.GhostEvent object at 0x7fcc9bd08850>, <progress.module.GhostEvent object at 0x7fcc9bd08880>, <progress.module.GhostEvent object at 0x7fcc9bd088b0>, <progress.module.GhostEvent object at 0x7fcc9bd088e0>, <progress.module.GhostEvent object at 0x7fcc9bd08910>, <progress.module.GhostEvent object at 0x7fcc9bd08940>, <progress.module.GhostEvent object at 0x7fcc9bd08970>, <progress.module.GhostEvent object at 0x7fcc9bd089a0>, <progress.module.GhostEvent object at 0x7fcc9bd089d0>, <progress.module.GhostEvent object at 0x7fcc9bd08a00>, <progress.module.GhostEvent object at 0x7fcc9bd08a30>] historic events
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Loaded OSDMap, ready.
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] recovery thread starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] starting setup
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: rbd_support
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: restful
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: status
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: telemetry
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [restful INFO root] server_addr: :: server_port: 8003
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [restful WARNING root] server not running: no certificate configured
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: mgr load Constructed class from module: volumes
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.689+0000 7fcc8cc53640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.689+0000 7fcc8cc53640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.689+0000 7fcc8cc53640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.689+0000 7fcc8cc53640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.689+0000 7fcc8cc53640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' 
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: Activating manager daemon np0005538515.yfkzhl
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: osdmap e92: 6 total, 6 up, 6 in
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mgrmap e41: np0005538515.yfkzhl(active, starting, since 0.0380796s), standbys: np0005538513.dsfdlx
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mds metadata"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mon metadata"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: Manager daemon np0005538515.yfkzhl is now available
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.693+0000 7fcc89c4d640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.693+0000 7fcc89c4d640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.693+0000 7fcc89c4d640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.693+0000 7fcc89c4d640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T09:58:05.693+0000 7fcc89c4d640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] PerfHandler: starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: vms, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: volumes, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: images, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_task_task: backups, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TaskHandler: starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} v 0)
Nov 28 09:58:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Nov 28 09:58:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] setup complete
Nov 28 09:58:05 np0005538515.localdomain sshd[304282]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 09:58:05 np0005538515.localdomain sshd[304282]: Accepted publickey for ceph-admin from 192.168.122.108 port 42600 ssh2: RSA SHA256:zjXO5gWr7Xng+SeiWsaFLFQaayJZD5rPIAl1v5Aks+g
Nov 28 09:58:05 np0005538515.localdomain systemd-logind[763]: New session 72 of user ceph-admin.
Nov 28 09:58:05 np0005538515.localdomain systemd[1]: Started Session 72 of User ceph-admin.
Nov 28 09:58:05 np0005538515.localdomain sshd[304282]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Nov 28 09:58:06 np0005538515.localdomain sudo[304286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:06 np0005538515.localdomain sudo[304286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:06 np0005538515.localdomain sudo[304286]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:06 np0005538515.localdomain sudo[304304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 09:58:06 np0005538515.localdomain sudo[304304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:58:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch
Nov 28 09:58:06 np0005538515.localdomain ceph-mon[301134]: mgrmap e42: np0005538515.yfkzhl(active, since 1.05546s), standbys: np0005538513.dsfdlx
Nov 28 09:58:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:06 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:58:06] ENGINE Bus STARTING
Nov 28 09:58:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:58:06] ENGINE Bus STARTING
Nov 28 09:58:06 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:58:06] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:58:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:58:06] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:58:06 np0005538515.localdomain systemd[1]: tmp-crun.VgD5Vk.mount: Deactivated successfully.
Nov 28 09:58:06 np0005538515.localdomain podman[304402]: 2025-11-28 09:58:06.980779455 +0000 UTC m=+0.094733667 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:58:07] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:58:07] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:58:07] ENGINE Bus STARTED
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:58:07] ENGINE Bus STARTED
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:58:07] ENGINE Client ('172.18.0.108', 56132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:58:07] ENGINE Client ('172.18.0.108', 56132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:58:07 np0005538515.localdomain podman[304402]: 2025-11-28 09:58:07.074722519 +0000 UTC m=+0.188676701 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container)
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:58:07 np0005538515.localdomain sudo[304304]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:58:06] ENGINE Bus STARTING
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:58:06] ENGINE Serving on http://172.18.0.108:8765
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:58:07] ENGINE Serving on https://172.18.0.108:7150
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:58:07] ENGINE Bus STARTED
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: [28/Nov/2025:09:58:07] ENGINE Client ('172.18.0.108', 56132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: Cluster is now healthy
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:58:07 np0005538515.localdomain ceph-mgr[286188]: [devicehealth INFO root] Check health
Nov 28 09:58:07 np0005538515.localdomain sudo[304544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:07 np0005538515.localdomain sudo[304544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:07 np0005538515.localdomain sudo[304544]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:07 np0005538515.localdomain sudo[304562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:58:07 np0005538515.localdomain sudo[304562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:08 np0005538515.localdomain sudo[304562]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:08 np0005538515.localdomain sudo[304612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:58:08 np0005538515.localdomain sudo[304612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:08 np0005538515.localdomain sudo[304612]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:08 np0005538515.localdomain sudo[304630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Nov 28 09:58:08 np0005538515.localdomain sudo[304630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 09:58:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain sudo[304630]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain sudo[304666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:58:09 np0005538515.localdomain sudo[304666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304666]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain sudo[304684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:58:09 np0005538515.localdomain sudo[304684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304684]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain sudo[304702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538515.localdomain sudo[304702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304702]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:09 np0005538515.localdomain sudo[304720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:09 np0005538515.localdomain sudo[304720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304720]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain sudo[304738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538515.localdomain sudo[304738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304738]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: mgrmap e43: np0005538515.yfkzhl(active, since 3s), standbys: np0005538513.dsfdlx
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain sudo[304772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538515.localdomain sudo[304772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304772]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain sudo[304790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new
Nov 28 09:58:09 np0005538515.localdomain sudo[304790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:09 np0005538515.localdomain sudo[304790]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain sudo[304808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Nov 28 09:58:09 np0005538515.localdomain sudo[304808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304808]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:10 np0005538515.localdomain sudo[304826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:10 np0005538515.localdomain sudo[304826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304826]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[304844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:10 np0005538515.localdomain sudo[304844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304844]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[304862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538515.localdomain sudo[304862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304862]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[304880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:10 np0005538515.localdomain sudo[304880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304880]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[304898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538515.localdomain sudo[304898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304898]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[304932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538515.localdomain sudo[304932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304932]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[304950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new
Nov 28 09:58:10 np0005538515.localdomain sudo[304950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304950]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:10 np0005538515.localdomain sudo[304968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:10 np0005538515.localdomain sudo[304968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304968]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: mgr.server handle_open ignoring open from mgr.np0005538514.djozup 172.18.0.107:0/3915016929; not ready for session (expect reconnect)
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:10 np0005538515.localdomain sudo[304986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Nov 28 09:58:10 np0005538515.localdomain sudo[304986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[304986]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[305004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph
Nov 28 09:58:10 np0005538515.localdomain sudo[305004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[305004]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:10 np0005538515.localdomain sudo[305022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:10 np0005538515.localdomain sudo[305022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:10 np0005538515.localdomain sudo[305022]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain sudo[305040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:11 np0005538515.localdomain sudo[305040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305040]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain sudo[305058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305058]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain sudo[305092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305092]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: Standby manager daemon np0005538514.djozup started
Nov 28 09:58:11 np0005538515.localdomain sudo[305110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} v 0)
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:58:11 np0005538515.localdomain sudo[305110]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain sudo[305128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain sudo[305128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305128]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:11 np0005538515.localdomain sudo[305146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:11 np0005538515.localdomain sudo[305146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305146]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain sudo[305164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config
Nov 28 09:58:11 np0005538515.localdomain sudo[305164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305164]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:11 np0005538515.localdomain sudo[305182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305182]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain sudo[305200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 09:58:11 np0005538515.localdomain sudo[305200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305200]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:11 np0005538515.localdomain sudo[305218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305218]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain sudo[305252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305252]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 09:58:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 09:58:11 np0005538515.localdomain sudo[305270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new
Nov 28 09:58:11 np0005538515.localdomain sudo[305270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:11 np0005538515.localdomain sudo[305270]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538515.localdomain sudo[305288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-2c5417c9-00eb-57d5-a565-ddecbc7995c1/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring.new /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538515.localdomain sudo[305288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538515.localdomain sudo[305288]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 8e2b0e75-f61a-4829-9dff-75d4bcf68e11 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:58:12 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 8e2b0e75-f61a-4829-9dff-75d4bcf68e11 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:58:12 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 8e2b0e75-f61a-4829-9dff-75d4bcf68e11 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mgrmap e44: np0005538515.yfkzhl(active, since 5s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:12 np0005538515.localdomain sudo[305306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:58:12 np0005538515.localdomain sudo[305306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538515.localdomain sudo[305306]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev b8f44057-e882-4490-b8d1-70c34e1686a0 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:58:12 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev b8f44057-e882-4490-b8d1-70c34e1686a0 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:58:12 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event b8f44057-e882-4490-b8d1-70c34e1686a0 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:58:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:12 np0005538515.localdomain sudo[305324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:58:12 np0005538515.localdomain sudo[305324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:58:12 np0005538515.localdomain sudo[305324]: pam_unix(sudo:session): session closed for user root
Nov 28 09:58:13 np0005538515.localdomain ceph-mon[301134]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 09:58:13 np0005538515.localdomain ceph-mon[301134]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:58:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:58:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:58:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:58:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:58:13 np0005538515.localdomain podman[305342]: 2025-11-28 09:58:13.639284605 +0000 UTC m=+0.085591660 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6)
Nov 28 09:58:13 np0005538515.localdomain podman[305342]: 2025-11-28 09:58:13.68009185 +0000 UTC m=+0.126398845 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter)
Nov 28 09:58:13 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:58:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:58:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:58:15 np0005538515.localdomain ceph-mon[301134]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Nov 28 09:58:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:58:15 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:58:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:58:16 np0005538515.localdomain ceph-mon[301134]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Nov 28 09:58:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:58:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:58:18 np0005538515.localdomain ceph-mon[301134]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 09:58:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:20 np0005538515.localdomain ceph-mon[301134]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:22 np0005538515.localdomain ceph-mon[301134]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:58:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:58:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:58:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:58:23 np0005538515.localdomain podman[305362]: 2025-11-28 09:58:23.001610302 +0000 UTC m=+0.101862913 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 09:58:23 np0005538515.localdomain podman[305364]: 2025-11-28 09:58:23.052296536 +0000 UTC m=+0.148625499 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 09:58:23 np0005538515.localdomain podman[305364]: 2025-11-28 09:58:23.086438019 +0000 UTC m=+0.182767002 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:58:23 np0005538515.localdomain podman[305363]: 2025-11-28 09:58:23.099558016 +0000 UTC m=+0.195910989 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:58:23 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:58:23 np0005538515.localdomain podman[305362]: 2025-11-28 09:58:23.119312894 +0000 UTC m=+0.219565455 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:58:23 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:58:23 np0005538515.localdomain podman[305363]: 2025-11-28 09:58:23.166683327 +0000 UTC m=+0.263036320 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 09:58:23 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:58:23 np0005538515.localdomain podman[305365]: 2025-11-28 09:58:23.228092086 +0000 UTC m=+0.319557071 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:58:23 np0005538515.localdomain podman[305365]: 2025-11-28 09:58:23.236940933 +0000 UTC m=+0.328405958 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:58:23 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:58:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:24 np0005538515.localdomain ceph-mon[301134]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Nov 28 09:58:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:58:25 np0005538515.localdomain podman[305448]: 2025-11-28 09:58:25.981145405 +0000 UTC m=+0.090381248 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:58:25 np0005538515.localdomain podman[305448]: 2025-11-28 09:58:25.992481158 +0000 UTC m=+0.101717061 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:58:26 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:58:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:26 np0005538515.localdomain ceph-mon[301134]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:58:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:58:28 np0005538515.localdomain ceph-mon[301134]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:58:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:58:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:58:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:58:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:58:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:58:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19179 "" "Go-http-client/1.1"
Nov 28 09:58:29 np0005538515.localdomain podman[305472]: 2025-11-28 09:58:29.037817741 +0000 UTC m=+0.144237186 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 09:58:29 np0005538515.localdomain podman[305472]: 2025-11-28 09:58:29.053553076 +0000 UTC m=+0.159972521 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 09:58:29 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:58:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:30 np0005538515.localdomain ceph-mon[301134]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:32.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:32 np0005538515.localdomain ceph-mon[301134]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:33.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:58:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5064 writes, 22K keys, 5064 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5064 writes, 762 syncs, 6.65 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 214 writes, 503 keys, 214 commit groups, 1.0 writes per commit group, ingest: 0.48 MB, 0.00 MB/s
                                                          Interval WAL: 214 writes, 101 syncs, 2.12 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:58:34 np0005538515.localdomain ceph-mon[301134]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:35.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:35.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:58:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:58:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:36.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:36.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:58:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:36.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:58:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:36.443 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:58:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:36 np0005538515.localdomain ceph-mon[301134]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:37.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:37.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 09:58:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.2 total, 600.0 interval
                                                          Cumulative writes: 5888 writes, 25K keys, 5888 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5888 writes, 780 syncs, 7.55 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 34 writes, 127 keys, 34 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
                                                          Interval WAL: 34 writes, 17 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 09:58:38 np0005538515.localdomain ceph-mon[301134]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.257 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.258 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.259 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:58:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:58:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2543969539' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.712 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.909 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.910 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11996MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.911 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.911 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:58:40 np0005538515.localdomain ceph-mon[301134]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4221705048' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2543969539' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.975 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.975 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:58:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:40.990 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:58:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:58:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1985352112' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:41.408 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:58:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:41.414 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:58:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:41.428 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:58:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:41.431 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:58:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:41.431 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:58:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1527944200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1985352112' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:58:42.433 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:58:42 np0005538515.localdomain ceph-mon[301134]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/355013671' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:58:43 np0005538515.localdomain systemd[1]: tmp-crun.HtS9eP.mount: Deactivated successfully.
Nov 28 09:58:43 np0005538515.localdomain podman[305535]: 2025-11-28 09:58:43.961813177 +0000 UTC m=+0.070301648 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=)
Nov 28 09:58:43 np0005538515.localdomain podman[305535]: 2025-11-28 09:58:43.97843053 +0000 UTC m=+0.086918991 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 09:58:43 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:58:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/716858050' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:58:45 np0005538515.localdomain ceph-mon[301134]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:47 np0005538515.localdomain ceph-mon[301134]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:49 np0005538515.localdomain ceph-mon[301134]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:58:50.843 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:58:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:58:50.844 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:58:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:58:50.844 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:58:51 np0005538515.localdomain ceph-mon[301134]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:53 np0005538515.localdomain ceph-mon[301134]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:58:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:58:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:58:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:58:53 np0005538515.localdomain podman[305556]: 2025-11-28 09:58:53.982415863 +0000 UTC m=+0.084747766 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 09:58:54 np0005538515.localdomain podman[305555]: 2025-11-28 09:58:54.08903933 +0000 UTC m=+0.197437056 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 28 09:58:54 np0005538515.localdomain podman[305563]: 2025-11-28 09:58:54.053901097 +0000 UTC m=+0.148390932 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 09:58:54 np0005538515.localdomain podman[305555]: 2025-11-28 09:58:54.127635588 +0000 UTC m=+0.236033324 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 28 09:58:54 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:58:54 np0005538515.localdomain podman[305557]: 2025-11-28 09:58:54.157922405 +0000 UTC m=+0.254710080 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 09:58:54 np0005538515.localdomain podman[305557]: 2025-11-28 09:58:54.167557646 +0000 UTC m=+0.264345381 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 09:58:54 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:58:54 np0005538515.localdomain podman[305563]: 2025-11-28 09:58:54.18388184 +0000 UTC m=+0.278371655 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 09:58:54 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:58:54 np0005538515.localdomain podman[305556]: 2025-11-28 09:58:54.208766343 +0000 UTC m=+0.311098316 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 09:58:54 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:58:55 np0005538515.localdomain ceph-mon[301134]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:58:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:58:56 np0005538515.localdomain podman[305635]: 2025-11-28 09:58:56.980305141 +0000 UTC m=+0.085671653 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:58:56 np0005538515.localdomain podman[305635]: 2025-11-28 09:58:56.994389807 +0000 UTC m=+0.099756299 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 09:58:57 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:58:57 np0005538515.localdomain ceph-mon[301134]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:58:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:58:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 09:58:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:58:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:58:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:58:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:58:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:58:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19184 "" "Go-http-client/1.1"
Nov 28 09:58:59 np0005538515.localdomain ceph-mon[301134]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:58:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:58:59 np0005538515.localdomain podman[305658]: 2025-11-28 09:58:59.983972404 +0000 UTC m=+0.092833751 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 09:59:00 np0005538515.localdomain podman[305658]: 2025-11-28 09:59:00.022613682 +0000 UTC m=+0.131474989 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 09:59:00 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:59:01 np0005538515.localdomain ceph-mon[301134]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:03 np0005538515.localdomain ceph-mon[301134]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:05 np0005538515.localdomain ceph-mon[301134]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_09:59:05
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_metadata', 'images', 'volumes', 'backups', '.mgr', 'vms', 'manila_data']
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 09:59:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 09:59:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:07 np0005538515.localdomain ceph-mon[301134]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:09 np0005538515.localdomain ceph-mon[301134]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:11 np0005538515.localdomain ceph-mon[301134]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:12 np0005538515.localdomain sudo[305678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 09:59:12 np0005538515.localdomain sudo[305678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:59:12 np0005538515.localdomain sudo[305678]: pam_unix(sudo:session): session closed for user root
Nov 28 09:59:12 np0005538515.localdomain sudo[305696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 09:59:12 np0005538515.localdomain sudo[305696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 09:59:13 np0005538515.localdomain sudo[305696]: pam_unix(sudo:session): session closed for user root
Nov 28 09:59:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 09:59:13 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev e66e0e5b-c775-4708-a26b-b94b8c033107 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:59:13 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev e66e0e5b-c775-4708-a26b-b94b8c033107 (Updating node-proxy deployment (+3 -> 3))
Nov 28 09:59:13 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event e66e0e5b-c775-4708-a26b-b94b8c033107 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 09:59:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:59:13 np0005538515.localdomain sudo[305746]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 09:59:13 np0005538515.localdomain sudo[305746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 09:59:13 np0005538515.localdomain sudo[305746]: pam_unix(sudo:session): session closed for user root
Nov 28 09:59:14 np0005538515.localdomain ceph-mon[301134]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 09:59:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 09:59:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:59:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 09:59:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:59:14 np0005538515.localdomain systemd[1]: tmp-crun.yCiBRz.mount: Deactivated successfully.
Nov 28 09:59:14 np0005538515.localdomain podman[305764]: 2025-11-28 09:59:14.991793917 +0000 UTC m=+0.095690267 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 09:59:15 np0005538515.localdomain podman[305764]: 2025-11-28 09:59:15.032542739 +0000 UTC m=+0.136439089 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Nov 28 09:59:15 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:59:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:15 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 09:59:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 09:59:16 np0005538515.localdomain ceph-mon[301134]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 09:59:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:18 np0005538515.localdomain ceph-mon[301134]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:20 np0005538515.localdomain ceph-mon[301134]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.809484) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962809575, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2566, "num_deletes": 253, "total_data_size": 5852505, "memory_usage": 6144320, "flush_reason": "Manual Compaction"}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962838397, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3664999, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12857, "largest_seqno": 15418, "table_properties": {"data_size": 3655199, "index_size": 6049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22841, "raw_average_key_size": 21, "raw_value_size": 3634616, "raw_average_value_size": 3451, "num_data_blocks": 261, "num_entries": 1053, "num_filter_entries": 1053, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323814, "oldest_key_time": 1764323814, "file_creation_time": 1764323962, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 28969 microseconds, and 8142 cpu microseconds.
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.838459) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3664999 bytes OK
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.838485) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.840630) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.840655) EVENT_LOG_v1 {"time_micros": 1764323962840648, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.840678) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5840674, prev total WAL file size 5840674, number of live WAL files 2.
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.842004) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3579KB)], [15(17MB)]
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962842058, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 21675810, "oldest_snapshot_seqno": -1}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 12155 keys, 18737846 bytes, temperature: kUnknown
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962949422, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 18737846, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18667031, "index_size": 39354, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30405, "raw_key_size": 324291, "raw_average_key_size": 26, "raw_value_size": 18458709, "raw_average_value_size": 1518, "num_data_blocks": 1508, "num_entries": 12155, "num_filter_entries": 12155, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764323962, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.949749) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 18737846 bytes
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.951721) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 201.8 rd, 174.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 17.2 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(11.0) write-amplify(5.1) OK, records in: 12692, records dropped: 537 output_compression: NoCompression
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.951758) EVENT_LOG_v1 {"time_micros": 1764323962951742, "job": 6, "event": "compaction_finished", "compaction_time_micros": 107425, "compaction_time_cpu_micros": 51504, "output_level": 6, "num_output_files": 1, "total_output_size": 18737846, "num_input_records": 12692, "num_output_records": 12155, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962952653, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962956350, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.841868) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.956513) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.956521) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.956524) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.956527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:22 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-09:59:22.956530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 09:59:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:24 np0005538515.localdomain ceph-mon[301134]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:59:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:59:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:59:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:59:24 np0005538515.localdomain podman[305784]: 2025-11-28 09:59:24.986798149 +0000 UTC m=+0.089076177 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 28 09:59:25 np0005538515.localdomain podman[305784]: 2025-11-28 09:59:25.023779798 +0000 UTC m=+0.126057886 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 28 09:59:25 np0005538515.localdomain systemd[1]: tmp-crun.MN1w9F.mount: Deactivated successfully.
Nov 28 09:59:25 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:59:25 np0005538515.localdomain podman[305795]: 2025-11-28 09:59:25.040470833 +0000 UTC m=+0.133998866 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:59:25 np0005538515.localdomain podman[305795]: 2025-11-28 09:59:25.053377663 +0000 UTC m=+0.146905716 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:59:25 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:59:25 np0005538515.localdomain systemd[1]: tmp-crun.Srx31m.mount: Deactivated successfully.
Nov 28 09:59:25 np0005538515.localdomain podman[305786]: 2025-11-28 09:59:25.143305625 +0000 UTC m=+0.238602112 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:25 np0005538515.localdomain podman[305786]: 2025-11-28 09:59:25.17420476 +0000 UTC m=+0.269501247 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 09:59:25 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:59:25 np0005538515.localdomain podman[305785]: 2025-11-28 09:59:25.195060131 +0000 UTC m=+0.296342519 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:25 np0005538515.localdomain podman[305785]: 2025-11-28 09:59:25.2604921 +0000 UTC m=+0.361774538 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:59:25 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:59:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:26 np0005538515.localdomain ceph-mon[301134]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:59:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:59:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:59:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:59:27 np0005538515.localdomain podman[305866]: 2025-11-28 09:59:27.981021234 +0000 UTC m=+0.088649883 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:59:27 np0005538515.localdomain podman[305866]: 2025-11-28 09:59:27.994387859 +0000 UTC m=+0.102016498 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 09:59:28 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:59:28 np0005538515.localdomain ceph-mon[301134]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:28 np0005538515.localdomain podman[239012]: time="2025-11-28T09:59:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:59:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:59:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:59:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:59:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19185 "" "Go-http-client/1.1"
Nov 28 09:59:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 09:59:30 np0005538515.localdomain ceph-mon[301134]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:30 np0005538515.localdomain podman[305889]: 2025-11-28 09:59:30.972386884 +0000 UTC m=+0.078852037 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:30 np0005538515.localdomain podman[305889]: 2025-11-28 09:59:30.990528384 +0000 UTC m=+0.096993557 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 09:59:31 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 09:59:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:32 np0005538515.localdomain ceph-mon[301134]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:33.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:34 np0005538515.localdomain ceph-mon[301134]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:35.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:35.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:35.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc91c760d0>)]
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc91c760a0>)]
Nov 28 09:59:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 09:59:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:36.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:36 np0005538515.localdomain ceph-mon[301134]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 09:59:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:37 np0005538515.localdomain ceph-mon[301134]: mgrmap e45: np0005538515.yfkzhl(active, since 91s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 09:59:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:38.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:38.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 09:59:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:38.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 09:59:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:38.264 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 09:59:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:38.264 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:39 np0005538515.localdomain ceph-mon[301134]: pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:39.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:59:41.005 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 09:59:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:59:41.006 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 09:59:41 np0005538515.localdomain ceph-mon[301134]: pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1564004805' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v51: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/986677832' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.233 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.260 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.260 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.260 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:59:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:59:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3275500434' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.735 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.970 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.972 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=12018MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.973 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:59:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:42.973 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.073 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.073 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 09:59:43 np0005538515.localdomain ceph-mon[301134]: pgmap v51: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3275500434' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.100 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 09:59:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 09:59:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/164218660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.540 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.546 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.560 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.561 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 09:59:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:43.561 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:59:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v52: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3064234323' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/164218660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1537443536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 09:59:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 09:59:44.563 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 09:59:45 np0005538515.localdomain ceph-mon[301134]: pgmap v52: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v53: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 09:59:45 np0005538515.localdomain podman[305952]: 2025-11-28 09:59:45.994046629 +0000 UTC m=+0.102508703 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal)
Nov 28 09:59:46 np0005538515.localdomain podman[305952]: 2025-11-28 09:59:46.010591939 +0000 UTC m=+0.119054053 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350)
Nov 28 09:59:46 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 09:59:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e93 e93: 6 total, 6 up, 6 in
Nov 28 09:59:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:59:47.008 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 09:59:47 np0005538515.localdomain ceph-mon[301134]: pgmap v53: 177 pgs: 177 active+clean; 105 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 28 09:59:47 np0005538515.localdomain ceph-mon[301134]: osdmap e93: 6 total, 6 up, 6 in
Nov 28 09:59:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v55: 177 pgs: 177 active+clean; 125 MiB data, 619 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Nov 28 09:59:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 e94: 6 total, 6 up, 6 in
Nov 28 09:59:49 np0005538515.localdomain ceph-mon[301134]: pgmap v55: 177 pgs: 177 active+clean; 125 MiB data, 619 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Nov 28 09:59:49 np0005538515.localdomain ceph-mon[301134]: osdmap e94: 6 total, 6 up, 6 in
Nov 28 09:59:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v57: 177 pgs: 177 active+clean; 125 MiB data, 619 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.6 MiB/s wr, 24 op/s
Nov 28 09:59:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:59:50.843 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 09:59:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:59:50.844 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 09:59:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 09:59:50.844 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 09:59:51 np0005538515.localdomain ceph-mon[301134]: pgmap v57: 177 pgs: 177 active+clean; 125 MiB data, 619 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.6 MiB/s wr, 24 op/s
Nov 28 09:59:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Nov 28 09:59:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:53 np0005538515.localdomain ceph-mon[301134]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Nov 28 09:59:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Nov 28 09:59:55 np0005538515.localdomain ceph-mon[301134]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Nov 28 09:59:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 4.3 MiB/s wr, 40 op/s
Nov 28 09:59:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 09:59:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 09:59:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 09:59:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 09:59:55 np0005538515.localdomain podman[305970]: 2025-11-28 09:59:55.974305923 +0000 UTC m=+0.082617090 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 09:59:55 np0005538515.localdomain podman[305970]: 2025-11-28 09:59:55.988453971 +0000 UTC m=+0.096765158 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 09:59:56 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 09:59:56 np0005538515.localdomain podman[305971]: 2025-11-28 09:59:56.079891398 +0000 UTC m=+0.184265897 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 09:59:56 np0005538515.localdomain podman[305972]: 2025-11-28 09:59:56.045240999 +0000 UTC m=+0.142360218 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 09:59:56 np0005538515.localdomain podman[305972]: 2025-11-28 09:59:56.130488489 +0000 UTC m=+0.227607678 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 09:59:56 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 09:59:56 np0005538515.localdomain podman[305978]: 2025-11-28 09:59:56.143711 +0000 UTC m=+0.239125948 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 09:59:56 np0005538515.localdomain podman[305971]: 2025-11-28 09:59:56.147492504 +0000 UTC m=+0.251866963 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 09:59:56 np0005538515.localdomain podman[305978]: 2025-11-28 09:59:56.157427154 +0000 UTC m=+0.252842142 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 09:59:56 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 09:59:56 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 09:59:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 09:59:57 np0005538515.localdomain ceph-mon[301134]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 4.3 MiB/s wr, 40 op/s
Nov 28 09:59:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 09:59:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 09:59:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 09:59:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   09:59:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 09:59:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 18 op/s
Nov 28 09:59:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 09:59:58 np0005538515.localdomain podman[239012]: time="2025-11-28T09:59:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 09:59:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:59:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 09:59:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:09:59:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19186 "" "Go-http-client/1.1"
Nov 28 09:59:59 np0005538515.localdomain podman[306054]: 2025-11-28 09:59:59.027135513 +0000 UTC m=+0.135306645 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:59:59 np0005538515.localdomain podman[306054]: 2025-11-28 09:59:59.039523407 +0000 UTC m=+0.147694599 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 09:59:59 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 09:59:59 np0005538515.localdomain ceph-mon[301134]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 18 op/s
Nov 28 09:59:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.8 MiB/s wr, 16 op/s
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.625 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:00:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:00:00 np0005538515.localdomain ceph-mon[301134]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.8 MiB/s wr, 16 op/s
Nov 28 10:00:00 np0005538515.localdomain ceph-mon[301134]: overall HEALTH_OK
Nov 28 10:00:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 15 op/s
Nov 28 10:00:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:00:01 np0005538515.localdomain podman[306079]: 2025-11-28 10:00:01.975243834 +0000 UTC m=+0.082893699 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 10:00:01 np0005538515.localdomain podman[306079]: 2025-11-28 10:00:01.990681941 +0000 UTC m=+0.098331856 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:00:02 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:00:02 np0005538515.localdomain ceph-mon[301134]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 15 op/s
Nov 28 10:00:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:04 np0005538515.localdomain ceph-mon[301134]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:00:05
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['.mgr', 'images', 'volumes', 'manila_data', 'backups', 'manila_metadata', 'vms']
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.001953125 quantized to 16 (current 16)
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:00:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:00:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:06 np0005538515.localdomain ceph-mon[301134]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:08 np0005538515.localdomain ceph-mon[301134]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:10 np0005538515.localdomain ceph-mon[301134]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:12 np0005538515.localdomain ceph-mon[301134]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2462883409' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:00:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2462883409' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:00:14 np0005538515.localdomain sudo[306098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:00:14 np0005538515.localdomain sudo[306098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:00:14 np0005538515.localdomain sudo[306098]: pam_unix(sudo:session): session closed for user root
Nov 28 10:00:14 np0005538515.localdomain sudo[306116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:00:14 np0005538515.localdomain sudo[306116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:00:14 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:14.685 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:13Z, description=, device_id=b12e58f4-c600-4789-9e60-18c753a08ff6, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5d65e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5d67c0>], id=e1feac1c-84ce-4045-8255-738ae05741b8, ip_allocation=immediate, mac_address=fa:16:3e:cd:76:d8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=114, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:00:14Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:00:14 np0005538515.localdomain sudo[306116]: pam_unix(sudo:session): session closed for user root
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802668) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014802724, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 852, "num_deletes": 256, "total_data_size": 1242869, "memory_usage": 1266192, "flush_reason": "Manual Compaction"}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014810222, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 816319, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15424, "largest_seqno": 16270, "table_properties": {"data_size": 812648, "index_size": 1462, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8362, "raw_average_key_size": 18, "raw_value_size": 805113, "raw_average_value_size": 1821, "num_data_blocks": 65, "num_entries": 442, "num_filter_entries": 442, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323963, "oldest_key_time": 1764323963, "file_creation_time": 1764324014, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 7579 microseconds, and 2143 cpu microseconds.
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.810254) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 816319 bytes OK
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.810272) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.813883) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.813898) EVENT_LOG_v1 {"time_micros": 1764324014813893, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.813914) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1238461, prev total WAL file size 1238785, number of live WAL files 2.
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.815054) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373730' seq:72057594037927935, type:22 .. '6C6F676D0034303232' seq:0, type:0; will stop at (end)
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(797KB)], [18(17MB)]
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014815092, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19554165, "oldest_snapshot_seqno": -1}
Nov 28 10:00:14 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:00:14 np0005538515.localdomain podman[306182]: 2025-11-28 10:00:14.910456128 +0000 UTC m=+0.059075659 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:00:14 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:14 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 12065 keys, 19455235 bytes, temperature: kUnknown
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014957208, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 19455235, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19383598, "index_size": 40368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 323387, "raw_average_key_size": 26, "raw_value_size": 19175396, "raw_average_value_size": 1589, "num_data_blocks": 1549, "num_entries": 12065, "num_filter_entries": 12065, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324014, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.957437) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 19455235 bytes
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.960565) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 137.5 rd, 136.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 17.9 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(47.8) write-amplify(23.8) OK, records in: 12597, records dropped: 532 output_compression: NoCompression
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.960598) EVENT_LOG_v1 {"time_micros": 1764324014960583, "job": 8, "event": "compaction_finished", "compaction_time_micros": 142183, "compaction_time_cpu_micros": 39584, "output_level": 6, "num_output_files": 1, "total_output_size": 19455235, "num_input_records": 12597, "num_output_records": 12065, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014961131, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014964455, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.814976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.964622) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.964632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.964636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.964641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:00:14.964645) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:00:14 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev d57516a0-7dc2-4aaf-ba81-14e3a5c978fb (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:00:14 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev d57516a0-7dc2-4aaf-ba81-14e3a5c978fb (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:00:14 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event d57516a0-7dc2-4aaf-ba81-14e3a5c978fb (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:00:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:00:15 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:15.151 261346 INFO neutron.agent.dhcp.agent [None req-9530903c-83e5-45c2-9004-e11a1d9be55f - - - - - -] DHCP configuration for ports {'e1feac1c-84ce-4045-8255-738ae05741b8'} is completed
Nov 28 10:00:15 np0005538515.localdomain sudo[306205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:00:15 np0005538515.localdomain sudo[306205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:00:15 np0005538515.localdomain sudo[306205]: pam_unix(sudo:session): session closed for user root
Nov 28 10:00:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:15 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:00:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:00:15 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:00:15 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:00:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:00:16 np0005538515.localdomain podman[306223]: 2025-11-28 10:00:16.982614881 +0000 UTC m=+0.087998123 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 28 10:00:17 np0005538515.localdomain podman[306223]: 2025-11-28 10:00:17.000561255 +0000 UTC m=+0.105944487 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:00:17 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:00:17 np0005538515.localdomain ceph-mon[301134]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:19 np0005538515.localdomain ceph-mon[301134]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:21 np0005538515.localdomain ceph-mon[301134]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:22.880 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:22Z, description=, device_id=76efbe69-508f-4a6c-bc6c-575aca933da7, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f9940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce64ae20>], id=b4af8288-f645-4ff4-99bf-dd2772bb45d9, ip_allocation=immediate, mac_address=fa:16:3e:98:54:59, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=188, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:00:22Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:00:23 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:00:23 np0005538515.localdomain podman[306259]: 2025-11-28 10:00:23.089525859 +0000 UTC m=+0.063274065 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:00:23 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:23 np0005538515.localdomain systemd[297255]: Created slice User Background Tasks Slice.
Nov 28 10:00:23 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:23 np0005538515.localdomain ceph-mon[301134]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:23 np0005538515.localdomain systemd[297255]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 10:00:23 np0005538515.localdomain systemd[297255]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 10:00:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:23.332 261346 INFO neutron.agent.dhcp.agent [None req-29033f13-0547-4c9e-8569-9f9799d316d6 - - - - - -] DHCP configuration for ports {'b4af8288-f645-4ff4-99bf-dd2772bb45d9'} is completed
Nov 28 10:00:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:25 np0005538515.localdomain ceph-mon[301134]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:00:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:00:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:00:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:00:26 np0005538515.localdomain systemd[1]: tmp-crun.mCFX8N.mount: Deactivated successfully.
Nov 28 10:00:26 np0005538515.localdomain podman[306285]: 2025-11-28 10:00:26.990286278 +0000 UTC m=+0.087847570 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:00:27 np0005538515.localdomain podman[306285]: 2025-11-28 10:00:27.02341869 +0000 UTC m=+0.120979992 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:00:27 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:00:27 np0005538515.localdomain podman[306284]: 2025-11-28 10:00:27.039436664 +0000 UTC m=+0.136999076 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:00:27 np0005538515.localdomain podman[306284]: 2025-11-28 10:00:27.085467888 +0000 UTC m=+0.183030250 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 28 10:00:27 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:00:27 np0005538515.localdomain podman[306283]: 2025-11-28 10:00:27.102507623 +0000 UTC m=+0.201649253 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:00:27 np0005538515.localdomain ceph-mon[301134]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:27 np0005538515.localdomain podman[306283]: 2025-11-28 10:00:27.140531863 +0000 UTC m=+0.239673463 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:00:27 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:00:27 np0005538515.localdomain podman[306286]: 2025-11-28 10:00:27.145875355 +0000 UTC m=+0.240286392 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:00:27 np0005538515.localdomain podman[306286]: 2025-11-28 10:00:27.229523997 +0000 UTC m=+0.323934954 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:00:27 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:00:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:27.445 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:27Z, description=, device_id=cd2a6086-9326-4cdd-a015-4768e2092068, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce80fa30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5d6dc0>], id=54e1a6ba-02cc-4dd7-8b89-abb02b7f636f, ip_allocation=immediate, mac_address=fa:16:3e:2d:ce:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=236, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:00:27Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:00:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:00:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:27 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:00:27 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:27 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:27 np0005538515.localdomain podman[306383]: 2025-11-28 10:00:27.665860651 +0000 UTC m=+0.059053048 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:00:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:27.890 261346 INFO neutron.agent.dhcp.agent [None req-d06bab2e-baa5-41c1-aede-fe332e955b2e - - - - - -] DHCP configuration for ports {'54e1a6ba-02cc-4dd7-8b89-abb02b7f636f'} is completed
Nov 28 10:00:28 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:28.463 2 INFO neutron.agent.securitygroups_rpc [None req-cdabab7f-6f0b-439c-ae1d-dd4a3208cd11 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:00:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:00:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:00:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:00:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:00:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:00:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19186 "" "Go-http-client/1.1"
Nov 28 10:00:29 np0005538515.localdomain ceph-mon[301134]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:00:29 np0005538515.localdomain podman[306403]: 2025-11-28 10:00:29.983881844 +0000 UTC m=+0.084848968 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:00:29 np0005538515.localdomain podman[306403]: 2025-11-28 10:00:29.994481045 +0000 UTC m=+0.095448209 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:00:30 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:00:31 np0005538515.localdomain ceph-mon[301134]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:32 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:00:32Z|00039|memory|INFO|peak resident set size grew 53% in last 2237.5 seconds, from 13040 kB to 19928 kB
Nov 28 10:00:32 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:00:32Z|00040|memory|INFO|idl-cells-OVN_Southbound:7680 idl-cells-Open_vSwitch:1041 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:209 lflow-cache-entries-cache-matches:236 lflow-cache-size-KB:803 local_datapath_usage-KB:2 ofctrl_desired_flow_usage-KB:361 ofctrl_installed_flow_usage-KB:265 ofctrl_sb_flow_ref_usage-KB:140
Nov 28 10:00:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:00:32 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:32.900 2 INFO neutron.agent.securitygroups_rpc [None req-d73a2eae-adce-4d72-91ae-34d59db28d8a c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:00:32 np0005538515.localdomain systemd[1]: tmp-crun.N4fTHe.mount: Deactivated successfully.
Nov 28 10:00:32 np0005538515.localdomain podman[306426]: 2025-11-28 10:00:32.980484793 +0000 UTC m=+0.077641370 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:00:33 np0005538515.localdomain podman[306426]: 2025-11-28 10:00:33.017882415 +0000 UTC m=+0.115038972 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:00:33 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:00:33 np0005538515.localdomain ceph-mon[301134]: pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:35.016 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:34Z, description=, device_id=5e2bdb5c-9386-4f23-88bb-b64884bb41d1, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce599310>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5993a0>], id=3cbad716-1cc4-4dc7-a41c-a00dc20f804b, ip_allocation=immediate, mac_address=fa:16:3e:9a:66:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=314, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:00:34Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:00:35 np0005538515.localdomain ceph-mon[301134]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:35.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:35.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:35.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:00:35 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:00:35 np0005538515.localdomain podman[306462]: 2025-11-28 10:00:35.2444066 +0000 UTC m=+0.057572382 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:00:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:35.661 261346 INFO neutron.agent.dhcp.agent [None req-a77b74cd-c8e7-437b-87b5-52b5da6d1c28 - - - - - -] DHCP configuration for ports {'3cbad716-1cc4-4dc7-a41c-a00dc20f804b'} is completed
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:00:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:00:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:36.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:36.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:00:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:36.265 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:00:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:37 np0005538515.localdomain ceph-mon[301134]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:37.264 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:37 np0005538515.localdomain podman[306499]: 2025-11-28 10:00:37.686462318 +0000 UTC m=+0.062850293 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:00:37 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:00:37 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:37 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:38.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:39 np0005538515.localdomain ceph-mon[301134]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:40.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:40.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:00:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:40.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:00:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:40.262 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:00:41 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:41.124 2 INFO neutron.agent.securitygroups_rpc [req-a368a1e6-562c-4526-ae3a-0e69b2a15bca req-d778405e-6e70-4ce8-a2a0-d781e7c8b4de 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['f59fc346-b907-4d25-9b54-7ce550f4338f']
Nov 28 10:00:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:00:41.153 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:00:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:00:41.157 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:00:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:41.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:41 np0005538515.localdomain ceph-mon[301134]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3575417762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:42 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:42.144 2 INFO neutron.agent.securitygroups_rpc [req-91a233d2-d128-4e77-ba4a-bffdfb538e2c req-2dda1abe-8246-445d-82ca-7050072f5ab7 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['793a871e-42bd-4871-9764-ed4c16f282ee']
Nov 28 10:00:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:42.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2613041327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:43 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:43.151 2 INFO neutron.agent.securitygroups_rpc [req-9f8c831a-ecf0-4e4e-9595-719ef0ba964a req-7a744279-94cd-4894-9a01-f869bc00b409 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['03578922-528e-499a-8e7e-7a5c262d5e64']
Nov 28 10:00:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:00:43.160 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.260 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.260 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.261 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:00:43 np0005538515.localdomain ceph-mon[301134]: pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1753842664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:00:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2762372881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.728 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.926 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.927 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11978MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.928 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:00:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:43.929 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.113 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.114 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.371 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:00:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2762372881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1651382477' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1539368950' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.402 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.403 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.703 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.742 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:00:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:44.771 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:00:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:44.828 2 INFO neutron.agent.securitygroups_rpc [req-cee9d54d-2061-4e18-9dbb-7978cc78c723 req-de7c358c-da89-4568-8167-d43d2e6c2b50 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['0b748e56-a20d-4a74-8688-d245ea875072']
Nov 28 10:00:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:00:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2207683770' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:45.288 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:00:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:45.295 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:00:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:45.316 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:00:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:45.319 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:00:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:45.319 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:00:45 np0005538515.localdomain ceph-mon[301134]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2207683770' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:00:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:46.164 2 INFO neutron.agent.securitygroups_rpc [req-fb04029d-a55f-44ca-91cb-e1804a48ba9f req-358dd77e-4f38-41f7-8d87-d246c9bdd01f 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']
Nov 28 10:00:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2972538254' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:00:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:46.821 2 INFO neutron.agent.securitygroups_rpc [req-587359a2-3e51-4204-a146-11784139df1a req-865ecdad-325d-46ee-b040-9c64055b894f 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']
Nov 28 10:00:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:00:47.110 2 INFO neutron.agent.securitygroups_rpc [req-95a69005-5e60-4e46-b43c-b0ec65f622be req-ea491abe-a44c-4d6d-8b8a-511053fe8f93 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']
Nov 28 10:00:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:47 np0005538515.localdomain ceph-mon[301134]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 693 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:00:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1377881097' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:00:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v86: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Nov 28 10:00:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:00:47 np0005538515.localdomain podman[306564]: 2025-11-28 10:00:47.98248462 +0000 UTC m=+0.085827357 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:00:48 np0005538515.localdomain podman[306564]: 2025-11-28 10:00:48.021101199 +0000 UTC m=+0.124443936 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 10:00:48 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:00:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:48.344 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:00:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:00:48.345 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:00:48 np0005538515.localdomain ceph-mon[301134]: pgmap v86: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Nov 28 10:00:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v87: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Nov 28 10:00:50 np0005538515.localdomain ceph-mon[301134]: pgmap v87: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Nov 28 10:00:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:00:50.844 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:00:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:00:50.845 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:00:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:00:50.846 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:00:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v88: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:52 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:00:52 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:52 np0005538515.localdomain podman[306602]: 2025-11-28 10:00:52.495250596 +0000 UTC m=+0.084335513 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:00:52 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:52.608 261346 INFO neutron.agent.dhcp.agent [None req-346e1842-d854-4979-83dc-6f928a59e0e2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:52Z, description=, device_id=7dd8143d-bf72-4ceb-a8e4-27c584dc6e09, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce599940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce746940>], id=086df021-4554-4cc2-b965-7ba5df63e5cc, ip_allocation=immediate, mac_address=fa:16:3e:2a:ea:ec, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=450, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:00:52Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:00:52 np0005538515.localdomain podman[306640]: 2025-11-28 10:00:52.851179447 +0000 UTC m=+0.060650906 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:00:52 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:00:52 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:00:52 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:00:52 np0005538515.localdomain ceph-mon[301134]: pgmap v88: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:00:53.201 261346 INFO neutron.agent.dhcp.agent [None req-a9d7f79b-7a99-4343-b36b-5790d64bbd21 - - - - - -] DHCP configuration for ports {'086df021-4554-4cc2-b965-7ba5df63e5cc'} is completed
Nov 28 10:00:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v89: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:54 np0005538515.localdomain ceph-mon[301134]: pgmap v89: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v90: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:55 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e95 e95: 6 total, 6 up, 6 in
Nov 28 10:00:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:00:56 np0005538515.localdomain ceph-mon[301134]: pgmap v90: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.8 MiB/s wr, 80 op/s
Nov 28 10:00:56 np0005538515.localdomain ceph-mon[301134]: osdmap e95: 6 total, 6 up, 6 in
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:00:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:00:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:00:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v92: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 17 KiB/s wr, 103 op/s
Nov 28 10:00:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:00:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:00:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:00:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:00:57 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e96 e96: 6 total, 6 up, 6 in
Nov 28 10:00:58 np0005538515.localdomain podman[306661]: 2025-11-28 10:00:58.03562702 +0000 UTC m=+0.135590534 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:00:58 np0005538515.localdomain systemd[1]: tmp-crun.JwZH0W.mount: Deactivated successfully.
Nov 28 10:00:58 np0005538515.localdomain podman[306662]: 2025-11-28 10:00:58.048143639 +0000 UTC m=+0.144575356 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:00:58 np0005538515.localdomain podman[306662]: 2025-11-28 10:00:58.057635837 +0000 UTC m=+0.154067574 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:00:58 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:00:58 np0005538515.localdomain podman[306663]: 2025-11-28 10:00:58.098503203 +0000 UTC m=+0.188703521 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:00:58 np0005538515.localdomain podman[306663]: 2025-11-28 10:00:58.106969699 +0000 UTC m=+0.197170057 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:00:58 np0005538515.localdomain podman[306661]: 2025-11-28 10:00:58.115997603 +0000 UTC m=+0.215961167 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller)
Nov 28 10:00:58 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:00:58 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:00:58 np0005538515.localdomain podman[306660]: 2025-11-28 10:00:58.191516957 +0000 UTC m=+0.294046238 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:00:58 np0005538515.localdomain podman[306660]: 2025-11-28 10:00:58.20578789 +0000 UTC m=+0.308317191 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 10:00:58 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:00:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:00:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:00:58 np0005538515.localdomain ceph-mon[301134]: pgmap v92: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 17 KiB/s wr, 103 op/s
Nov 28 10:00:58 np0005538515.localdomain ceph-mon[301134]: osdmap e96: 6 total, 6 up, 6 in
Nov 28 10:00:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:00:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:00:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:00:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19185 "" "Go-http-client/1.1"
Nov 28 10:00:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 2.2 KiB/s wr, 59 op/s
Nov 28 10:01:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:01:00 np0005538515.localdomain podman[306743]: 2025-11-28 10:01:00.972071438 +0000 UTC m=+0.077688562 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:01:00 np0005538515.localdomain ceph-mon[301134]: pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 2.2 KiB/s wr, 59 op/s
Nov 28 10:01:01 np0005538515.localdomain podman[306743]: 2025-11-28 10:01:01.006015875 +0000 UTC m=+0.111632969 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:01:01 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:01:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 4.0 KiB/s wr, 99 op/s
Nov 28 10:01:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:01 np0005538515.localdomain CROND[306766]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 10:01:01 np0005538515.localdomain run-parts[306769]: (/etc/cron.hourly) starting 0anacron
Nov 28 10:01:01 np0005538515.localdomain run-parts[306775]: (/etc/cron.hourly) finished 0anacron
Nov 28 10:01:01 np0005538515.localdomain CROND[306765]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.257 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.257 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.279 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.359 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.360 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.365 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.366 280172 INFO nova.compute.claims [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Claim successful on node np0005538515.localdomain
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.487 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3260841447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.924 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.931 280172 DEBUG nova.compute.provider_tree [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.949 280172 DEBUG nova.scheduler.client.report [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.977 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:02.978 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 10:01:02 np0005538515.localdomain ceph-mon[301134]: pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 4.0 KiB/s wr, 99 op/s
Nov 28 10:01:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3260841447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.067 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.090 280172 INFO nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.108 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.215 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.217 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.218 280172 INFO nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating image(s)
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.254 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.293 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.331 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.335 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:03.337 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 4.0 KiB/s wr, 99 op/s
Nov 28 10:01:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:03.788 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:03Z, description=, device_id=e8c7be5d-4204-4a6d-9f91-45b65598e58d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce6b8f40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce6b8550>], id=8b997639-c389-4236-b623-22a294d76e8e, ip_allocation=immediate, mac_address=fa:16:3e:36:94:3e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=514, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:01:03Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:01:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:01:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:03.912 261346 INFO neutron.agent.linux.ip_lib [None req-26c8f880-99f6-42d3-a916-e5fbef56693d - - - - - -] Device tapd0a70cfb-41 cannot be used as it has no MAC address
Nov 28 10:01:03 np0005538515.localdomain podman[306858]: 2025-11-28 10:01:03.93012128 +0000 UTC m=+0.090552121 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:01:03 np0005538515.localdomain systemd[1]: tmp-crun.ftvlsS.mount: Deactivated successfully.
Nov 28 10:01:03 np0005538515.localdomain kernel: device tapd0a70cfb-41 entered promiscuous mode
Nov 28 10:01:03 np0005538515.localdomain podman[306858]: 2025-11-28 10:01:03.944496375 +0000 UTC m=+0.104927246 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 10:01:03 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324063.9457] manager: (tapd0a70cfb-41): new Generic device (/org/freedesktop/NetworkManager/Devices/15)
Nov 28 10:01:03 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:03Z|00041|binding|INFO|Claiming lport d0a70cfb-41f8-4ab9-819b-560a898e8329 for this chassis.
Nov 28 10:01:03 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:03Z|00042|binding|INFO|d0a70cfb-41f8-4ab9-819b-560a898e8329: Claiming unknown
Nov 28 10:01:03 np0005538515.localdomain systemd-udevd[306896]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:03 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:03.959 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-4feac402-945d-4d17-a15d-c8337ea9c266', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4feac402-945d-4d17-a15d-c8337ea9c266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f1bee3918a2345388c202f74e60af9c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a3868fc-e35e-44db-9bd3-f12a417ed185, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=d0a70cfb-41f8-4ab9-819b-560a898e8329) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:03 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:03.961 158530 INFO neutron.agent.ovn.metadata.agent [-] Port d0a70cfb-41f8-4ab9-819b-560a898e8329 in datapath 4feac402-945d-4d17-a15d-c8337ea9c266 bound to our chassis
Nov 28 10:01:03 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:03.964 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port a7005248-9fa7-4afe-ae28-d6f6bbb69c02 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:01:03 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:03.965 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4feac402-945d-4d17-a15d-c8337ea9c266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:03 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:03.966 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb503cf-76be-499e-b7c0-2e546539f4a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: hostname: np0005538515.localdomain
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:03 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:03Z|00043|binding|INFO|Setting lport d0a70cfb-41f8-4ab9-819b-560a898e8329 ovn-installed in OVS
Nov 28 10:01:03 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:03Z|00044|binding|INFO|Setting lport d0a70cfb-41f8-4ab9-819b-560a898e8329 up in Southbound
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:03 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:04 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:04 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:04 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapd0a70cfb-41: No such device
Nov 28 10:01:04 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:01:04 np0005538515.localdomain podman[306902]: 2025-11-28 10:01:04.062193966 +0000 UTC m=+0.067941616 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:04 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:01:04 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:04 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:04.137 280172 DEBUG nova.virt.libvirt.imagebackend [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Image locations are: [{'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/85968a96-5a0e-43a4-9c04-3954f640a7ed/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/85968a96-5a0e-43a4-9c04-3954f640a7ed/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Nov 28 10:01:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:04.548 261346 INFO neutron.agent.dhcp.agent [None req-677340f8-c16e-4f0d-9c2e-d1a109d49412 - - - - - -] DHCP configuration for ports {'8b997639-c389-4236-b623-22a294d76e8e'} is completed
Nov 28 10:01:04 np0005538515.localdomain systemd[1]: tmp-crun.96fcCp.mount: Deactivated successfully.
Nov 28 10:01:04 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 e97: 6 total, 6 up, 6 in
Nov 28 10:01:04 np0005538515.localdomain podman[306989]: 
Nov 28 10:01:04 np0005538515.localdomain podman[306989]: 2025-11-28 10:01:04.967909803 +0000 UTC m=+0.073783343 container create 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:01:05 np0005538515.localdomain systemd[1]: Started libpod-conmon-421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a.scope.
Nov 28 10:01:05 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:05 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a53b8d3f668b25246333f5de4a531a6e13da55d713122016f2fd29b9d52ffaf2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:05 np0005538515.localdomain podman[306989]: 2025-11-28 10:01:04.938573546 +0000 UTC m=+0.044447076 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:01:05 np0005538515.localdomain podman[306989]: 2025-11-28 10:01:05.046374148 +0000 UTC m=+0.152247688 container init 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:05 np0005538515.localdomain podman[306989]: 2025-11-28 10:01:05.055094972 +0000 UTC m=+0.160968522 container start 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:01:05 np0005538515.localdomain dnsmasq[307007]: started, version 2.85 cachesize 150
Nov 28 10:01:05 np0005538515.localdomain dnsmasq[307007]: DNS service limited to local subnets
Nov 28 10:01:05 np0005538515.localdomain dnsmasq[307007]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:01:05 np0005538515.localdomain dnsmasq[307007]: warning: no upstream servers configured
Nov 28 10:01:05 np0005538515.localdomain dnsmasq-dhcp[307007]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:01:05 np0005538515.localdomain dnsmasq[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/addn_hosts - 0 addresses
Nov 28 10:01:05 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/host
Nov 28 10:01:05 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/opts
Nov 28 10:01:05 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:05.195 261346 INFO neutron.agent.dhcp.agent [None req-35fa67ce-21ea-4a26-94d8-b7bd49186315 - - - - - -] DHCP configuration for ports {'3c0393c7-2178-4cfe-a769-33cf4168b6af'} is completed
Nov 28 10:01:05 np0005538515.localdomain ceph-mon[301134]: pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 4.0 KiB/s wr, 99 op/s
Nov 28 10:01:05 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3322246960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:05 np0005538515.localdomain ceph-mon[301134]: osdmap e97: 6 total, 6 up, 6 in
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.296 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.372 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.373 280172 DEBUG nova.virt.images [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] 85968a96-5a0e-43a4-9c04-3954f640a7ed was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.374 280172 DEBUG nova.privsep.utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.375 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:01:05
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_data', '.mgr', 'volumes', 'manila_metadata', 'backups', 'vms', 'images']
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v98: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 39 op/s
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.654 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted" returned: 0 in 0.279s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:05.659 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.004821470634422111 of space, bias 1.0, pg target 0.9642941268844222 quantized to 32 (current 32)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8570103846780196 quantized to 32 (current 32)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.001949853433835846 quantized to 16 (current 16)
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:01:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:01:06 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:06.177 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:05Z, description=, device_id=e8c7be5d-4204-4a6d-9f91-45b65598e58d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5af190>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5af430>], id=ec6ddc27-5566-4943-ab8e-4ee78e9f615c, ip_allocation=immediate, mac_address=fa:16:3e:9b:df:fd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:01Z, description=, dns_domain=, id=4feac402-945d-4d17-a15d-c8337ea9c266, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-1343972998-network, port_security_enabled=True, project_id=f1bee3918a2345388c202f74e60af9c5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22843, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=507, status=ACTIVE, subnets=['e34cc883-2097-4e00-b953-2cb3d3328eb3'], tags=[], tenant_id=f1bee3918a2345388c202f74e60af9c5, updated_at=2025-11-28T10:01:02Z, vlan_transparent=None, network_id=4feac402-945d-4d17-a15d-c8337ea9c266, port_security_enabled=False, project_id=f1bee3918a2345388c202f74e60af9c5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=530, status=DOWN, tags=[], tenant_id=f1bee3918a2345388c202f74e60af9c5, updated_at=2025-11-28T10:01:05Z on network 4feac402-945d-4d17-a15d-c8337ea9c266
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.220 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted --force-share --output=json" returned: 0 in 0.561s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.222 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.268 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.275 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d 7292509e-f294-4159-96e5-22d4712df2a0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:06 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/593006997' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:06 np0005538515.localdomain podman[307056]: 2025-11-28 10:01:06.375265171 +0000 UTC m=+0.056974605 container kill 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:01:06 np0005538515.localdomain dnsmasq[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/addn_hosts - 1 addresses
Nov 28 10:01:06 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/host
Nov 28 10:01:06 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/opts
Nov 28 10:01:06 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:06.699 261346 INFO neutron.agent.dhcp.agent [None req-9d600ab1-f8ff-401f-948e-6a47429145b8 - - - - - -] DHCP configuration for ports {'ec6ddc27-5566-4943-ab8e-4ee78e9f615c'} is completed
Nov 28 10:01:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.752 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d 7292509e-f294-4159-96e5-22d4712df2a0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.840 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] resizing rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Nov 28 10:01:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:06.981 280172 DEBUG nova.objects.instance [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'migration_context' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.217 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.218 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Ensure instance console log exists: /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.219 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.219 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.220 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.223 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T09:59:44Z,direct_url=<?>,disk_format='qcow2',id=85968a96-5a0e-43a4-9c04-3954f640a7ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dda653c53224db086060962b0702694',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T09:59:46Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'device_name': '/dev/vda', 'image_id': '85968a96-5a0e-43a4-9c04-3954f640a7ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.230 280172 WARNING nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.234 280172 DEBUG nova.virt.libvirt.host [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Searching host: 'np0005538515.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.235 280172 DEBUG nova.virt.libvirt.host [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.238 280172 DEBUG nova.virt.libvirt.host [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Searching host: 'np0005538515.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.238 280172 DEBUG nova.virt.libvirt.host [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.239 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.240 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T09:59:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98f289d4-5c06-4ab5-9089-7b580870d676',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T09:59:44Z,direct_url=<?>,disk_format='qcow2',id=85968a96-5a0e-43a4-9c04-3954f640a7ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dda653c53224db086060962b0702694',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T09:59:46Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.241 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.241 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.242 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.242 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.243 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.243 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.244 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.245 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.245 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.245 280172 DEBUG nova.virt.hardware [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.251 280172 DEBUG nova.privsep.utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.252 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:07 np0005538515.localdomain ceph-mon[301134]: pgmap v98: 177 pgs: 177 active+clean; 192 MiB data, 757 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 39 op/s
Nov 28 10:01:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v99: 177 pgs: 177 active+clean; 175 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 3.8 MiB/s wr, 166 op/s
Nov 28 10:01:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:01:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2931902013' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.711 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.746 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:07.750 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:01:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3475554474' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.250 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.254 280172 DEBUG nova.objects.instance [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.344 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] End _get_guest_xml xml=<domain type="kvm">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <uuid>7292509e-f294-4159-96e5-22d4712df2a0</uuid>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <name>instance-00000007</name>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <memory>131072</memory>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <vcpu>1</vcpu>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <metadata>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-650509197</nova:name>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:creationTime>2025-11-28 10:01:07</nova:creationTime>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:flavor name="m1.nano">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:memory>128</nova:memory>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:disk>1</nova:disk>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:swap>0</nova:swap>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:vcpus>1</nova:vcpus>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       </nova:flavor>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:owner>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:user uuid="28578129c91d407a92af609ba8bac430">tempest-UnshelveToHostMultiNodesTest-426973173-project-member</nova:user>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <nova:project uuid="a30386ba68ee46f4a1bac43cf415f3a4">tempest-UnshelveToHostMultiNodesTest-426973173</nova:project>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       </nova:owner>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:root type="image" uuid="85968a96-5a0e-43a4-9c04-3954f640a7ed"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <nova:ports/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </nova:instance>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </metadata>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <sysinfo type="smbios">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <system>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <entry name="manufacturer">RDO</entry>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <entry name="product">OpenStack Compute</entry>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <entry name="serial">7292509e-f294-4159-96e5-22d4712df2a0</entry>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <entry name="uuid">7292509e-f294-4159-96e5-22d4712df2a0</entry>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <entry name="family">Virtual Machine</entry>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </system>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </sysinfo>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <os>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <boot dev="hd"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <smbios mode="sysinfo"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </os>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <features>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <acpi/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <apic/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <vmcoreinfo/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </features>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <clock offset="utc">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <timer name="hpet" present="no"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </clock>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <cpu mode="host-model" match="exact">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </cpu>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   <devices>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <disk type="network" device="disk">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <driver type="raw" cache="none"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <source protocol="rbd" name="vms/7292509e-f294-4159-96e5-22d4712df2a0_disk">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       </source>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <auth username="openstack">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       </auth>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <target dev="vda" bus="virtio"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <disk type="network" device="cdrom">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <driver type="raw" cache="none"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <source protocol="rbd" name="vms/7292509e-f294-4159-96e5-22d4712df2a0_disk.config">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       </source>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <auth username="openstack">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       </auth>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <target dev="sda" bus="sata"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <serial type="pty">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <log file="/var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log" append="off"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </serial>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <video>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <model type="virtio"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </video>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <input type="tablet" bus="usb"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <rng model="virtio">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <backend model="random">/dev/urandom</backend>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </rng>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <controller type="usb" index="0"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     <memballoon model="virtio">
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:       <stats period="10"/>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:     </memballoon>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:   </devices>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: </domain>
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 10:01:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2931902013' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3475554474' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.587 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.588 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.588 280172 INFO nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Using config drive
Nov 28 10:01:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:08.617 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:08 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:08.774 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:05Z, description=, device_id=e8c7be5d-4204-4a6d-9f91-45b65598e58d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce627910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce627c40>], id=ec6ddc27-5566-4943-ab8e-4ee78e9f615c, ip_allocation=immediate, mac_address=fa:16:3e:9b:df:fd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:01Z, description=, dns_domain=, id=4feac402-945d-4d17-a15d-c8337ea9c266, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-1343972998-network, port_security_enabled=True, project_id=f1bee3918a2345388c202f74e60af9c5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22843, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=507, status=ACTIVE, subnets=['e34cc883-2097-4e00-b953-2cb3d3328eb3'], tags=[], tenant_id=f1bee3918a2345388c202f74e60af9c5, updated_at=2025-11-28T10:01:02Z, vlan_transparent=None, network_id=4feac402-945d-4d17-a15d-c8337ea9c266, port_security_enabled=False, project_id=f1bee3918a2345388c202f74e60af9c5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=530, status=DOWN, tags=[], tenant_id=f1bee3918a2345388c202f74e60af9c5, updated_at=2025-11-28T10:01:05Z on network 4feac402-945d-4d17-a15d-c8337ea9c266
Nov 28 10:01:08 np0005538515.localdomain dnsmasq[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/addn_hosts - 1 addresses
Nov 28 10:01:08 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/host
Nov 28 10:01:08 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/opts
Nov 28 10:01:08 np0005538515.localdomain podman[307264]: 2025-11-28 10:01:08.993263463 +0000 UTC m=+0.070633198 container kill 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:01:08 np0005538515.localdomain systemd[1]: tmp-crun.7rC0wW.mount: Deactivated successfully.
Nov 28 10:01:09 np0005538515.localdomain ceph-mon[301134]: pgmap v99: 177 pgs: 177 active+clean; 175 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 3.8 MiB/s wr, 166 op/s
Nov 28 10:01:09 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:09.551 261346 INFO neutron.agent.dhcp.agent [None req-914c1384-c4b0-487e-bfed-b49caeca73a9 - - - - - -] DHCP configuration for ports {'ec6ddc27-5566-4943-ab8e-4ee78e9f615c'} is completed
Nov 28 10:01:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v100: 177 pgs: 177 active+clean; 175 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.7 MiB/s wr, 162 op/s
Nov 28 10:01:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:09.799 280172 INFO nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating config drive at /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config
Nov 28 10:01:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:09.806 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_6474w8j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:09.931 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_6474w8j" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:09.965 280172 DEBUG nova.storage.rbd_utils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:09.969 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.172 280172 DEBUG oslo_concurrency.processutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.173 280172 INFO nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting local config drive /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config because it was imported into RBD.
Nov 28 10:01:10 np0005538515.localdomain systemd[1]: Started libvirt secret daemon.
Nov 28 10:01:10 np0005538515.localdomain systemd-machined[201641]: New machine qemu-1-instance-00000007.
Nov 28 10:01:10 np0005538515.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000007.
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.601 280172 DEBUG nova.virt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Emitting event <LifecycleEvent: 1764324070.6008863, 7292509e-f294-4159-96e5-22d4712df2a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.602 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Resumed (Lifecycle Event)
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.629 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.631 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.631 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.635 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.638 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance spawned successfully.
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.639 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.652 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.653 280172 DEBUG nova.virt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Emitting event <LifecycleEvent: 1764324070.6303923, 7292509e-f294-4159-96e5-22d4712df2a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.653 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Started (Lifecycle Event)
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.675 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.681 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.685 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.686 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.686 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.687 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.687 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.688 280172 DEBUG nova.virt.libvirt.driver [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.711 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.740 280172 INFO nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Took 7.52 seconds to spawn the instance on the hypervisor.
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.741 280172 DEBUG nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.794 280172 INFO nova.compute.manager [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Took 8.46 seconds to build instance.
Nov 28 10:01:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:10.821 280172 DEBUG oslo_concurrency.lockutils [None req-c08be0ce-6d64-47f2-999a-9837d83fcc2e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:11 np0005538515.localdomain ceph-mon[301134]: pgmap v100: 177 pgs: 177 active+clean; 175 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.7 MiB/s wr, 162 op/s
Nov 28 10:01:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v101: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 4.7 MiB/s wr, 174 op/s
Nov 28 10:01:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:11.682 2 INFO neutron.agent.securitygroups_rpc [None req-64ca5811-bbe2-4768-b546-3f3c65a295fa c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:01:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:11.907 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:11.907 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:11.907 280172 INFO nova.compute.manager [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Shelving
Nov 28 10:01:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:11.927 280172 DEBUG nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 28 10:01:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:13.332 2 INFO neutron.agent.securitygroups_rpc [None req-0bbc429e-2462-4e1e-9b65-ff3c254999c8 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:13 np0005538515.localdomain ceph-mon[301134]: pgmap v101: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 4.7 MiB/s wr, 174 op/s
Nov 28 10:01:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1924445682' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:01:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1924445682' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:01:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v102: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 4.7 MiB/s wr, 174 op/s
Nov 28 10:01:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:13.672 2 INFO neutron.agent.securitygroups_rpc [None req-8218c41f-1821-4284-8a71-4b98eaf9d107 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']
Nov 28 10:01:15 np0005538515.localdomain sudo[307401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:01:15 np0005538515.localdomain sudo[307401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:01:15 np0005538515.localdomain sudo[307401]: pam_unix(sudo:session): session closed for user root
Nov 28 10:01:15 np0005538515.localdomain sudo[307419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:01:15 np0005538515.localdomain sudo[307419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:01:15 np0005538515.localdomain ceph-mon[301134]: pgmap v102: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 4.7 MiB/s wr, 174 op/s
Nov 28 10:01:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v103: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 4.4 MiB/s wr, 161 op/s
Nov 28 10:01:15 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:15.692 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:15Z, description=, device_id=350c5687-2c97-42e6-96bf-0b6c681cec37, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce655c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce655b50>], id=23022a72-ee36-49ce-914d-58b90ee87225, ip_allocation=immediate, mac_address=fa:16:3e:c3:79:bb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=578, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:01:15Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:01:15 np0005538515.localdomain podman[307454]: 2025-11-28 10:01:15.909882972 +0000 UTC m=+0.064463931 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:15 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 6 addresses
Nov 28 10:01:15 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:15 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:16 np0005538515.localdomain sudo[307419]: pam_unix(sudo:session): session closed for user root
Nov 28 10:01:16 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:16.210 261346 INFO neutron.agent.dhcp.agent [None req-a49c3bc0-dd51-49a0-a914-bc9dcf3b4703 - - - - - -] DHCP configuration for ports {'23022a72-ee36-49ce-914d-58b90ee87225'} is completed
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:01:16 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 6ca18e3b-b6e0-430a-aa47-28e6913af879 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:01:16 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 6ca18e3b-b6e0-430a-aa47-28e6913af879 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:01:16 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 6ca18e3b-b6e0-430a-aa47-28e6913af879 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:01:16 np0005538515.localdomain sudo[307505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:01:16 np0005538515.localdomain sudo[307505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:01:16 np0005538515.localdomain sudo[307505]: pam_unix(sudo:session): session closed for user root
Nov 28 10:01:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:17 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:17.041 2 INFO neutron.agent.securitygroups_rpc [None req-2679fbf8-01c0-46c1-b86d-7a154868a163 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:17 np0005538515.localdomain systemd[1]: tmp-crun.ltrGx4.mount: Deactivated successfully.
Nov 28 10:01:17 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:01:17 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:17 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:17 np0005538515.localdomain podman[307539]: 2025-11-28 10:01:17.26996829 +0000 UTC m=+0.079721074 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:01:17 np0005538515.localdomain ceph-mon[301134]: pgmap v103: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 4.4 MiB/s wr, 161 op/s
Nov 28 10:01:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v104: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 3.9 MiB/s wr, 194 op/s
Nov 28 10:01:18 np0005538515.localdomain ceph-mon[301134]: pgmap v104: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 3.9 MiB/s wr, 194 op/s
Nov 28 10:01:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:01:18 np0005538515.localdomain systemd[1]: tmp-crun.k4oOFu.mount: Deactivated successfully.
Nov 28 10:01:18 np0005538515.localdomain podman[307560]: 2025-11-28 10:01:18.978237583 +0000 UTC m=+0.081900670 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal)
Nov 28 10:01:18 np0005538515.localdomain podman[307560]: 2025-11-28 10:01:18.997572848 +0000 UTC m=+0.101235955 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:01:19 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:01:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v105: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 873 KiB/s wr, 86 op/s
Nov 28 10:01:20 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:01:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:01:20 np0005538515.localdomain ceph-mon[301134]: pgmap v105: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 873 KiB/s wr, 86 op/s
Nov 28 10:01:20 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/88113730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:01:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v106: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 873 KiB/s wr, 93 op/s
Nov 28 10:01:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:21.987 280172 DEBUG nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 28 10:01:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:22.152 261346 INFO neutron.agent.linux.ip_lib [None req-f5fd574f-afee-45bd-b13e-c250e5c4c52d - - - - - -] Device tap52bd411f-3a cannot be used as it has no MAC address
Nov 28 10:01:22 np0005538515.localdomain kernel: device tap52bd411f-3a entered promiscuous mode
Nov 28 10:01:22 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324082.1862] manager: (tap52bd411f-3a): new Generic device (/org/freedesktop/NetworkManager/Devices/16)
Nov 28 10:01:22 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:22Z|00045|binding|INFO|Claiming lport 52bd411f-3a65-4ceb-b07c-480829869bbb for this chassis.
Nov 28 10:01:22 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:22Z|00046|binding|INFO|52bd411f-3a65-4ceb-b07c-480829869bbb: Claiming unknown
Nov 28 10:01:22 np0005538515.localdomain systemd-udevd[307591]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:22 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:22.206 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-8a9132e3-6bf5-4fa5-8eac-9650725d34b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a9132e3-6bf5-4fa5-8eac-9650725d34b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ad5cc945c4a4ceda603318537f79333', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2ba1c0c-781e-42ae-a904-9ebfc98b36b5, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=52bd411f-3a65-4ceb-b07c-480829869bbb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:22 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:22.214 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 52bd411f-3a65-4ceb-b07c-480829869bbb in datapath 8a9132e3-6bf5-4fa5-8eac-9650725d34b1 bound to our chassis
Nov 28 10:01:22 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:22.219 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port aebf4a7f-5482-4fc7-9998-c2f271e1ebc5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:01:22 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:22.219 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a9132e3-6bf5-4fa5-8eac-9650725d34b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:22 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:22Z|00047|binding|INFO|Setting lport 52bd411f-3a65-4ceb-b07c-480829869bbb ovn-installed in OVS
Nov 28 10:01:22 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:22Z|00048|binding|INFO|Setting lport 52bd411f-3a65-4ceb-b07c-480829869bbb up in Southbound
Nov 28 10:01:22 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:22.220 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[e898b86a-92b2-4671-a21c-4702406cfee6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:22 np0005538515.localdomain ceph-mon[301134]: pgmap v106: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 873 KiB/s wr, 93 op/s
Nov 28 10:01:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:22.955 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:22Z, description=, device_id=65f73f16-3a25-49b4-8bf7-1a58246c1063, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5cbe20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5cbca0>], id=ca7b4bb7-a367-4c61-b81b-96aa3e7c9cd9, ip_allocation=immediate, mac_address=fa:16:3e:81:be:83, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=624, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:01:22Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:01:23 np0005538515.localdomain podman[307657]: 
Nov 28 10:01:23 np0005538515.localdomain systemd[1]: tmp-crun.0jCTre.mount: Deactivated successfully.
Nov 28 10:01:23 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 6 addresses
Nov 28 10:01:23 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:23 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:23 np0005538515.localdomain podman[307668]: 2025-11-28 10:01:23.264060394 +0000 UTC m=+0.103041730 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:01:23 np0005538515.localdomain podman[307657]: 2025-11-28 10:01:23.171243694 +0000 UTC m=+0.043077804 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:01:23 np0005538515.localdomain podman[307657]: 2025-11-28 10:01:23.299528156 +0000 UTC m=+0.171362186 container create f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:01:23 np0005538515.localdomain systemd[1]: Started libpod-conmon-f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b.scope.
Nov 28 10:01:23 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93a797d5722443d4145967b946d93624c8318af2c1cabbcd44f075fa585e6ab9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:23 np0005538515.localdomain podman[307657]: 2025-11-28 10:01:23.367386759 +0000 UTC m=+0.239220829 container init f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:01:23 np0005538515.localdomain podman[307657]: 2025-11-28 10:01:23.372967058 +0000 UTC m=+0.244801118 container start f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:01:23 np0005538515.localdomain dnsmasq[307696]: started, version 2.85 cachesize 150
Nov 28 10:01:23 np0005538515.localdomain dnsmasq[307696]: DNS service limited to local subnets
Nov 28 10:01:23 np0005538515.localdomain dnsmasq[307696]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:01:23 np0005538515.localdomain dnsmasq[307696]: warning: no upstream servers configured
Nov 28 10:01:23 np0005538515.localdomain dnsmasq-dhcp[307696]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:01:23 np0005538515.localdomain dnsmasq[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/addn_hosts - 0 addresses
Nov 28 10:01:23 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/host
Nov 28 10:01:23 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/opts
Nov 28 10:01:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:23.529 261346 INFO neutron.agent.dhcp.agent [None req-5e966c0d-cb2e-41e4-bcbd-f77ef4de119c - - - - - -] DHCP configuration for ports {'56884b0d-9cf9-44f8-ba7a-adbb3b5ac5b6', 'ca7b4bb7-a367-4c61-b81b-96aa3e7c9cd9'} is completed
Nov 28 10:01:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v107: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 56 op/s
Nov 28 10:01:24 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:01:24 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:24 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:24 np0005538515.localdomain podman[307714]: 2025-11-28 10:01:24.191321413 +0000 UTC m=+0.069155984 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:01:24 np0005538515.localdomain ceph-mon[301134]: pgmap v107: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 56 op/s
Nov 28 10:01:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/85736152' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2373481162' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:25 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:01:25 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:25 np0005538515.localdomain podman[307753]: 2025-11-28 10:01:25.477362289 +0000 UTC m=+0.065913626 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:25 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v108: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 56 op/s
Nov 28 10:01:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:26.904 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:26Z, description=, device_id=65f73f16-3a25-49b4-8bf7-1a58246c1063, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5c8250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce7b3e80>], id=22f795b4-891f-4309-a925-624689a94701, ip_allocation=immediate, mac_address=fa:16:3e:23:a9:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:19Z, description=, dns_domain=, id=8a9132e3-6bf5-4fa5-8eac-9650725d34b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1017617356-network, port_security_enabled=True, project_id=6ad5cc945c4a4ceda603318537f79333, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27950, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=602, status=ACTIVE, subnets=['0a395c59-e6ba-4384-8c1a-d6113e5b2e22'], tags=[], tenant_id=6ad5cc945c4a4ceda603318537f79333, updated_at=2025-11-28T10:01:20Z, vlan_transparent=None, network_id=8a9132e3-6bf5-4fa5-8eac-9650725d34b1, port_security_enabled=False, project_id=6ad5cc945c4a4ceda603318537f79333, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=637, status=DOWN, tags=[], tenant_id=6ad5cc945c4a4ceda603318537f79333, updated_at=2025-11-28T10:01:26Z on network 8a9132e3-6bf5-4fa5-8eac-9650725d34b1
Nov 28 10:01:26 np0005538515.localdomain ceph-mon[301134]: pgmap v108: 177 pgs: 177 active+clean; 192 MiB data, 793 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 56 op/s
Nov 28 10:01:27 np0005538515.localdomain systemd[1]: tmp-crun.HK96Nl.mount: Deactivated successfully.
Nov 28 10:01:27 np0005538515.localdomain dnsmasq[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/addn_hosts - 1 addresses
Nov 28 10:01:27 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/host
Nov 28 10:01:27 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/opts
Nov 28 10:01:27 np0005538515.localdomain podman[307792]: 2025-11-28 10:01:27.365220965 +0000 UTC m=+0.064842043 container kill f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:01:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:01:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:27.602 261346 INFO neutron.agent.dhcp.agent [None req-6a311664-682b-4695-a087-d4a6a15e9202 - - - - - -] DHCP configuration for ports {'22f795b4-891f-4309-a925-624689a94701'} is completed
Nov 28 10:01:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v109: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.9 MiB/s wr, 154 op/s
Nov 28 10:01:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:01:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:01:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:01:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:01:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:01:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:01:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:01:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159978 "" "Go-http-client/1.1"
Nov 28 10:01:28 np0005538515.localdomain ceph-mon[301134]: pgmap v109: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.9 MiB/s wr, 154 op/s
Nov 28 10:01:29 np0005538515.localdomain podman[307816]: 2025-11-28 10:01:28.991769375 +0000 UTC m=+0.088777787 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:01:29 np0005538515.localdomain systemd[1]: tmp-crun.QO9O6g.mount: Deactivated successfully.
Nov 28 10:01:29 np0005538515.localdomain podman[307815]: 2025-11-28 10:01:29.047498802 +0000 UTC m=+0.147534276 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:29 np0005538515.localdomain podman[307814]: 2025-11-28 10:01:29.021510166 +0000 UTC m=+0.124481319 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:01:29 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:01:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20137 "" "Go-http-client/1.1"
Nov 28 10:01:29 np0005538515.localdomain podman[307814]: 2025-11-28 10:01:29.105429675 +0000 UTC m=+0.208400818 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:29 np0005538515.localdomain podman[307816]: 2025-11-28 10:01:29.118631354 +0000 UTC m=+0.215639766 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:29 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:01:29 np0005538515.localdomain podman[307815]: 2025-11-28 10:01:29.137723532 +0000 UTC m=+0.237759026 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 10:01:29 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:01:29 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:01:29 np0005538515.localdomain podman[307822]: 2025-11-28 10:01:29.200962876 +0000 UTC m=+0.289847612 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:01:29 np0005538515.localdomain podman[307822]: 2025-11-28 10:01:29.24043333 +0000 UTC m=+0.329318066 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:01:29 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:01:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v110: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 3.9 MiB/s wr, 105 op/s
Nov 28 10:01:29 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:29.652 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:26Z, description=, device_id=65f73f16-3a25-49b4-8bf7-1a58246c1063, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce56bd00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce56bc70>], id=22f795b4-891f-4309-a925-624689a94701, ip_allocation=immediate, mac_address=fa:16:3e:23:a9:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:19Z, description=, dns_domain=, id=8a9132e3-6bf5-4fa5-8eac-9650725d34b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1017617356-network, port_security_enabled=True, project_id=6ad5cc945c4a4ceda603318537f79333, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27950, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=602, status=ACTIVE, subnets=['0a395c59-e6ba-4384-8c1a-d6113e5b2e22'], tags=[], tenant_id=6ad5cc945c4a4ceda603318537f79333, updated_at=2025-11-28T10:01:20Z, vlan_transparent=None, network_id=8a9132e3-6bf5-4fa5-8eac-9650725d34b1, port_security_enabled=False, project_id=6ad5cc945c4a4ceda603318537f79333, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=637, status=DOWN, tags=[], tenant_id=6ad5cc945c4a4ceda603318537f79333, updated_at=2025-11-28T10:01:26Z on network 8a9132e3-6bf5-4fa5-8eac-9650725d34b1
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.870362) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089870447, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1173, "num_deletes": 251, "total_data_size": 1335303, "memory_usage": 1355672, "flush_reason": "Manual Compaction"}
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089878990, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 599688, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16275, "largest_seqno": 17443, "table_properties": {"data_size": 595874, "index_size": 1477, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10741, "raw_average_key_size": 21, "raw_value_size": 587279, "raw_average_value_size": 1149, "num_data_blocks": 66, "num_entries": 511, "num_filter_entries": 511, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324014, "oldest_key_time": 1764324014, "file_creation_time": 1764324089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8739 microseconds, and 2993 cpu microseconds.
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.879109) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 599688 bytes OK
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.879134) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.881230) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.881254) EVENT_LOG_v1 {"time_micros": 1764324089881248, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.881276) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1329577, prev total WAL file size 1329901, number of live WAL files 2.
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.882053) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373630' seq:72057594037927935, type:22 .. '6D6772737461740034303132' seq:0, type:0; will stop at (end)
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(585KB)], [21(18MB)]
Nov 28 10:01:29 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089882160, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 20054923, "oldest_snapshot_seqno": -1}
Nov 28 10:01:29 np0005538515.localdomain dnsmasq[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/addn_hosts - 1 addresses
Nov 28 10:01:29 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/host
Nov 28 10:01:29 np0005538515.localdomain podman[307918]: 2025-11-28 10:01:29.997827079 +0000 UTC m=+0.058174862 container kill f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:01:29 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/opts
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12084 keys, 18080893 bytes, temperature: kUnknown
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324090023505, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 18080893, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18012921, "index_size": 36639, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30277, "raw_key_size": 324123, "raw_average_key_size": 26, "raw_value_size": 17808263, "raw_average_value_size": 1473, "num_data_blocks": 1394, "num_entries": 12084, "num_filter_entries": 12084, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.023709) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 18080893 bytes
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.026342) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.8 rd, 127.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.6 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(63.6) write-amplify(30.2) OK, records in: 12576, records dropped: 492 output_compression: NoCompression
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.026360) EVENT_LOG_v1 {"time_micros": 1764324090026353, "job": 10, "event": "compaction_finished", "compaction_time_micros": 141392, "compaction_time_cpu_micros": 50415, "output_level": 6, "num_output_files": 1, "total_output_size": 18080893, "num_input_records": 12576, "num_output_records": 12084, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324090026500, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324090028223, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:29.881919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.028255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.028261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.028264) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.028267) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:01:30.028270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:01:30 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:30.413 261346 INFO neutron.agent.dhcp.agent [None req-45ed9bd7-c99a-4676-a976-a30d6711f3f7 - - - - - -] DHCP configuration for ports {'22f795b4-891f-4309-a925-624689a94701'} is completed
Nov 28 10:01:31 np0005538515.localdomain ceph-mon[301134]: pgmap v110: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.0 MiB/s rd, 3.9 MiB/s wr, 105 op/s
Nov 28 10:01:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v111: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.9 MiB/s wr, 158 op/s
Nov 28 10:01:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:01:31 np0005538515.localdomain systemd[1]: tmp-crun.D40X4C.mount: Deactivated successfully.
Nov 28 10:01:31 np0005538515.localdomain podman[307938]: 2025-11-28 10:01:31.984090744 +0000 UTC m=+0.087525500 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:01:31 np0005538515.localdomain podman[307938]: 2025-11-28 10:01:31.999553842 +0000 UTC m=+0.102988588 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:01:32 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.038 280172 DEBUG nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 28 10:01:33 np0005538515.localdomain ceph-mon[301134]: pgmap v111: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.9 MiB/s wr, 158 op/s
Nov 28 10:01:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v112: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.9 MiB/s wr, 151 op/s
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.732 280172 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating tmpfile /var/lib/nova/instances/tmpx5ac6ig2 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.760 280172 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.783 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.783 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.792 280172 INFO nova.compute.rpcapi [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 28 10:01:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:33.793 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.431 280172 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.462 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.463 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.463 280172 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:01:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.912 280172 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.930 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.932 280172 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.933 280172 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating instance directory: /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.934 280172 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Ensure instance console log exists: /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.934 280172 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.936 280172 DEBUG nova.virt.libvirt.vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T10:01:29Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T10:01:29Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.936 280172 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.937 280172 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.938 280172 DEBUG os_vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 10:01:34 np0005538515.localdomain podman[307962]: 2025-11-28 10:01:34.970134713 +0000 UTC m=+0.077913499 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:01:34 np0005538515.localdomain podman[307962]: 2025-11-28 10:01:34.984503597 +0000 UTC m=+0.092282363 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.985 280172 DEBUG ovsdbapp.backend.ovs_idl [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.986 280172 DEBUG ovsdbapp.backend.ovs_idl [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.986 280172 DEBUG ovsdbapp.backend.ovs_idl [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.987 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.987 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.987 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.988 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.990 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:34.995 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:35 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.026 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.027 280172 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.027 280172 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.029 280172 INFO oslo.privsep.daemon [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpfkf_65jq/privsep.sock']
Nov 28 10:01:35 np0005538515.localdomain ceph-mon[301134]: pgmap v112: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.9 MiB/s wr, 151 op/s
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.255 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:35 np0005538515.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 28 10:01:35 np0005538515.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000007.scope: Consumed 13.173s CPU time.
Nov 28 10:01:35 np0005538515.localdomain systemd-machined[201641]: Machine qemu-1-instance-00000007 terminated.
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v113: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.9 MiB/s wr, 151 op/s
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.693 280172 INFO oslo.privsep.daemon [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Spawned new privsep daemon via rootwrap
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.584 307986 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.589 307986 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.593 307986 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.593 307986 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307986
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:01:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.959 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.960 280172 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62b8533f-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.960 280172 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62b8533f-b2, col_values=(('external_ids', {'iface-id': '62b8533f-b250-4475-80c2-28c4543536b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:68:3c', 'vm-uuid': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.966 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.968 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.970 280172 INFO os_vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2')
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.971 280172 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 10:01:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:35.972 280172 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.054 280172 INFO nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance shutdown successfully after 24 seconds.
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.066 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.067 280172 DEBUG nova.objects.instance [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.159 280172 INFO nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Beginning cold snapshot process
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.394 280172 DEBUG nova.virt.libvirt.imagebackend [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] No parent info for 85968a96-5a0e-43a4-9c04-3954f640a7ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Nov 28 10:01:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:36.428 280172 DEBUG nova.storage.rbd_utils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] creating snapshot(ea994f4a379648c6a3c88fbbe63e049a) on rbd image(7292509e-f294-4159-96e5-22d4712df2a0_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 28 10:01:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e98 e98: 6 total, 6 up, 6 in
Nov 28 10:01:37 np0005538515.localdomain ceph-mon[301134]: pgmap v113: 177 pgs: 177 active+clean; 271 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.9 MiB/s wr, 151 op/s
Nov 28 10:01:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:37.187 280172 DEBUG nova.storage.rbd_utils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] cloning vms/7292509e-f294-4159-96e5-22d4712df2a0_disk@ea994f4a379648c6a3c88fbbe63e049a to images/c045142b-5f2b-4f4d-80b7-ca5ee791067d clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 28 10:01:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:37.242 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:37 np0005538515.localdomain dnsmasq[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/addn_hosts - 0 addresses
Nov 28 10:01:37 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/host
Nov 28 10:01:37 np0005538515.localdomain dnsmasq-dhcp[307696]: read /var/lib/neutron/dhcp/8a9132e3-6bf5-4fa5-8eac-9650725d34b1/opts
Nov 28 10:01:37 np0005538515.localdomain podman[308074]: 2025-11-28 10:01:37.290642093 +0000 UTC m=+0.063638397 container kill f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:37.358 280172 DEBUG nova.storage.rbd_utils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] flattening images/c045142b-5f2b-4f4d-80b7-ca5ee791067d flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 28 10:01:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:37.466 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:37Z|00049|binding|INFO|Releasing lport 52bd411f-3a65-4ceb-b07c-480829869bbb from this chassis (sb_readonly=0)
Nov 28 10:01:37 np0005538515.localdomain kernel: device tap52bd411f-3a left promiscuous mode
Nov 28 10:01:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:37Z|00050|binding|INFO|Setting lport 52bd411f-3a65-4ceb-b07c-480829869bbb down in Southbound
Nov 28 10:01:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:37.482 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-8a9132e3-6bf5-4fa5-8eac-9650725d34b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8a9132e3-6bf5-4fa5-8eac-9650725d34b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ad5cc945c4a4ceda603318537f79333', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2ba1c0c-781e-42ae-a904-9ebfc98b36b5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=52bd411f-3a65-4ceb-b07c-480829869bbb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:37.485 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 52bd411f-3a65-4ceb-b07c-480829869bbb in datapath 8a9132e3-6bf5-4fa5-8eac-9650725d34b1 unbound from our chassis
Nov 28 10:01:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:37.489 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:37.489 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8a9132e3-6bf5-4fa5-8eac-9650725d34b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:37.491 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[03049ccf-6269-425f-bf3f-cf28893e1154]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v115: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 32 KiB/s wr, 87 op/s
Nov 28 10:01:38 np0005538515.localdomain ceph-mon[301134]: osdmap e98: 6 total, 6 up, 6 in
Nov 28 10:01:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:38.273 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:38.276 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:38.295 280172 DEBUG nova.storage.rbd_utils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] removing snapshot(ea994f4a379648c6a3c88fbbe63e049a) on rbd image(7292509e-f294-4159-96e5-22d4712df2a0_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Nov 28 10:01:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:38.569 280172 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Port 62b8533f-b250-4475-80c2-28c4543536b5 updated with migration profile {'migrating_to': 'np0005538515.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 10:01:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:38.572 280172 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 10:01:38 np0005538515.localdomain sshd[308154]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:01:38 np0005538515.localdomain sshd[308154]: Accepted publickey for nova from 172.17.0.106 port 36484 ssh2: ECDSA SHA256:i2iq7ecxWJi+/Y2y/tJQSdSgYGpWyMNC9YvRCAzXl2w
Nov 28 10:01:38 np0005538515.localdomain systemd-logind[763]: New session 73 of user nova.
Nov 28 10:01:38 np0005538515.localdomain systemd[1]: Created slice User Slice of UID 42436.
Nov 28 10:01:38 np0005538515.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 10:01:38 np0005538515.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 10:01:38 np0005538515.localdomain systemd[1]: Starting User Manager for UID 42436...
Nov 28 10:01:38 np0005538515.localdomain systemd[308158]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Queued start job for default target Main User Target.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Created slice User Application Slice.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Reached target Paths.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Reached target Timers.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Starting D-Bus User Message Bus Socket...
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Starting Create User's Volatile Files and Directories...
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Finished Create User's Volatile Files and Directories.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Listening on D-Bus User Message Bus Socket.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Reached target Sockets.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Reached target Basic System.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Reached target Main User Target.
Nov 28 10:01:39 np0005538515.localdomain systemd[308158]: Startup finished in 159ms.
Nov 28 10:01:39 np0005538515.localdomain systemd[1]: Started User Manager for UID 42436.
Nov 28 10:01:39 np0005538515.localdomain systemd[1]: Started Session 73 of User nova.
Nov 28 10:01:39 np0005538515.localdomain sshd[308154]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Nov 28 10:01:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e99 e99: 6 total, 6 up, 6 in
Nov 28 10:01:39 np0005538515.localdomain ceph-mon[301134]: pgmap v115: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 32 KiB/s wr, 87 op/s
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.244 280172 DEBUG nova.storage.rbd_utils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] creating snapshot(snap) on rbd image(c045142b-5f2b-4f4d-80b7-ca5ee791067d) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 28 10:01:39 np0005538515.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 28 10:01:39 np0005538515.localdomain kernel: device tap62b8533f-b2 entered promiscuous mode
Nov 28 10:01:39 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324099.3103] manager: (tap62b8533f-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/17)
Nov 28 10:01:39 np0005538515.localdomain systemd-udevd[308210]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:39 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324099.3234] device (tap62b8533f-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 10:01:39 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324099.3243] device (tap62b8533f-b2): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Nov 28 10:01:39 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:39Z|00051|binding|INFO|Claiming lport 62b8533f-b250-4475-80c2-28c4543536b5 for this additional chassis.
Nov 28 10:01:39 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:39Z|00052|binding|INFO|62b8533f-b250-4475-80c2-28c4543536b5: Claiming fa:16:3e:58:68:3c 10.100.0.12
Nov 28 10:01:39 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:39Z|00053|binding|INFO|Claiming lport fc82099a-3702-4952-add7-ba3d39b895a0 for this additional chassis.
Nov 28 10:01:39 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:39Z|00054|binding|INFO|fc82099a-3702-4952-add7-ba3d39b895a0: Claiming fa:16:3e:41:3c:a8 19.80.0.139
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.378 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.408 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:39 np0005538515.localdomain systemd-machined[201641]: New machine qemu-2-instance-00000008.
Nov 28 10:01:39 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:39Z|00055|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 ovn-installed in OVS
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.417 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:39 np0005538515.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000008.
Nov 28 10:01:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v117: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 518 KiB/s rd, 23 KiB/s wr, 28 op/s
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.683 280172 DEBUG nova.virt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Emitting event <LifecycleEvent: 1764324099.6833832, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:39.684 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Started (Lifecycle Event)
Nov 28 10:01:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e100 e100: 6 total, 6 up, 6 in
Nov 28 10:01:40 np0005538515.localdomain ceph-mon[301134]: osdmap e99: 6 total, 6 up, 6 in
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.320 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.341 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.341 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.341 280172 DEBUG nova.network.neutron [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.341 280172 DEBUG nova.objects.instance [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.384 280172 DEBUG nova.virt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Emitting event <LifecycleEvent: 1764324100.3844452, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.384 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Resumed (Lifecycle Event)
Nov 28 10:01:40 np0005538515.localdomain sshd[308174]: Received disconnect from 172.17.0.106 port 36484:11: disconnected by user
Nov 28 10:01:40 np0005538515.localdomain sshd[308174]: Disconnected from user nova 172.17.0.106 port 36484
Nov 28 10:01:40 np0005538515.localdomain sshd[308154]: pam_unix(sshd:session): session closed for user nova
Nov 28 10:01:40 np0005538515.localdomain systemd[1]: session-73.scope: Deactivated successfully.
Nov 28 10:01:40 np0005538515.localdomain systemd-logind[763]: Session 73 logged out. Waiting for processes to exit.
Nov 28 10:01:40 np0005538515.localdomain systemd-logind[763]: Removed session 73.
Nov 28 10:01:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:40.963 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:41.050 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:41.054 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:41 np0005538515.localdomain ceph-mon[301134]: pgmap v117: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 518 KiB/s rd, 23 KiB/s wr, 28 op/s
Nov 28 10:01:41 np0005538515.localdomain ceph-mon[301134]: osdmap e100: 6 total, 6 up, 6 in
Nov 28 10:01:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:41.608 280172 INFO nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Snapshot image upload complete
Nov 28 10:01:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:41.609 280172 DEBUG nova.compute.manager [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v119: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 7.8 MiB/s wr, 210 op/s
Nov 28 10:01:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:41.993 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] During the sync_power process the instance has moved from host np0005538513.localdomain to host np0005538515.localdomain
Nov 28 10:01:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:42.938 280172 INFO nova.compute.manager [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Shelve offloading
Nov 28 10:01:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:42.945 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:01:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:42.945 280172 DEBUG nova.compute.manager [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:42.948 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:42.952 280172 DEBUG nova.network.neutron [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:01:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:43.078 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:43.079 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.109 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.247 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:43 np0005538515.localdomain ceph-mon[301134]: pgmap v119: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 7.8 MiB/s wr, 210 op/s
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.464 280172 DEBUG nova.network.neutron [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.477 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.478 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.478 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.478 280172 DEBUG nova.network.neutron [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.479 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v120: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 7.1 MiB/s wr, 157 op/s
Nov 28 10:01:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:43.717 280172 DEBUG nova.network.neutron [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:01:43 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:01:43 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:43 np0005538515.localdomain podman[308282]: 2025-11-28 10:01:43.917814683 +0000 UTC m=+0.044395255 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:01:43 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.061 280172 DEBUG nova.network.neutron [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.154 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.160 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.161 280172 DEBUG nova.objects.instance [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'resources' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.256 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.257 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.257 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.257 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.258 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4067164618' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2900998411' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.416 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:44Z|00056|binding|INFO|Claiming lport 62b8533f-b250-4475-80c2-28c4543536b5 for this chassis.
Nov 28 10:01:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:44Z|00057|binding|INFO|62b8533f-b250-4475-80c2-28c4543536b5: Claiming fa:16:3e:58:68:3c 10.100.0.12
Nov 28 10:01:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:44Z|00058|binding|INFO|Claiming lport fc82099a-3702-4952-add7-ba3d39b895a0 for this chassis.
Nov 28 10:01:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:44Z|00059|binding|INFO|fc82099a-3702-4952-add7-ba3d39b895a0: Claiming fa:16:3e:41:3c:a8 19.80.0.139
Nov 28 10:01:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:44Z|00060|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 up in Southbound
Nov 28 10:01:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:44Z|00061|binding|INFO|Setting lport fc82099a-3702-4952-add7-ba3d39b895a0 up in Southbound
Nov 28 10:01:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:44.746 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:3c:a8 19.80.0.139'], port_security=['fa:16:3e:41:3c:a8 19.80.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['62b8533f-b250-4475-80c2-28c4543536b5'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-957922340', 'neutron:cidrs': '19.80.0.139/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-957922340', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=96ffb618-d617-4e8c-a498-acb365ae5313, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fc82099a-3702-4952-add7-ba3d39b895a0) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:44.751 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:68:3c 10.100.0.12'], port_security=['fa:16:3e:58:68:3c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-191355626', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-191355626', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b393f93f-1891-43a2-aa26-a4cab2642f74, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=62b8533f-b250-4475-80c2-28c4543536b5) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:44.755 158530 INFO neutron.agent.ovn.metadata.agent [-] Port fc82099a-3702-4952-add7-ba3d39b895a0 in datapath 492ef1de-4a68-49e4-b736-13cdb2eb7b59 bound to our chassis
Nov 28 10:01:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:44.762 158530 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 492ef1de-4a68-49e4-b736-13cdb2eb7b59
Nov 28 10:01:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:44 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/701571942' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.806 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.548s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:44.816 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 req-a0dc2833-791a-4b03-82d4-55f61a32c76b 9e5033e84dec44f4956046cabe7e22af e2c76e4d27554fd5a4f85cce208b136f - - default default] This port is not SRIOV, skip binding for port 62b8533f-b250-4475-80c2-28c4543536b5.
Nov 28 10:01:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e101 e101: 6 total, 6 up, 6 in
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.898 280172 INFO nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting instance files /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.899 280172 INFO nova.virt.libvirt.driver [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deletion of /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del complete
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.911 280172 DEBUG nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.912 280172 DEBUG nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.922 280172 DEBUG nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.923 280172 DEBUG nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.961 280172 DEBUG nova.virt.libvirt.host [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.962 280172 INFO nova.virt.libvirt.host [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] UEFI support detected
Nov 28 10:01:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:44.965 280172 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Post operation of migration started
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.013 280172 INFO nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Deleted allocations for instance 7292509e-f294-4159-96e5-22d4712df2a0
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.048 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.048 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.081 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:45 np0005538515.localdomain systemd[1]: tmp-crun.duZvx7.mount: Deactivated successfully.
Nov 28 10:01:45 np0005538515.localdomain dnsmasq[307696]: exiting on receipt of SIGTERM
Nov 28 10:01:45 np0005538515.localdomain podman[308361]: 2025-11-28 10:01:45.087594031 +0000 UTC m=+0.058622895 container kill f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:01:45 np0005538515.localdomain systemd[1]: libpod-f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b.scope: Deactivated successfully.
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.104 280172 DEBUG oslo_concurrency.processutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.132 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.133 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.133 280172 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:01:45 np0005538515.localdomain podman[308375]: 2025-11-28 10:01:45.135928743 +0000 UTC m=+0.033388731 container died f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:01:45 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:45Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:68:3c 10.100.0.12
Nov 28 10:01:45 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:45Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:68:3c 10.100.0.12
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.152 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.153 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11612MB free_disk=41.63758850097656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.153 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:45 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.183 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[bead1f9f-fa92-427a-9841-94cc87ada8b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.183 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap492ef1de-41 in ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.186 261619 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap492ef1de-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.186 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[cb746268-d067-46da-916a-bd1bc3d25b5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.187 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4f7d92-aa60-41d5-9041-fa494ebb4ca8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:45 np0005538515.localdomain podman[308375]: 2025-11-28 10:01:45.19035587 +0000 UTC m=+0.087815868 container remove f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8a9132e3-6bf5-4fa5-8eac-9650725d34b1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:01:45 np0005538515.localdomain systemd[1]: libpod-conmon-f0d54c45b79e93cce2c09d9f53a08f4697989faf756dae8aace02617f311ac7b.scope: Deactivated successfully.
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.205 158630 DEBUG oslo.privsep.daemon [-] privsep: reply[833479b1-9d56-4db1-855b-c3d01b023dde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.213 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccae5a1-8683-4fdb-8080-89edd82ea369]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.214 158530 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmplomwym1i/privsep.sock']
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: pgmap v120: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 7.1 MiB/s wr, 157 op/s
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1891951526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/701571942' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: osdmap e101: 6 total, 6 up, 6 in
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1054391208' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/947106529' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.544 280172 DEBUG oslo_concurrency.processutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.550 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.638 280172 ERROR nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [req-f76c6dc6-4923-4c12-b438-1eaa6086d7c3] Failed to update inventory to [{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}}] for resource provider with UUID 72fba1ca-0d86-48af-8a3d-510284dfd0e0.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-f76c6dc6-4923-4c12-b438-1eaa6086d7c3"}]}
Nov 28 10:01:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v122: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.3 MiB/s rd, 7.2 MiB/s wr, 159 op/s
Nov 28 10:01:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:45.732 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.787 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.808 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.809 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.821 158530 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.822 158530 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplomwym1i/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.714 308430 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.722 308430 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.726 308430 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.726 308430 INFO oslo.privsep.daemon [-] privsep daemon running as pid 308430
Nov 28 10:01:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:45.825 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[92f8ed77-fbcb-4c69-aee5-ddc4eb0ce7c5]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.829 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.867 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.936 280172 DEBUG oslo_concurrency.processutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:45.965 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-93a797d5722443d4145967b946d93624c8318af2c1cabbcd44f075fa585e6ab9-merged.mount: Deactivated successfully.
Nov 28 10:01:46 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d8a9132e3\x2d6bf5\x2d4fa5\x2d8eac\x2d9650725d34b1.mount: Deactivated successfully.
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.165 280172 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:46.208 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.228 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.257 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.275 308430 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.275 308430 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.275 308430 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/947106529' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.401 280172 DEBUG oslo_concurrency.processutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.408 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.467 280172 ERROR nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [req-688da9a6-1005-4ef1-9e59-c08a93ab4f5d] Failed to update inventory to [{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}}] for resource provider with UUID 72fba1ca-0d86-48af-8a3d-510284dfd0e0.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-688da9a6-1005-4ef1-9e59-c08a93ab4f5d"}]}
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.473 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.494 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.516 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.517 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.541 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.588 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:01:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:46.668 280172 DEBUG oslo_concurrency.processutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.782 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[56e0eb79-8330-40b6-ab1e-77c3852f4031]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324106.8097] manager: (tap492ef1de-40): new Veth device (/org/freedesktop/NetworkManager/Devices/18)
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.811 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[7c920fdb-7dcf-4131-ba2e-292292f51a1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain systemd-udevd[308464]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.847 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[85ccada5-b419-4d34-958f-3736153dc10f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.852 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[9a084e47-9187-4650-8940-db00218ff1ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap492ef1de-41: link becomes ready
Nov 28 10:01:46 np0005538515.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap492ef1de-40: link becomes ready
Nov 28 10:01:46 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324106.8751] device (tap492ef1de-40): carrier: link connected
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.880 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[07945255-82b2-4e53-9c83-21d52075cb8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.904 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[d83545aa-84d1-4a29-9265-b2d190cfb15f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap492ef1de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e3:7c:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1193102, 'reachable_time': 23541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308502, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.928 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[035577c0-09b3-4ef4-816e-394ed29b54c9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:7c76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1193102, 'tstamp': 1193102}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308503, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.948 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccf56c1-1485-4a00-aac8-1c88da13afa2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap492ef1de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e3:7c:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1193102, 'reachable_time': 23541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308504, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:46.984 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[6aaf7ddd-0ec7-4ef0-8abe-c5f5c2831656]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.051 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[2713abf2-bf40-4190-8958-00159c98d8a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.053 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap492ef1de-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.054 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.054 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap492ef1de-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.056 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:47 np0005538515.localdomain kernel: device tap492ef1de-40 entered promiscuous mode
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.060 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.061 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap492ef1de-40, col_values=(('external_ids', {'iface-id': '6838a8cb-20d7-44c7-aad3-e7f442484bd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.062 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:47 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:47Z|00062|binding|INFO|Releasing lport 6838a8cb-20d7-44c7-aad3-e7f442484bd5 from this chassis (sb_readonly=0)
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.076 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.078 158530 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.079 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[bde4614e-c7d4-440a-8c95-443938892fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.080 158530 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: global
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     log         /dev/log local0 debug
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     log-tag     haproxy-metadata-proxy-492ef1de-4a68-49e4-b736-13cdb2eb7b59
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     user        root
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     group       root
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     maxconn     1024
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     pidfile     /var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     daemon
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: defaults
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     log global
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     mode http
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     option httplog
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     option dontlognull
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     option http-server-close
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     option forwardfor
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     retries                 3
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout http-request    30s
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout connect         30s
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout client          32s
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout server          32s
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout http-keep-alive 30s
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: listen listener
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     bind 169.254.169.254:80
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:     http-request add-header X-OVN-Network-ID 492ef1de-4a68-49e4-b736-13cdb2eb7b59
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.081 158530 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'env', 'PROCESS_TAG=haproxy-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/492ef1de-4a68-49e4-b736-13cdb2eb7b59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.211 280172 DEBUG oslo_concurrency.processutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.221 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:01:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:47.237 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:46Z, description=, device_id=df7249ed-b97e-4670-9db6-41014e05ccf7, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61b3d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bac0>], id=db944860-27cb-4e81-92f6-1a891644e35c, ip_allocation=immediate, mac_address=fa:16:3e:76:13:cc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=723, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:01:47Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.289 280172 DEBUG nova.scheduler.client.report [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updated inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with generation 6 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.290 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 generation from 6 to 7 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.290 280172 DEBUG nova.compute.provider_tree [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.316 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.319 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 2.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:47 np0005538515.localdomain ceph-mon[301134]: pgmap v122: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.3 MiB/s rd, 7.2 MiB/s wr, 159 op/s
Nov 28 10:01:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2242950946' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3836043779' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.394 280172 DEBUG oslo_concurrency.lockutils [None req-9dd5035d-70c3-4cd3-a7df-eb80858bea87 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 35.487s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.397 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.397 280172 INFO nova.compute.manager [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Unshelving
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.411 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Migration for instance c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.435 280172 INFO nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating resource usage from migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.436 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Starting to track incoming migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99 with flavor 98f289d4-5c06-4ab5-9089-7b580870d676 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 28 10:01:47 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:01:47 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:47 np0005538515.localdomain podman[308543]: 2025-11-28 10:01:47.503171877 +0000 UTC m=+0.111317709 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:01:47 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.502 280172 WARNING nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Instance c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 has been moved to another host np0005538513.localdomain(np0005538513.localdomain). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.504 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance with task_state "unshelving" is not being actively managed by this compute host but has allocations referencing this compute node (72fba1ca-0d86-48af-8a3d-510284dfd0e0): {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocations during the task state transition. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1708
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.504 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.505 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=640MB phys_disk=41GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.523 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:47 np0005538515.localdomain podman[308564]: 
Nov 28 10:01:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:47.569 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:47 np0005538515.localdomain podman[308564]: 2025-11-28 10:01:47.575347782 +0000 UTC m=+0.121416616 container create 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:01:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65.scope.
Nov 28 10:01:47 np0005538515.localdomain podman[308564]: 2025-11-28 10:01:47.540481906 +0000 UTC m=+0.086550790 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:01:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85df380a7f1d8aaeea84d1e7053bb33d88e99eb8caf1cca6548cd15c84faacda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v123: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 6.4 MiB/s rd, 9.0 MiB/s wr, 264 op/s
Nov 28 10:01:47 np0005538515.localdomain podman[308564]: 2025-11-28 10:01:47.662293483 +0000 UTC m=+0.208362357 container init 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:01:47 np0005538515.localdomain podman[308564]: 2025-11-28 10:01:47.672095439 +0000 UTC m=+0.218164303 container start 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:01:47 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [NOTICE]   (308591) : New worker (308611) forked
Nov 28 10:01:47 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [NOTICE]   (308591) : Loading success.
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.735 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 62b8533f-b250-4475-80c2-28c4543536b5 in datapath ad2d8cf7-987d-4804-acbd-9b3e248dc8cd unbound from our chassis
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.738 158530 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2d8cf7-987d-4804-acbd-9b3e248dc8cd
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.747 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c449ad3c-2fec-4028-bf3e-65252d6509da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.748 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2d8cf7-91 in ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.750 261619 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2d8cf7-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.750 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa07c62-38cd-490c-a67a-9de416fcd981]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.751 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[313e942c-22f7-4a6c-a365-65ac87e0d2d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.767 158630 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f496e9-2b44-45f4-ab92-fa6fdf5f0312]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.778 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[075220ec-5554-48ad-9989-286c411bdd87]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:47.783 261346 INFO neutron.agent.dhcp.agent [None req-30ac3a63-3ad5-4a5c-9676-d34d9c100bb7 - - - - - -] DHCP configuration for ports {'db944860-27cb-4e81-92f6-1a891644e35c'} is completed
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.799 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[adfc7d8d-494d-407c-bc5e-59134f575d8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324107.8076] manager: (tapad2d8cf7-90): new Veth device (/org/freedesktop/NetworkManager/Devices/19)
Nov 28 10:01:47 np0005538515.localdomain systemd-udevd[308481]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.806 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa119cd-a449-4c35-8a77-b949a8258341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.835 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[742fee01-a888-45b5-bc1c-ee6ab5b0f550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.838 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3c2685-bf5d-492a-a91e-1033e96bdee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324107.8570] device (tapad2d8cf7-90): carrier: link connected
Nov 28 10:01:47 np0005538515.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapad2d8cf7-91: link becomes ready
Nov 28 10:01:47 np0005538515.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapad2d8cf7-90: link becomes ready
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.861 308430 DEBUG oslo.privsep.daemon [-] privsep: reply[93ff9bf1-2e52-4bc9-b131-6025bfdd5676]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.877 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[922fdef3-3ef0-4cb8-a878-06bfe884346c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2d8cf7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:14:78:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1193200, 'reachable_time': 18479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308632, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.892 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[86705772-a996-41aa-8ce3-f934d71c541b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:785b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1193200, 'tstamp': 1193200}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308633, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.908 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[911a23a8-2ef5-427f-a395-a8bdf864f4c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2d8cf7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:14:78:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1193200, 'reachable_time': 18479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308634, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.936 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[731e95e6-73a0-4a74-af6e-0dc40af5ff9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.991 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[eefb38c4-f45d-43a0-b630-bac67bad1b24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.993 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2d8cf7-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.993 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 10:01:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:47.994 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2d8cf7-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1206230784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.030 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:48 np0005538515.localdomain kernel: device tapad2d8cf7-90 entered promiscuous mode
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.039 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:48.039 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2d8cf7-90, col_values=(('external_ids', {'iface-id': 'acd4bbc3-c7c4-47d8-b58b-29abee48b714'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:48 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:48Z|00063|binding|INFO|Releasing lport acd4bbc3-c7c4-47d8-b58b-29abee48b714 from this chassis (sb_readonly=0)
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.052 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.054 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:48.055 158530 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:48.058 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[016208af-c7d7-4a39-8155-b9ad71b8a951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:48.059 158530 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: global
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     log         /dev/log local0 debug
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     log-tag     haproxy-metadata-proxy-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     user        root
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     group       root
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     maxconn     1024
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     pidfile     /var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     daemon
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: defaults
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     log global
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     mode http
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     option httplog
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     option dontlognull
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     option http-server-close
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     option forwardfor
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     retries                 3
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout http-request    30s
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout connect         30s
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout client          32s
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout server          32s
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     timeout http-keep-alive 30s
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: listen listener
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     bind 169.254.169.254:80
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:     http-request add-header X-OVN-Network-ID ad2d8cf7-987d-4804-acbd-9b3e248dc8cd
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 10:01:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:48.061 158530 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'env', 'PROCESS_TAG=haproxy-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.062 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.083 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.109 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.110 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.111 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 1.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.111 280172 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.112 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.120 280172 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.121 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:48 np0005538515.localdomain virtqemud[227736]: Domain id=2 name='instance-00000008' uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 is tainted: custom-monitor
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.136 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.149 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.150 280172 INFO nova.compute.claims [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Claim successful on node np0005538515.localdomain
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.251 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.330 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:48 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1206230784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.408 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:48 np0005538515.localdomain podman[308667]: 
Nov 28 10:01:48 np0005538515.localdomain podman[308667]: 2025-11-28 10:01:48.471885921 +0000 UTC m=+0.105184404 container create b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:01:48 np0005538515.localdomain systemd[1]: Started libpod-conmon-b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9.scope.
Nov 28 10:01:48 np0005538515.localdomain podman[308667]: 2025-11-28 10:01:48.427947121 +0000 UTC m=+0.061245594 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:01:48 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:01:48 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a768f3cd9f334ca2043e41bc50d9940a576b305a62ce3d0d03be668021f18a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:01:48 np0005538515.localdomain podman[308667]: 2025-11-28 10:01:48.546374075 +0000 UTC m=+0.179672488 container init b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:01:48 np0005538515.localdomain systemd[1]: tmp-crun.j9tA6D.mount: Deactivated successfully.
Nov 28 10:01:48 np0005538515.localdomain podman[308667]: 2025-11-28 10:01:48.56538382 +0000 UTC m=+0.198682263 container start b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:48 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [NOTICE]   (308705) : New worker (308707) forked
Nov 28 10:01:48 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [NOTICE]   (308705) : Loading success.
Nov 28 10:01:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:01:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3121132784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.792 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.798 280172 DEBUG nova.compute.provider_tree [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.855 280172 DEBUG nova.scheduler.client.report [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.890 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.933 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.933 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.934 280172 DEBUG nova.network.neutron [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:01:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:48.967 280172 DEBUG nova.network.neutron [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.109 280172 DEBUG nova.network.neutron [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.112 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.128 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.130 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.131 280172 INFO nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating image(s)
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.166 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.170 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.172 280172 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.231 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.268 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.272 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "f96cefa575d4e71026b7a689ed51daf234dda618" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.274 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "f96cefa575d4e71026b7a689ed51daf234dda618" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.327 280172 DEBUG nova.virt.libvirt.imagebackend [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image locations are: [{'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/c045142b-5f2b-4f4d-80b7-ca5ee791067d/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/c045142b-5f2b-4f4d-80b7-ca5ee791067d/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Nov 28 10:01:49 np0005538515.localdomain ceph-mon[301134]: pgmap v123: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 6.4 MiB/s rd, 9.0 MiB/s wr, 264 op/s
Nov 28 10:01:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3121132784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.406 280172 DEBUG nova.virt.libvirt.imagebackend [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Selected location: {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/c045142b-5f2b-4f4d-80b7-ca5ee791067d/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.407 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] cloning images/c045142b-5f2b-4f4d-80b7-ca5ee791067d@snap to None/7292509e-f294-4159-96e5-22d4712df2a0_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.546 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "f96cefa575d4e71026b7a689ed51daf234dda618" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v124: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 5.4 MiB/s rd, 7.6 MiB/s wr, 224 op/s
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.706 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'migration_context' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:49.775 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] flattening vms/7292509e-f294-4159-96e5-22d4712df2a0_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 28 10:01:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:01:49 np0005538515.localdomain systemd[1]: tmp-crun.evCG8p.mount: Deactivated successfully.
Nov 28 10:01:49 np0005538515.localdomain podman[308932]: 2025-11-28 10:01:49.995729534 +0000 UTC m=+0.099826033 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41)
Nov 28 10:01:50 np0005538515.localdomain podman[308932]: 2025-11-28 10:01:50.010393827 +0000 UTC m=+0.114490366 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.178 280172 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.184 280172 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.206 280172 DEBUG nova.objects.instance [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.591 280172 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764324095.5905724, 7292509e-f294-4159-96e5-22d4712df2a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.592 280172 INFO nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Stopped (Lifecycle Event)
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.617 280172 DEBUG nova.compute.manager [None req-ce46f1b3-617e-478c-a97e-8ffeffe5d69b - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.738 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Image rbd:vms/7292509e-f294-4159-96e5-22d4712df2a0_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.739 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.740 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Ensure instance console log exists: /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.740 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.741 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.741 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.743 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-11-28T10:01:11Z,direct_url=<?>,disk_format='raw',id=c045142b-5f2b-4f4d-80b7-ca5ee791067d,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-650509197-shelved',owner='a30386ba68ee46f4a1bac43cf415f3a4',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-11-28T10:01:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_secret_uuid': None, 'guest_format': None, 'size': 0, 'encryption_options': None, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encrypted': False, 'device_name': '/dev/vda', 'image_id': '85968a96-5a0e-43a4-9c04-3954f640a7ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.749 280172 WARNING nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.751 280172 DEBUG nova.virt.libvirt.host [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Searching host: 'np0005538515.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.752 280172 DEBUG nova.virt.libvirt.host [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.753 280172 DEBUG nova.virt.libvirt.host [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Searching host: 'np0005538515.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.754 280172 DEBUG nova.virt.libvirt.host [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.755 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.756 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T09:59:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98f289d4-5c06-4ab5-9089-7b580870d676',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-11-28T10:01:11Z,direct_url=<?>,disk_format='raw',id=c045142b-5f2b-4f4d-80b7-ca5ee791067d,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-650509197-shelved',owner='a30386ba68ee46f4a1bac43cf415f3a4',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2025-11-28T10:01:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.756 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.757 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.757 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.758 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.758 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.759 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.759 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.760 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.760 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.760 280172 DEBUG nova.virt.hardware [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.761 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.776 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:50.845 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:50.846 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:50.847 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: Stopping User Manager for UID 42436...
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Activating special unit Exit the Session...
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped target Main User Target.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped target Basic System.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped target Paths.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped target Sockets.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped target Timers.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Closed D-Bus User Message Bus Socket.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Stopped Create User's Volatile Files and Directories.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Removed slice User Application Slice.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Reached target Shutdown.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Finished Exit the Session.
Nov 28 10:01:50 np0005538515.localdomain systemd[308158]: Reached target Exit the Session.
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: Stopped User Manager for UID 42436.
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 10:01:50 np0005538515.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 10:01:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:50.968 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1415036771' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.273 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.311 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.316 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: pgmap v124: 177 pgs: 177 active+clean; 304 MiB data, 1012 MiB used, 41 GiB / 42 GiB avail; 5.4 MiB/s rd, 7.6 MiB/s wr, 224 op/s
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3559263610' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1415036771' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v125: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 7.2 MiB/s wr, 200 op/s
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2304873447' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.736 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.741 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.758 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] End _get_guest_xml xml=<domain type="kvm">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <uuid>7292509e-f294-4159-96e5-22d4712df2a0</uuid>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <name>instance-00000007</name>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <memory>131072</memory>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <vcpu>1</vcpu>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <metadata>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-650509197</nova:name>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:creationTime>2025-11-28 10:01:50</nova:creationTime>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:flavor name="m1.nano">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:memory>128</nova:memory>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:disk>1</nova:disk>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:swap>0</nova:swap>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:vcpus>1</nova:vcpus>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       </nova:flavor>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:owner>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:user uuid="28578129c91d407a92af609ba8bac430">tempest-UnshelveToHostMultiNodesTest-426973173-project-member</nova:user>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <nova:project uuid="a30386ba68ee46f4a1bac43cf415f3a4">tempest-UnshelveToHostMultiNodesTest-426973173</nova:project>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       </nova:owner>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:root type="image" uuid="c045142b-5f2b-4f4d-80b7-ca5ee791067d"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <nova:ports/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </nova:instance>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </metadata>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <sysinfo type="smbios">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <system>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <entry name="manufacturer">RDO</entry>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <entry name="product">OpenStack Compute</entry>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <entry name="serial">7292509e-f294-4159-96e5-22d4712df2a0</entry>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <entry name="uuid">7292509e-f294-4159-96e5-22d4712df2a0</entry>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <entry name="family">Virtual Machine</entry>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </system>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </sysinfo>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <os>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <boot dev="hd"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <smbios mode="sysinfo"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </os>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <features>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <acpi/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <apic/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <vmcoreinfo/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </features>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <clock offset="utc">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <timer name="hpet" present="no"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </clock>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <cpu mode="host-model" match="exact">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </cpu>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   <devices>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <disk type="network" device="disk">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <driver type="raw" cache="none"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <source protocol="rbd" name="vms/7292509e-f294-4159-96e5-22d4712df2a0_disk">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       </source>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <auth username="openstack">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       </auth>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <target dev="vda" bus="virtio"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <disk type="network" device="cdrom">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <driver type="raw" cache="none"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <source protocol="rbd" name="vms/7292509e-f294-4159-96e5-22d4712df2a0_disk.config">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.103" port="6789"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.104" port="6789"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <host name="172.18.0.105" port="6789"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       </source>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <auth username="openstack">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:         <secret type="ceph" uuid="2c5417c9-00eb-57d5-a565-ddecbc7995c1"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       </auth>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <target dev="sda" bus="sata"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </disk>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <serial type="pty">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <log file="/var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log" append="off"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </serial>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <video>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <model type="virtio"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </video>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <input type="tablet" bus="usb"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <input type="keyboard" bus="usb"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <rng model="virtio">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <backend model="random">/dev/urandom</backend>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </rng>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <controller type="usb" index="0"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     <memballoon model="virtio">
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:       <stats period="10"/>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:     </memballoon>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:   </devices>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: </domain>
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.809 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.810 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.811 280172 INFO nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Using config drive
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.851 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.870 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.912 280172 DEBUG nova.objects.instance [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'keypairs' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.983 280172 INFO nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating config drive at /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config
Nov 28 10:01:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:51.989 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpox96wfi0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.039 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.040 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.041 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.041 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.041 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.043 280172 INFO nova.compute.manager [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Terminating instance
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.045 280172 DEBUG nova.compute.manager [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.119 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpox96wfi0" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:52 np0005538515.localdomain kernel: device tap62b8533f-b2 left promiscuous mode
Nov 28 10:01:52 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324112.1426] device (tap62b8533f-b2): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.189 280172 DEBUG nova.storage.rbd_utils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00064|binding|INFO|Releasing lport 62b8533f-b250-4475-80c2-28c4543536b5 from this chassis (sb_readonly=0)
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00065|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 down in Southbound
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00066|binding|INFO|Releasing lport fc82099a-3702-4952-add7-ba3d39b895a0 from this chassis (sb_readonly=0)
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00067|binding|INFO|Setting lport fc82099a-3702-4952-add7-ba3d39b895a0 down in Southbound
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00068|binding|INFO|Removing iface tap62b8533f-b2 ovn-installed in OVS
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00069|binding|INFO|Releasing lport 6838a8cb-20d7-44c7-aad3-e7f442484bd5 from this chassis (sb_readonly=0)
Nov 28 10:01:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:01:52Z|00070|binding|INFO|Releasing lport acd4bbc3-c7c4-47d8-b58b-29abee48b714 from this chassis (sb_readonly=0)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.205 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.207 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:3c:a8 19.80.0.139'], port_security=['fa:16:3e:41:3c:a8 19.80.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['62b8533f-b250-4475-80c2-28c4543536b5'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-957922340', 'neutron:cidrs': '19.80.0.139/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-957922340', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=96ffb618-d617-4e8c-a498-acb365ae5313, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fc82099a-3702-4952-add7-ba3d39b895a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.212 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:68:3c 10.100.0.12'], port_security=['fa:16:3e:58:68:3c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-191355626', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-191355626', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b393f93f-1891-43a2-aa26-a4cab2642f74, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=62b8533f-b250-4475-80c2-28c4543536b5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.215 158530 INFO neutron.agent.ovn.metadata.agent [-] Port fc82099a-3702-4952-add7-ba3d39b895a0 in datapath 492ef1de-4a68-49e4-b736-13cdb2eb7b59 unbound from our chassis
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.221 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000008.scope: Consumed 3.943s CPU time.
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.226 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 492ef1de-4a68-49e4-b736-13cdb2eb7b59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:52 np0005538515.localdomain systemd-machined[201641]: Machine qemu-2-instance-00000008 terminated.
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.228 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[709d5b21-0381-489d-b2c4-19a970c47e4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.229 158530 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 namespace which is not needed anymore
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.232 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.246 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324112.2611] manager: (tap62b8533f-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.282 280172 INFO nova.virt.libvirt.driver [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance destroyed successfully.
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.283 280172 DEBUG nova.objects.instance [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lazy-loading 'resources' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.297 280172 DEBUG nova.virt.libvirt.vif [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538515.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T10:01:29Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005538515.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T10:01:50Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.298 280172 DEBUG nova.network.os_vif_util [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.300 280172 DEBUG nova.network.os_vif_util [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.300 280172 DEBUG os_vif [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.303 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.303 280172 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62b8533f-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.305 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.307 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.311 280172 INFO os_vif [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2')
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: tmp-crun.GzoSAL.mount: Deactivated successfully.
Nov 28 10:01:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2304873447' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:01:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1128492412' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [NOTICE]   (308591) : haproxy version is 2.8.14-c23fe91
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [NOTICE]   (308591) : path to executable is /usr/sbin/haproxy
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [WARNING]  (308591) : Exiting Master process...
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [WARNING]  (308591) : Exiting Master process...
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [ALERT]    (308591) : Current worker (308611) exited with code 143 (Terminated)
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[308585]: [WARNING]  (308591) : All workers exited. Exiting... (0)
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: libpod-2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65.scope: Deactivated successfully.
Nov 28 10:01:52 np0005538515.localdomain podman[309124]: 2025-11-28 10:01:52.408794434 +0000 UTC m=+0.062557644 container died 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.415 280172 DEBUG oslo_concurrency.processutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.415 280172 INFO nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting local config drive /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config because it was imported into RBD.
Nov 28 10:01:52 np0005538515.localdomain podman[309124]: 2025-11-28 10:01:52.442486163 +0000 UTC m=+0.096249353 container cleanup 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:01:52 np0005538515.localdomain podman[309144]: 2025-11-28 10:01:52.472279345 +0000 UTC m=+0.059026857 container cleanup 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: libpod-conmon-2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65.scope: Deactivated successfully.
Nov 28 10:01:52 np0005538515.localdomain systemd-machined[201641]: New machine qemu-3-instance-00000007.
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Nov 28 10:01:52 np0005538515.localdomain podman[309161]: 2025-11-28 10:01:52.547167721 +0000 UTC m=+0.077309310 container remove 2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.556 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d446a8-0b2b-44b4-b3ee-33e569acb7be]: (4, ('Fri Nov 28 10:01:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 (2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65)\n2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65\nFri Nov 28 10:01:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 (2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65)\n2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.559 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[36a2f72a-c5f8-4f2a-886c-b0ad457aa943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.560 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap492ef1de-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.562 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.572 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain kernel: device tap492ef1de-40 left promiscuous mode
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.575 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.580 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[eade3d9f-ed08-4f4a-b76b-9de1d857e182]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.593 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[35bed441-6c5c-4beb-a871-85ee05007d01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.594 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[fecfebec-4848-409d-b1ef-6c30351df2d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.611 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[2085ad7c-1e33-4ff4-9cb2-c4c128d41dd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1193092, 'reachable_time': 18342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309194, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.621 158630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.622 158630 DEBUG oslo.privsep.daemon [-] privsep: reply[da9db098-6708-4a0f-9d00-0149d8a190f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.624 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 62b8533f-b250-4475-80c2-28c4543536b5 in datapath ad2d8cf7-987d-4804-acbd-9b3e248dc8cd unbound from our chassis
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.627 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.628 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[420b00ed-f076-4c45-acdf-bbb9f919d125]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.629 158530 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd namespace which is not needed anymore
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.811 280172 DEBUG nova.virt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Emitting event <LifecycleEvent: 1764324112.8115134, 7292509e-f294-4159-96e5-22d4712df2a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.812 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Resumed (Lifecycle Event)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.815 280172 DEBUG nova.compute.manager [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.816 280172 DEBUG nova.virt.libvirt.driver [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.820 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance spawned successfully.
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [NOTICE]   (308705) : haproxy version is 2.8.14-c23fe91
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [NOTICE]   (308705) : path to executable is /usr/sbin/haproxy
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [WARNING]  (308705) : Exiting Master process...
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [WARNING]  (308705) : Exiting Master process...
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [ALERT]    (308705) : Current worker (308707) exited with code 143 (Terminated)
Nov 28 10:01:52 np0005538515.localdomain neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[308699]: [WARNING]  (308705) : All workers exited. Exiting... (0)
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: libpod-b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9.scope: Deactivated successfully.
Nov 28 10:01:52 np0005538515.localdomain podman[309253]: 2025-11-28 10:01:52.833861987 +0000 UTC m=+0.083604112 container died b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.849 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.857 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:52 np0005538515.localdomain podman[309253]: 2025-11-28 10:01:52.884120387 +0000 UTC m=+0.133862572 container cleanup b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.890 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.891 280172 DEBUG nova.virt.driver [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] Emitting event <LifecycleEvent: 1764324112.8168766, 7292509e-f294-4159-96e5-22d4712df2a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.891 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Started (Lifecycle Event)
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.912 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.916 280172 DEBUG nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.939 280172 INFO nova.virt.libvirt.driver [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deleting instance files /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_del
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.940 280172 INFO nova.virt.libvirt.driver [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deletion of /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_del complete
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.942 280172 INFO nova.compute.manager [None req-1a57df67-0f32-4e7a-ae01-b3a64cdd1e9a - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 10:01:52 np0005538515.localdomain podman[309268]: 2025-11-28 10:01:52.952473315 +0000 UTC m=+0.114868686 container cleanup b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:01:52 np0005538515.localdomain systemd[1]: libpod-conmon-b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9.scope: Deactivated successfully.
Nov 28 10:01:52 np0005538515.localdomain podman[309283]: 2025-11-28 10:01:52.980607247 +0000 UTC m=+0.072985630 container remove b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.984 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2fd949-ad34-4724-aeab-8b08e11f9de1]: (4, ('Fri Nov 28 10:01:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd (b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9)\nb83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9\nFri Nov 28 10:01:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd (b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9)\nb83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.985 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[97f156a8-0150-4d12-9211-f547b838e017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.986 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2d8cf7-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.987 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain kernel: device tapad2d8cf7-90 left promiscuous mode
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.989 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:52.991 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[fb48d62e-d90b-4819-88c5-ce1005deba62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:52.995 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:53 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:53.009 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[41c09b17-15ac-49ff-b8cf-3190024d680c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:53 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:53.010 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[e847cb6e-d5dc-4d99-a5e9-8861d2961a46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:53 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:53.022 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[a9acf74d-b58b-4611-b574-bd51a1764b87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1193194, 'reachable_time': 44107, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309302, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:53 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:53.024 158630 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 10:01:53 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:01:53.024 158630 DEBUG oslo.privsep.daemon [-] privsep: reply[a4e78a62-09a3-4129-b1e0-07877d696ddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:01:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:53.033 280172 INFO nova.compute.manager [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 0.99 seconds to destroy the instance on the hypervisor.
Nov 28 10:01:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:53.033 280172 DEBUG oslo.service.loopingcall [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 10:01:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:53.034 280172 DEBUG nova.compute.manager [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 10:01:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:53.034 280172 DEBUG nova.network.neutron [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 10:01:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:53.062 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:52Z, description=, device_id=ff13b2c3-ffbb-486b-ba3a-fa0f2960342d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce599220>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5990d0>], id=57c70dff-855f-436b-a33c-5f3b79153011, ip_allocation=immediate, mac_address=fa:16:3e:58:c4:db, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=756, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:01:52Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:01:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:53.293 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:53 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:01:53 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:01:53 np0005538515.localdomain podman[309318]: 2025-11-28 10:01:53.295435054 +0000 UTC m=+0.091599523 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:01:53 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: tmp-crun.6XLSz6.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0a768f3cd9f334ca2043e41bc50d9940a576b305a62ce3d0d03be668021f18a6-merged.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b83b95d2190600c1dbe7db51200edb3cc9a4c1a1dce37a308e98f94a4ead0bc9-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: run-netns-ovnmeta\x2dad2d8cf7\x2d987d\x2d4804\x2dacbd\x2d9b3e248dc8cd.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-85df380a7f1d8aaeea84d1e7053bb33d88e99eb8caf1cca6548cd15c84faacda-merged.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a677fb10ee4b5d09cc54f10906d47c5af3cbfa4d5a75fef5f8a83ebbe812c65-userdata-shm.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain systemd[1]: run-netns-ovnmeta\x2d492ef1de\x2d4a68\x2d49e4\x2db736\x2d13cdb2eb7b59.mount: Deactivated successfully.
Nov 28 10:01:53 np0005538515.localdomain ceph-mon[301134]: pgmap v125: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 7.2 MiB/s wr, 200 op/s
Nov 28 10:01:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e102 e102: 6 total, 6 up, 6 in
Nov 28 10:01:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:53.559 261346 INFO neutron.agent.dhcp.agent [None req-4bc49313-c594-4a9c-9251-e1fc93d41348 - - - - - -] DHCP configuration for ports {'57c70dff-855f-436b-a33c-5f3b79153011'} is completed
Nov 28 10:01:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v127: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.8 MiB/s rd, 8.2 MiB/s wr, 228 op/s
Nov 28 10:01:54 np0005538515.localdomain ceph-mon[301134]: osdmap e102: 6 total, 6 up, 6 in
Nov 28 10:01:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:54.719 280172 DEBUG nova.compute.manager [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:01:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:54.734 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:54.876 280172 DEBUG oslo_concurrency.lockutils [None req-9b118e69-1b4e-4a80-93b6-9ea2d37c0d65 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 7.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.369 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:55 np0005538515.localdomain ceph-mon[301134]: pgmap v127: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.8 MiB/s rd, 8.2 MiB/s wr, 228 op/s
Nov 28 10:01:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v128: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 7.2 MiB/s wr, 200 op/s
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.742 280172 DEBUG nova.network.neutron [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.758 280172 INFO nova.compute.manager [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 2.72 seconds to deallocate network for instance.
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.822 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.822 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.825 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.864 280172 INFO nova.scheduler.client.report [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Deleted allocations for instance c06e2ffc-a8af-41b6-ab88-680ef1f6fe50
Nov 28 10:01:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:55.951 280172 DEBUG oslo_concurrency.lockutils [None req-f01523fe-fe44-4637-97d1-2ed008e509d0 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.911s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:01:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:56.304 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:01:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:56.305 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:01:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:56.305 280172 INFO nova.compute.manager [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Shelving
Nov 28 10:01:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:56.324 280172 DEBUG nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 28 10:01:56 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:56.416 2 INFO neutron.agent.securitygroups_rpc [req-49cc58ff-e4e8-45be-b0c0-595b2c881c34 req-2439bbec-210c-4eb9-989c-4cbc137e5d8d 76caaf04f9e5427ca10e0bb020dbffa2 6fec370fed684ed6ba04de00336f61ee - - default default] Security group rule updated ['4f7b9341-d4bb-4bbc-a8bf-917ce0b68881']
Nov 28 10:01:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:01:56 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:56.959 2 INFO neutron.agent.securitygroups_rpc [req-d6a9207a-32bb-417d-a1a8-33a725f0d00f req-76509a4e-6eff-4420-ad31-f2903ff65806 76caaf04f9e5427ca10e0bb020dbffa2 6fec370fed684ed6ba04de00336f61ee - - default default] Security group rule updated ['4f7b9341-d4bb-4bbc-a8bf-917ce0b68881']
Nov 28 10:01:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:57.307 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:57 np0005538515.localdomain ceph-mon[301134]: pgmap v128: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 7.2 MiB/s wr, 200 op/s
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:01:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:01:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:01:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v129: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 241 op/s
Nov 28 10:01:57 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:57.743 2 INFO neutron.agent.securitygroups_rpc [None req-e0232781-0774-46d7-9ff8-6308f0f3831b 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:58.296 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:58 np0005538515.localdomain ceph-mon[301134]: pgmap v129: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 241 op/s
Nov 28 10:01:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:01:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:01:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:01:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158154 "" "Go-http-client/1.1"
Nov 28 10:01:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:01:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19674 "" "Go-http-client/1.1"
Nov 28 10:01:59 np0005538515.localdomain snmpd[68067]: empty variable list in _query
Nov 28 10:01:59 np0005538515.localdomain snmpd[68067]: empty variable list in _query
Nov 28 10:01:59 np0005538515.localdomain snmpd[68067]: empty variable list in _query
Nov 28 10:01:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v130: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 241 op/s
Nov 28 10:01:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:01:59.698 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:01:59 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:01:59.807 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:59Z, description=, device_id=8d6dcd20-92ab-47ad-ac9d-52244fd1b9b4, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f9d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f96d0>], id=a5e94566-6d46-488f-ab71-30296b099db4, ip_allocation=immediate, mac_address=fa:16:3e:3c:b9:19, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=810, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:01:59Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:01:59 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:01:59.862 2 INFO neutron.agent.securitygroups_rpc [None req-34f90ada-ae7e-4d6e-90c9-94029146836e 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 10:01:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:01:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e103 e103: 6 total, 6 up, 6 in
Nov 28 10:01:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:01:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:01:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:02:00 np0005538515.localdomain podman[309345]: 2025-11-28 10:01:59.999907694 +0000 UTC m=+0.099428301 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 28 10:02:00 np0005538515.localdomain podman[309351]: 2025-11-28 10:02:00.029325593 +0000 UTC m=+0.113188245 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:02:00 np0005538515.localdomain podman[309345]: 2025-11-28 10:02:00.084487503 +0000 UTC m=+0.184008140 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 10:02:00 np0005538515.localdomain systemd[1]: tmp-crun.a6vCPt.mount: Deactivated successfully.
Nov 28 10:02:00 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:02:00 np0005538515.localdomain podman[309351]: 2025-11-28 10:02:00.114512742 +0000 UTC m=+0.198375434 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:02:00 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:02:00 np0005538515.localdomain podman[309410]: 2025-11-28 10:02:00.136362803 +0000 UTC m=+0.069131993 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:00 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 6 addresses
Nov 28 10:02:00 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:00 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:00 np0005538515.localdomain podman[309344]: 2025-11-28 10:02:00.100147757 +0000 UTC m=+0.199635333 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:02:00 np0005538515.localdomain podman[309344]: 2025-11-28 10:02:00.180536719 +0000 UTC m=+0.280024385 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 10:02:00 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:02:00 np0005538515.localdomain podman[309342]: 2025-11-28 10:02:00.251420205 +0000 UTC m=+0.352712874 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 10:02:00 np0005538515.localdomain podman[309342]: 2025-11-28 10:02:00.260581012 +0000 UTC m=+0.361873701 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:02:00 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:02:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:00.334 261346 INFO neutron.agent.dhcp.agent [None req-69db6ddf-6552-497f-ab95-7f82cc1729d9 - - - - - -] DHCP configuration for ports {'a5e94566-6d46-488f-ab71-30296b099db4'} is completed
Nov 28 10:02:00 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:00.871 2 INFO neutron.agent.securitygroups_rpc [None req-42bd8e77-bdc1-4bfe-abe6-7d585fdf99bb 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group rule updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:00 np0005538515.localdomain ceph-mon[301134]: pgmap v130: 177 pgs: 177 active+clean; 225 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 241 op/s
Nov 28 10:02:00 np0005538515.localdomain ceph-mon[301134]: osdmap e103: 6 total, 6 up, 6 in
Nov 28 10:02:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:00.984 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.002 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}5d3ce7f1fffa4bc48553188980fcc5a0f98928b083e87362f9a3663b98ca8926" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.112 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 954 Content-Type: application/json Date: Fri, 28 Nov 2025 10:02:01 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-868e1cbc-f490-4365-854d-da0a5b96df09 x-openstack-request-id: req-868e1cbc-f490-4365-854d-da0a5b96df09 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.113 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "6c0b67a0-46d4-481b-87df-bc5abc74bfe1", "name": "m1.micro", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/6c0b67a0-46d4-481b-87df-bc5abc74bfe1"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/6c0b67a0-46d4-481b-87df-bc5abc74bfe1"}]}, {"id": "98f289d4-5c06-4ab5-9089-7b580870d676", "name": "m1.nano", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/98f289d4-5c06-4ab5-9089-7b580870d676"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/98f289d4-5c06-4ab5-9089-7b580870d676"}]}, {"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.113 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-868e1cbc-f490-4365-854d-da0a5b96df09 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.119 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/98f289d4-5c06-4ab5-9089-7b580870d676 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}5d3ce7f1fffa4bc48553188980fcc5a0f98928b083e87362f9a3663b98ca8926" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.137 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 493 Content-Type: application/json Date: Fri, 28 Nov 2025 10:02:01 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-15cea7a1-779e-444e-86f9-112352ff567e x-openstack-request-id: req-15cea7a1-779e-444e-86f9-112352ff567e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.137 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "98f289d4-5c06-4ab5-9089-7b580870d676", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/98f289d4-5c06-4ab5-9089-7b580870d676"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/98f289d4-5c06-4ab5-9089-7b580870d676"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.138 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/98f289d4-5c06-4ab5-9089-7b580870d676 used request id req-15cea7a1-779e-444e-86f9-112352ff567e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.139 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7292509e-f294-4159-96e5-22d4712df2a0', 'name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'np0005538515.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'user_id': '28578129c91d407a92af609ba8bac430', 'hostId': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.148 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.194 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.194 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 7292509e-f294-4159-96e5-22d4712df2a0: ceilometer.compute.pollsters.NoVolumeException
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.210 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.211 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b366c60a-2a6b-4155-930f-8db93d806276', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.195105', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48a3ad70-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.418562253, 'message_signature': '17a25bae4a4eb77ae60e0fd32e92f09d344e0688f3e21347717a58cea8cdf717'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.195105', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48a3c936-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.418562253, 'message_signature': '8474cbfdad15815a7123c1777c9ca76fb89eee1eef07e27ae105e1cb8128d181'}]}, 'timestamp': '2025-11-28 10:02:01.212638', '_unique_id': '4a61b93a5d0447ce95a7d7c78408552c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.224 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.266 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.268 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b06f872-6c1c-4c88-b9d6-d2a29597c59f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.230321', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48ac5470-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'c5a5b547658c2ae100048ddcc45db4ff5d5180b52c771f10427ddedac8c2eca4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 
'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.230321', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48ac65dc-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'e7eb0a28ac27128d923a7dde225d23238500df32c5d8a9d2176f149e88fa96c9'}]}, 'timestamp': '2025-11-28 10:02:01.268534', '_unique_id': '3275019ec3bb4c94ae84f163eef73d2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.270 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.273 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.273 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.273 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9acb507b-5041-4611-89ca-4ee95dda1ef4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.273459', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48ad3480-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'd87d9711d2f9992f8054f2903f7e9b004cec538b3bf0da1c5f404002128ad6c4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 
'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.273459', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48ad4182-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': '3570179c06c02c299b35b55687f99c2cde83698dcfb875bb35c91732362e27ae'}]}, 'timestamp': '2025-11-28 10:02:01.274776', '_unique_id': '7fd9637f2b5741df8b4e59e596a59e41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.277 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.278 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.278 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.279 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8989de5-bf71-4bef-8d52-b0ae10dd236a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.278524', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48adfc62-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'd664f5f7f01a9f722fc902d88deb099f999ddec7130e73a3a47ab72274b33c34'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.278524', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48ae2fb6-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': '4e43cea39727cbbb60314aa0cf3a58c2ec95f40d50bf6f060876809613a2a80e'}]}, 'timestamp': '2025-11-28 10:02:01.280377', '_unique_id': '75daea94745342569e2966000947b850'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.281 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.282 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.282 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/cpu volume: 7910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dd79bf5-fef9-4d2c-b4d2-62fc5277d34b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7910000000, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'timestamp': '2025-11-28T10:02:01.282783', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '48ae9e56-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.415387957, 'message_signature': '2f902eda5451d0402aff92f8c2ba456a7c6d33dc9ee6f870e22596d41cefa613'}]}, 'timestamp': '2025-11-28 10:02:01.283034', '_unique_id': 'de64cac123174a58a7224fc275ad64f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.283 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.285 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.285 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.286 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f81154f-2e21-4769-baf7-4ee83e453ab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.285462', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48af10de-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.418562253, 'message_signature': '27fd8939504e14c6892efbf778326bb10a6f12cdf72b567f48078ec890368fc7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.285462', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48af279a-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.418562253, 'message_signature': 'ecea7c3e24b0552c3d53adee4d1036ad3a11ff3c1c589d0d0d76344b58963fbc'}]}, 'timestamp': '2025-11-28 10:02:01.286672', '_unique_id': 'e4f1229508704a10a50562545ab007f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.288 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.288 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>]
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.289 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.289 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>]
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.291 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.291 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.291 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e492ad16-b7c7-41a7-90be-edf45ad3d2e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.291383', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48afee32-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': '62deb3c488182642e7b5f23a602bae8a3b941b4995a8ab9a8ffa6bd653e9e0fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 
'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.291383', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48aff814-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'b3d5e360a8662c0deb7f3f4f69f4802500219677947de31b8bf7fb482660abaf'}]}, 'timestamp': '2025-11-28 10:02:01.291881', '_unique_id': 'bbcdcebe3d8447d0b9a920c6b788286e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.292 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.294 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.294 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.295 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>]
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.295 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.295 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-UnshelveToHostMultiNodesTest-server-650509197>]
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.296 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.read.latency volume: 950997712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.296 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.read.latency volume: 928318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac6e0d5b-d488-40f8-9bbb-05aa3bb95aa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 950997712, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.296243', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48b0abec-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': '8be86b2f202474d0857f7dcc80d2303595b836b6545eec0524ef487e640f9f82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 928318, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 
'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.296243', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48b0b4d4-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'd7662cb5c233202d14f96399f46c4a9851d1036a57a73e593686a909c98e112f'}]}, 'timestamp': '2025-11-28 10:02:01.296708', '_unique_id': '9acf8d50cac343eb8ce0e2817d6bc9df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.297 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.298 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.299 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4b6c9ab-af0b-4e58-9463-726aa3bef9b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.298600', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48b10ff6-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': '99c91569dde35541e782674a71448408aeef74a50b4bd10b17a452c4afe98e05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 
'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.298600', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48b127b6-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.452005595, 'message_signature': 'e4efe1af1dd9ee6be3a8d5c6bcd4a5df44c90861c9516bdf23692148f7ad22ca'}]}, 'timestamp': '2025-11-28 10:02:01.299836', '_unique_id': '9f53cfda721c4f6ea3295554e0e22909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.300 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.301 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.302 12 DEBUG ceilometer.compute.pollsters [-] 7292509e-f294-4159-96e5-22d4712df2a0/disk.device.allocation volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05270718-23e3-425f-80ee-8e34563de07c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-vda', 'timestamp': '2025-11-28T10:02:01.301819', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '48b1858a-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.418562253, 'message_signature': '07a5b880eadcead49e279c3062e88fdeb72e10d867ec80cdbfd124bede74bbb0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '28578129c91d407a92af609ba8bac430', 'user_name': None, 'project_id': 'a30386ba68ee46f4a1bac43cf415f3a4', 'project_name': None, 'resource_id': '7292509e-f294-4159-96e5-22d4712df2a0-sda', 'timestamp': '2025-11-28T10:02:01.301819', 'resource_metadata': {'display_name': 'tempest-UnshelveToHostMultiNodesTest-server-650509197', 'name': 'instance-00000007', 'instance_id': '7292509e-f294-4159-96e5-22d4712df2a0', 'instance_type': 'm1.nano', 'host': '9ebb4b702ddc99b37414d36e54c2ee005e2ad03e5449dd0a560248b5', 'instance_host': 'np0005538515.localdomain', 'flavor': {'id': '98f289d4-5c06-4ab5-9089-7b580870d676', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d'}, 'image_ref': 'c045142b-5f2b-4f4d-80b7-ca5ee791067d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '48b18efe-cc41-11f0-bb26-fa163e93ca2d', 'monotonic_time': 11945.418562253, 'message_signature': '145019e893f1a5b50f537e8781f3833ebe0797f495eacf4f7bff4a17bf98096e'}]}, 'timestamp': '2025-11-28 10:02:01.302350', '_unique_id': 'a3b2ea48a6394687b2a97232ac4f60ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.303 12 ERROR oslo_messaging.notify.messaging 
Nov 28 10:02:01 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:02:01.305 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 10:02:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:01.501 2 INFO neutron.agent.securitygroups_rpc [None req-3458faa2-903e-46ff-96c1-5776090af93b 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group rule updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v132: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 22 KiB/s wr, 181 op/s
Nov 28 10:02:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:02.310 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:02:02 np0005538515.localdomain ceph-mon[301134]: pgmap v132: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 22 KiB/s wr, 181 op/s
Nov 28 10:02:02 np0005538515.localdomain podman[309453]: 2025-11-28 10:02:02.992132729 +0000 UTC m=+0.094782289 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:02:03 np0005538515.localdomain podman[309453]: 2025-11-28 10:02:03.005171614 +0000 UTC m=+0.107821154 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:02:03 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:02:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:03.300 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v133: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 149 op/s
Nov 28 10:02:04 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:02:04 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:04 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:04 np0005538515.localdomain podman[309490]: 2025-11-28 10:02:04.60185873 +0000 UTC m=+0.074429054 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:04.822 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:02:05
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_data', 'images', 'manila_metadata', 'vms', 'volumes', 'backups', '.mgr']
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:02:05 np0005538515.localdomain ceph-mon[301134]: pgmap v133: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 149 op/s
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v134: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 149 op/s
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.006581845861250698 of space, bias 1.0, pg target 1.3163691722501396 quantized to 32 (current 32)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8555772569444443 quantized to 32 (current 32)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019465818676716918 quantized to 16 (current 16)
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:02:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:02:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:02:05 np0005538515.localdomain podman[309525]: 2025-11-28 10:02:05.946488918 +0000 UTC m=+0.060162916 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:02:06 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:02:06 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:06 np0005538515.localdomain podman[309538]: 2025-11-28 10:02:06.000897045 +0000 UTC m=+0.077453846 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:02:06 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:06 np0005538515.localdomain podman[309525]: 2025-11-28 10:02:06.04152158 +0000 UTC m=+0.155195638 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:02:06 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:02:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:06.336 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:06.375 280172 DEBUG nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 28 10:02:06 np0005538515.localdomain ceph-mon[301134]: pgmap v134: 177 pgs: 177 active+clean; 225 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 18 KiB/s wr, 149 op/s
Nov 28 10:02:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:07.276 280172 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764324112.2752717, c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:02:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:07.277 280172 INFO nova.compute.manager [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Stopped (Lifecycle Event)
Nov 28 10:02:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:07.313 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:07.427 280172 DEBUG nova.compute.manager [None req-61914361-d006-49b8-b93c-36c57176e94a - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:07 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/743986143' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v135: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 630 KiB/s rd, 35 KiB/s wr, 53 op/s
Nov 28 10:02:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:08.262 2 INFO neutron.agent.securitygroups_rpc [req-6bffedb9-405b-4a40-9982-68d686e88a5f req-5df2fd06-5333-4972-81c1-a0ccb5870973 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group member updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:08.274 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:08.303 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:08 np0005538515.localdomain ceph-mon[301134]: pgmap v135: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 630 KiB/s rd, 35 KiB/s wr, 53 op/s
Nov 28 10:02:08 np0005538515.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 28 10:02:08 np0005538515.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 13.546s CPU time.
Nov 28 10:02:08 np0005538515.localdomain systemd-machined[201641]: Machine qemu-3-instance-00000007 terminated.
Nov 28 10:02:08 np0005538515.localdomain dnsmasq[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/addn_hosts - 0 addresses
Nov 28 10:02:08 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/host
Nov 28 10:02:08 np0005538515.localdomain dnsmasq-dhcp[307007]: read /var/lib/neutron/dhcp/4feac402-945d-4d17-a15d-c8337ea9c266/opts
Nov 28 10:02:08 np0005538515.localdomain systemd[1]: tmp-crun.SGnri4.mount: Deactivated successfully.
Nov 28 10:02:08 np0005538515.localdomain podman[309582]: 2025-11-28 10:02:08.870841764 +0000 UTC m=+0.055320037 container kill 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.077 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:09 np0005538515.localdomain kernel: device tapd0a70cfb-41 left promiscuous mode
Nov 28 10:02:09 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:09Z|00071|binding|INFO|Releasing lport d0a70cfb-41f8-4ab9-819b-560a898e8329 from this chassis (sb_readonly=0)
Nov 28 10:02:09 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:09Z|00072|binding|INFO|Setting lport d0a70cfb-41f8-4ab9-819b-560a898e8329 down in Southbound
Nov 28 10:02:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:09.086 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-4feac402-945d-4d17-a15d-c8337ea9c266', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4feac402-945d-4d17-a15d-c8337ea9c266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f1bee3918a2345388c202f74e60af9c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a3868fc-e35e-44db-9bd3-f12a417ed185, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=d0a70cfb-41f8-4ab9-819b-560a898e8329) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:09.088 158530 INFO neutron.agent.ovn.metadata.agent [-] Port d0a70cfb-41f8-4ab9-819b-560a898e8329 in datapath 4feac402-945d-4d17-a15d-c8337ea9c266 unbound from our chassis
Nov 28 10:02:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:09.091 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4feac402-945d-4d17-a15d-c8337ea9c266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:02:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:09.093 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[3215de84-0f82-42e1-9cc6-d53bdeb003a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.098 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.389 280172 INFO nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance shutdown successfully after 13 seconds.
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.395 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.395 280172 DEBUG nova.objects.instance [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.453 280172 INFO nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Beginning cold snapshot process
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.595 280172 DEBUG nova.virt.libvirt.imagebackend [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] No parent info for 85968a96-5a0e-43a4-9c04-3954f640a7ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.644 280172 DEBUG nova.storage.rbd_utils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] creating snapshot(450118c61e6a4bb095de1d74dd4c0177) on rbd image(7292509e-f294-4159-96e5-22d4712df2a0_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 28 10:02:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v136: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 630 KiB/s rd, 35 KiB/s wr, 53 op/s
Nov 28 10:02:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e104 e104: 6 total, 6 up, 6 in
Nov 28 10:02:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:09.893 280172 DEBUG nova.storage.rbd_utils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] cloning vms/7292509e-f294-4159-96e5-22d4712df2a0_disk@450118c61e6a4bb095de1d74dd4c0177 to images/a2def208-be38-4da4-a3f2-d5c5045455ca clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 28 10:02:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:10.084 280172 DEBUG nova.storage.rbd_utils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] flattening images/a2def208-be38-4da4-a3f2-d5c5045455ca flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 28 10:02:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:10.720 280172 DEBUG nova.storage.rbd_utils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] removing snapshot(450118c61e6a4bb095de1d74dd4c0177) on rbd image(7292509e-f294-4159-96e5-22d4712df2a0_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Nov 28 10:02:10 np0005538515.localdomain ceph-mon[301134]: pgmap v136: 177 pgs: 177 active+clean; 226 MiB data, 869 MiB used, 41 GiB / 42 GiB avail; 630 KiB/s rd, 35 KiB/s wr, 53 op/s
Nov 28 10:02:10 np0005538515.localdomain ceph-mon[301134]: osdmap e104: 6 total, 6 up, 6 in
Nov 28 10:02:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e105 e105: 6 total, 6 up, 6 in
Nov 28 10:02:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:10.956 280172 DEBUG nova.storage.rbd_utils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] creating snapshot(snap) on rbd image(a2def208-be38-4da4-a3f2-d5c5045455ca) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 28 10:02:11 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:11 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:11 np0005538515.localdomain podman[309765]: 2025-11-28 10:02:11.570976688 +0000 UTC m=+0.060876406 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:02:11 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v139: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 6.7 MiB/s rd, 8.5 MiB/s wr, 217 op/s
Nov 28 10:02:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:11.879 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e106 e106: 6 total, 6 up, 6 in
Nov 28 10:02:11 np0005538515.localdomain ceph-mon[301134]: osdmap e105: 6 total, 6 up, 6 in
Nov 28 10:02:11 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3484021559' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:11 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1882949366' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.315 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.688 280172 INFO nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Snapshot image upload complete
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.689 280172 DEBUG nova.compute.manager [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:12 np0005538515.localdomain dnsmasq[307007]: exiting on receipt of SIGTERM
Nov 28 10:02:12 np0005538515.localdomain systemd[1]: libpod-421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a.scope: Deactivated successfully.
Nov 28 10:02:12 np0005538515.localdomain podman[309802]: 2025-11-28 10:02:12.718867249 +0000 UTC m=+0.060131994 container kill 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.744 280172 INFO nova.compute.manager [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Shelve offloading
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.751 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.752 280172 DEBUG nova.compute.manager [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.754 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.754 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.754 280172 DEBUG nova.network.neutron [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 10:02:12 np0005538515.localdomain podman[309814]: 2025-11-28 10:02:12.791713041 +0000 UTC m=+0.062420044 container died 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.811 280172 DEBUG nova.network.neutron [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 10:02:12 np0005538515.localdomain podman[309814]: 2025-11-28 10:02:12.832876153 +0000 UTC m=+0.103583116 container cleanup 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:12 np0005538515.localdomain systemd[1]: libpod-conmon-421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a.scope: Deactivated successfully.
Nov 28 10:02:12 np0005538515.localdomain podman[309821]: 2025-11-28 10:02:12.871411614 +0000 UTC m=+0.131644495 container remove 421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4feac402-945d-4d17-a15d-c8337ea9c266, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:02:12 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:12.903 261346 INFO neutron.agent.dhcp.agent [None req-fa2996bb-4129-4fad-bd6f-99d7b74572f3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:12 np0005538515.localdomain ceph-mon[301134]: pgmap v139: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 6.7 MiB/s rd, 8.5 MiB/s wr, 217 op/s
Nov 28 10:02:12 np0005538515.localdomain ceph-mon[301134]: osdmap e106: 6 total, 6 up, 6 in
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.981 280172 DEBUG nova.network.neutron [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 10:02:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:12.998 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.007 280172 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.008 280172 DEBUG nova.objects.instance [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'resources' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 10:02:13 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:13.234 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.308 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:02:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3925667673' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:02:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:02:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3925667673' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:02:13 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:02:13 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:13 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:13 np0005538515.localdomain podman[309879]: 2025-11-28 10:02:13.530294528 +0000 UTC m=+0.073594787 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.583 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.646 280172 INFO nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting instance files /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.647 280172 INFO nova.virt.libvirt.driver [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deletion of /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del complete
Nov 28 10:02:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v141: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 11 MiB/s wr, 200 op/s
Nov 28 10:02:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a53b8d3f668b25246333f5de4a531a6e13da55d713122016f2fd29b9d52ffaf2-merged.mount: Deactivated successfully.
Nov 28 10:02:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-421f676002d9dc6198ae0142c65c01f174b39b07450c22c4b59e8e8bd991f65a-userdata-shm.mount: Deactivated successfully.
Nov 28 10:02:13 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d4feac402\x2d945d\x2d4d17\x2da15d\x2dc8337ea9c266.mount: Deactivated successfully.
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.742 280172 INFO nova.scheduler.client.report [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Deleted allocations for instance 7292509e-f294-4159-96e5-22d4712df2a0
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.788 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.789 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:13.814 280172 DEBUG oslo_concurrency.processutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3925667673' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:02:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3925667673' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:02:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/915675261' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:14.273 280172 DEBUG oslo_concurrency.processutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:14.280 280172 DEBUG nova.compute.provider_tree [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:02:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:14.296 280172 DEBUG nova.scheduler.client.report [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:02:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:14.318 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:14.389 280172 DEBUG oslo_concurrency.lockutils [None req-e7d006c1-4e1e-4995-bd1a-1f6d1f26ac1e 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e107 e107: 6 total, 6 up, 6 in
Nov 28 10:02:14 np0005538515.localdomain ceph-mon[301134]: pgmap v141: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 11 MiB/s wr, 200 op/s
Nov 28 10:02:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/915675261' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:14 np0005538515.localdomain ceph-mon[301134]: osdmap e107: 6 total, 6 up, 6 in
Nov 28 10:02:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v143: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 12 MiB/s wr, 204 op/s
Nov 28 10:02:15 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2145567881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.032192) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136032250, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 964, "num_deletes": 254, "total_data_size": 1034210, "memory_usage": 1051536, "flush_reason": "Manual Compaction"}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136040182, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 674308, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17448, "largest_seqno": 18407, "table_properties": {"data_size": 670184, "index_size": 1787, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10053, "raw_average_key_size": 20, "raw_value_size": 661649, "raw_average_value_size": 1358, "num_data_blocks": 78, "num_entries": 487, "num_filter_entries": 487, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324089, "oldest_key_time": 1764324089, "file_creation_time": 1764324136, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 8037 microseconds, and 2808 cpu microseconds.
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.040228) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 674308 bytes OK
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.040251) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.042372) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.042397) EVENT_LOG_v1 {"time_micros": 1764324136042391, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.042417) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 1029284, prev total WAL file size 1029284, number of live WAL files 2.
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.043003) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(658KB)], [24(17MB)]
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136043098, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 18755201, "oldest_snapshot_seqno": -1}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12046 keys, 16120641 bytes, temperature: kUnknown
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136158331, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 16120641, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16054296, "index_size": 35140, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323941, "raw_average_key_size": 26, "raw_value_size": 15851460, "raw_average_value_size": 1315, "num_data_blocks": 1327, "num_entries": 12046, "num_filter_entries": 12046, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324136, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.158558) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 16120641 bytes
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.160578) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.7 rd, 139.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 17.2 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(51.7) write-amplify(23.9) OK, records in: 12571, records dropped: 525 output_compression: NoCompression
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.160596) EVENT_LOG_v1 {"time_micros": 1764324136160588, "job": 12, "event": "compaction_finished", "compaction_time_micros": 115292, "compaction_time_cpu_micros": 47119, "output_level": 6, "num_output_files": 1, "total_output_size": 16120641, "num_input_records": 12571, "num_output_records": 12046, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136160790, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136162486, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.042880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.162603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.162612) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.162615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.162617) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:02:16.162620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:02:16 np0005538515.localdomain sudo[309922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:02:16 np0005538515.localdomain sudo[309922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:02:16 np0005538515.localdomain sudo[309922]: pam_unix(sudo:session): session closed for user root
Nov 28 10:02:16 np0005538515.localdomain sudo[309940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:02:16 np0005538515.localdomain sudo[309940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:02:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: pgmap v143: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 12 MiB/s wr, 204 op/s
Nov 28 10:02:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:17.317 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:17 np0005538515.localdomain sudo[309940]: pam_unix(sudo:session): session closed for user root
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:02:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v144: 177 pgs: 177 active+clean; 273 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 10 MiB/s rd, 10 MiB/s wr, 387 op/s
Nov 28 10:02:17 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 25632c53-ed7a-489d-a0d0-638d4bdafaff (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:02:17 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 25632c53-ed7a-489d-a0d0-638d4bdafaff (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:02:17 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 25632c53-ed7a-489d-a0d0-638d4bdafaff (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:02:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:02:17 np0005538515.localdomain sudo[309991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:02:17 np0005538515.localdomain sudo[309991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:02:17 np0005538515.localdomain sudo[309991]: pam_unix(sudo:session): session closed for user root
Nov 28 10:02:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:02:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:02:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:02:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:02:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:18.311 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:19 np0005538515.localdomain ceph-mon[301134]: pgmap v144: 177 pgs: 177 active+clean; 273 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 10 MiB/s rd, 10 MiB/s wr, 387 op/s
Nov 28 10:02:19 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3129734523' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:19 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3253647983' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:02:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e108 e108: 6 total, 6 up, 6 in
Nov 28 10:02:19 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:19.518 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v146: 177 pgs: 177 active+clean; 273 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 3.0 MiB/s rd, 25 KiB/s wr, 182 op/s
Nov 28 10:02:20 np0005538515.localdomain ceph-mon[301134]: osdmap e108: 6 total, 6 up, 6 in
Nov 28 10:02:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e109 e109: 6 total, 6 up, 6 in
Nov 28 10:02:20 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:02:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:02:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:02:21 np0005538515.localdomain podman[310009]: 2025-11-28 10:02:21.065940832 +0000 UTC m=+0.099528442 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.6, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Nov 28 10:02:21 np0005538515.localdomain podman[310009]: 2025-11-28 10:02:21.076139034 +0000 UTC m=+0.109726564 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Nov 28 10:02:21 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:02:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e110 e110: 6 total, 6 up, 6 in
Nov 28 10:02:21 np0005538515.localdomain ceph-mon[301134]: pgmap v146: 177 pgs: 177 active+clean; 273 MiB data, 941 MiB used, 41 GiB / 42 GiB avail; 3.0 MiB/s rd, 25 KiB/s wr, 182 op/s
Nov 28 10:02:21 np0005538515.localdomain ceph-mon[301134]: osdmap e109: 6 total, 6 up, 6 in
Nov 28 10:02:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:02:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v149: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 13 MiB/s rd, 7.8 MiB/s wr, 572 op/s
Nov 28 10:02:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:22 np0005538515.localdomain ceph-mon[301134]: osdmap e110: 6 total, 6 up, 6 in
Nov 28 10:02:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e111 e111: 6 total, 6 up, 6 in
Nov 28 10:02:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:22.320 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:23 np0005538515.localdomain ceph-mon[301134]: pgmap v149: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 13 MiB/s rd, 7.8 MiB/s wr, 572 op/s
Nov 28 10:02:23 np0005538515.localdomain ceph-mon[301134]: osdmap e111: 6 total, 6 up, 6 in
Nov 28 10:02:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e112 e112: 6 total, 6 up, 6 in
Nov 28 10:02:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:23.350 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v152: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 13 MiB/s rd, 12 MiB/s wr, 504 op/s
Nov 28 10:02:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:23.946 280172 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764324128.9454114, 7292509e-f294-4159-96e5-22d4712df2a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 10:02:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:23.946 280172 INFO nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Stopped (Lifecycle Event)
Nov 28 10:02:24 np0005538515.localdomain ceph-mon[301134]: osdmap e112: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:24.213 280172 DEBUG nova.compute.manager [None req-774cb919-d254-431b-b0b7-1e90bb499929 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 10:02:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e113 e113: 6 total, 6 up, 6 in
Nov 28 10:02:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e114 e114: 6 total, 6 up, 6 in
Nov 28 10:02:25 np0005538515.localdomain ceph-mon[301134]: pgmap v152: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 13 MiB/s rd, 12 MiB/s wr, 504 op/s
Nov 28 10:02:25 np0005538515.localdomain ceph-mon[301134]: osdmap e113: 6 total, 6 up, 6 in
Nov 28 10:02:25 np0005538515.localdomain ceph-mon[301134]: osdmap e114: 6 total, 6 up, 6 in
Nov 28 10:02:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v155: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:02:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:25.940 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:25Z, description=, device_id=856a9e5d-377c-485a-8ffd-a38bf58c9fa5, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce623250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce6235e0>], id=078d853f-feea-4033-aef7-9f3673e9288f, ip_allocation=immediate, mac_address=fa:16:3e:a7:b4:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=975, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:25Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:26 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:26 np0005538515.localdomain podman[310049]: 2025-11-28 10:02:26.164831024 +0000 UTC m=+0.052569012 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:26 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:26 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:26 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2043476067' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e115 e115: 6 total, 6 up, 6 in
Nov 28 10:02:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:26.469 261346 INFO neutron.agent.dhcp.agent [None req-57d1c120-e546-45fb-a4f3-e968f2b5a166 - - - - - -] DHCP configuration for ports {'078d853f-feea-4033-aef7-9f3673e9288f'} is completed
Nov 28 10:02:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:27.058 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:27 np0005538515.localdomain ceph-mon[301134]: pgmap v155: 177 pgs: 177 active+clean; 273 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:02:27 np0005538515.localdomain ceph-mon[301134]: osdmap e115: 6 total, 6 up, 6 in
Nov 28 10:02:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e116 e116: 6 total, 6 up, 6 in
Nov 28 10:02:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:27.322 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:02:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:02:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v158: 177 pgs: 177 active+clean; 225 MiB data, 886 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 6.4 MiB/s wr, 555 op/s
Nov 28 10:02:28 np0005538515.localdomain ceph-mon[301134]: osdmap e116: 6 total, 6 up, 6 in
Nov 28 10:02:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e117 e117: 6 total, 6 up, 6 in
Nov 28 10:02:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:28.354 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:02:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:02:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:02:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:02:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:02:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19205 "" "Go-http-client/1.1"
Nov 28 10:02:29 np0005538515.localdomain ceph-mon[301134]: pgmap v158: 177 pgs: 177 active+clean; 225 MiB data, 886 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 6.4 MiB/s wr, 555 op/s
Nov 28 10:02:29 np0005538515.localdomain ceph-mon[301134]: osdmap e117: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e118 e118: 6 total, 6 up, 6 in
Nov 28 10:02:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v161: 177 pgs: 177 active+clean; 225 MiB data, 886 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 6.4 MiB/s wr, 555 op/s
Nov 28 10:02:29 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:29.868 2 INFO neutron.agent.securitygroups_rpc [None req-163713b6-af4d-4d16-9097-b3cd54a25f68 078cec78b66d44acb2dcf304e572f2cf dba68040958c4e4c89f84cd27a771cd2 - - default default] Security group member updated ['a0bf5ab5-c355-48ac-a40e-9473d4858766']
Nov 28 10:02:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e119 e119: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538515.localdomain ceph-mon[301134]: osdmap e118: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538515.localdomain ceph-mon[301134]: osdmap e119: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:30.388 2 INFO neutron.agent.securitygroups_rpc [None req-59eaff10-1680-4aeb-97dc-49cab4063acc 078cec78b66d44acb2dcf304e572f2cf dba68040958c4e4c89f84cd27a771cd2 - - default default] Security group member updated ['a0bf5ab5-c355-48ac-a40e-9473d4858766']
Nov 28 10:02:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:30.422 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:02:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:02:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:02:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:02:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e120 e120: 6 total, 6 up, 6 in
Nov 28 10:02:30 np0005538515.localdomain podman[310071]: 2025-11-28 10:02:30.997331851 +0000 UTC m=+0.091681761 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 28 10:02:31 np0005538515.localdomain podman[310071]: 2025-11-28 10:02:31.006741819 +0000 UTC m=+0.101091669 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:31 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:02:31 np0005538515.localdomain podman[310072]: 2025-11-28 10:02:31.007919285 +0000 UTC m=+0.096366505 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:02:31 np0005538515.localdomain podman[310072]: 2025-11-28 10:02:31.087937467 +0000 UTC m=+0.176384667 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:02:31 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:02:31 np0005538515.localdomain podman[310069]: 2025-11-28 10:02:31.150923998 +0000 UTC m=+0.250927092 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:02:31 np0005538515.localdomain podman[310070]: 2025-11-28 10:02:31.203750817 +0000 UTC m=+0.302372398 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 10:02:31 np0005538515.localdomain podman[310069]: 2025-11-28 10:02:31.216415405 +0000 UTC m=+0.316418519 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:02:31 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:02:31 np0005538515.localdomain podman[310070]: 2025-11-28 10:02:31.277690973 +0000 UTC m=+0.376312514 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:02:31 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:02:31 np0005538515.localdomain ceph-mon[301134]: pgmap v161: 177 pgs: 177 active+clean; 225 MiB data, 886 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 6.4 MiB/s wr, 555 op/s
Nov 28 10:02:31 np0005538515.localdomain ceph-mon[301134]: osdmap e120: 6 total, 6 up, 6 in
Nov 28 10:02:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v164: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 203 KiB/s rd, 51 KiB/s wr, 281 op/s
Nov 28 10:02:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:32.324 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e121 e121: 6 total, 6 up, 6 in
Nov 28 10:02:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:33.076 261346 INFO neutron.agent.linux.ip_lib [None req-94478697-69e1-416a-905a-44405c2bc0e6 - - - - - -] Device tap51f612f0-6f cannot be used as it has no MAC address
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.097 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain kernel: device tap51f612f0-6f entered promiscuous mode
Nov 28 10:02:33 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324153.1064] manager: (tap51f612f0-6f): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Nov 28 10:02:33 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:33Z|00073|binding|INFO|Claiming lport 51f612f0-6f47-40c4-b14b-9819c35d81b4 for this chassis.
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.107 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:33Z|00074|binding|INFO|51f612f0-6f47-40c4-b14b-9819c35d81b4: Claiming unknown
Nov 28 10:02:33 np0005538515.localdomain systemd-udevd[310160]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:02:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:33.123 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-02080985-b864-4a8c-99f6-15cd1e3b9bee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02080985-b864-4a8c-99f6-15cd1e3b9bee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7913f694a0c456794b7dd9ed628cb12', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1193d1c9-a667-4e4b-958b-68ef2ed8f8fd, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=51f612f0-6f47-40c4-b14b-9819c35d81b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:33.125 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 51f612f0-6f47-40c4-b14b-9819c35d81b4 in datapath 02080985-b864-4a8c-99f6-15cd1e3b9bee bound to our chassis
Nov 28 10:02:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:33.127 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02080985-b864-4a8c-99f6-15cd1e3b9bee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:02:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:33.128 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[7b473e3d-acb2-4ccb-a3cf-1aefab7c7810]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.141 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:33Z|00075|binding|INFO|Setting lport 51f612f0-6f47-40c4-b14b-9819c35d81b4 ovn-installed in OVS
Nov 28 10:02:33 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:33Z|00076|binding|INFO|Setting lport 51f612f0-6f47-40c4-b14b-9819c35d81b4 up in Southbound
Nov 28 10:02:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.147 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap51f612f0-6f: No such device
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.213 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.233 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain systemd[1]: tmp-crun.NAxtzl.mount: Deactivated successfully.
Nov 28 10:02:33 np0005538515.localdomain podman[310168]: 2025-11-28 10:02:33.264535596 +0000 UTC m=+0.110180617 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:02:33 np0005538515.localdomain podman[310168]: 2025-11-28 10:02:33.274533343 +0000 UTC m=+0.120178344 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:02:33 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:02:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:33.356 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:33 np0005538515.localdomain ceph-mon[301134]: pgmap v164: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 203 KiB/s rd, 51 KiB/s wr, 281 op/s
Nov 28 10:02:33 np0005538515.localdomain ceph-mon[301134]: osdmap e121: 6 total, 6 up, 6 in
Nov 28 10:02:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v166: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 188 KiB/s rd, 48 KiB/s wr, 260 op/s
Nov 28 10:02:33 np0005538515.localdomain podman[310254]: 2025-11-28 10:02:33.988126304 +0000 UTC m=+0.082279183 container create 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:34 np0005538515.localdomain systemd[1]: Started libpod-conmon-2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f.scope.
Nov 28 10:02:34 np0005538515.localdomain podman[310254]: 2025-11-28 10:02:33.949968814 +0000 UTC m=+0.044121713 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:02:34 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:02:34 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cb36eecf92e193033a7d68ca6fa47ce64881bf085c7d0c1c8fd7557fceaaf3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:02:34 np0005538515.localdomain podman[310254]: 2025-11-28 10:02:34.069391225 +0000 UTC m=+0.163544094 container init 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:02:34 np0005538515.localdomain systemd[1]: tmp-crun.HPbvqN.mount: Deactivated successfully.
Nov 28 10:02:34 np0005538515.localdomain podman[310254]: 2025-11-28 10:02:34.083363763 +0000 UTC m=+0.177516632 container start 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:02:34 np0005538515.localdomain dnsmasq[310272]: started, version 2.85 cachesize 150
Nov 28 10:02:34 np0005538515.localdomain dnsmasq[310272]: DNS service limited to local subnets
Nov 28 10:02:34 np0005538515.localdomain dnsmasq[310272]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:02:34 np0005538515.localdomain dnsmasq[310272]: warning: no upstream servers configured
Nov 28 10:02:34 np0005538515.localdomain dnsmasq-dhcp[310272]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:02:34 np0005538515.localdomain dnsmasq[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/addn_hosts - 0 addresses
Nov 28 10:02:34 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/host
Nov 28 10:02:34 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/opts
Nov 28 10:02:34 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:34.273 261346 INFO neutron.agent.dhcp.agent [None req-7a7c43c6-3509-47dd-8b59-79886fef072a - - - - - -] DHCP configuration for ports {'b8b73ac7-0389-4b76-8dd8-3615c03348fe'} is completed
Nov 28 10:02:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e122 e122: 6 total, 6 up, 6 in
Nov 28 10:02:34 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:34.966 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:34Z, description=, device_id=b5ba2814-344b-427b-b30c-b10dca1fc3b1, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f9130>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f93a0>], id=962ab66e-6f28-460c-ba48-6f8d97c72fc1, ip_allocation=immediate, mac_address=fa:16:3e:be:72:98, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1059, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:34Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:35 np0005538515.localdomain podman[310291]: 2025-11-28 10:02:35.203843442 +0000 UTC m=+0.067615393 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:02:35 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:02:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:35.378 261346 INFO neutron.agent.dhcp.agent [None req-14d8cd81-cc41-491e-96af-db4c8f5fc3c7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:34Z, description=, device_id=bc18d52b-79ab-4649-9d2c-e95822b972e6, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5660d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce566eb0>], id=4943dbb9-4fdb-4880-be61-1585f95a0a04, ip_allocation=immediate, mac_address=fa:16:3e:a9:f5:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1060, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:34Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:35 np0005538515.localdomain ceph-mon[301134]: pgmap v166: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 188 KiB/s rd, 48 KiB/s wr, 260 op/s
Nov 28 10:02:35 np0005538515.localdomain ceph-mon[301134]: osdmap e122: 6 total, 6 up, 6 in
Nov 28 10:02:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:35.439 261346 INFO neutron.agent.dhcp.agent [None req-958bedac-91a2-41d2-a0d3-50c5cc8addae - - - - - -] DHCP configuration for ports {'962ab66e-6f28-460c-ba48-6f8d97c72fc1'} is completed
Nov 28 10:02:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:35.551 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:35 np0005538515.localdomain podman[310329]: 2025-11-28 10:02:35.616189611 +0000 UTC m=+0.062316632 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:02:35 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:02:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v168: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 36 KiB/s wr, 195 op/s
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:02:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:02:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:35.915 261346 INFO neutron.agent.dhcp.agent [None req-c09ef46b-67f3-466a-a234-9e4f86de8f36 - - - - - -] DHCP configuration for ports {'4943dbb9-4fdb-4880-be61-1585f95a0a04'} is completed
Nov 28 10:02:35 np0005538515.localdomain podman[310367]: 2025-11-28 10:02:35.997430835 +0000 UTC m=+0.060494406 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:02:35 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:02:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:35 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:36.360 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:02:36 np0005538515.localdomain podman[310388]: 2025-11-28 10:02:36.968255668 +0000 UTC m=+0.072971197 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:02:36 np0005538515.localdomain podman[310388]: 2025-11-28 10:02:36.984325871 +0000 UTC m=+0.089041320 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:02:36 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:02:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:37.051 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:36Z, description=, device_id=bc18d52b-79ab-4649-9d2c-e95822b972e6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce647df0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce647bb0>], id=a5d73d1f-506a-456b-84b3-47f3eca586f5, ip_allocation=immediate, mac_address=fa:16:3e:c0:bf:e4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:31Z, description=, dns_domain=, id=02080985-b864-4a8c-99f6-15cd1e3b9bee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-806689888-network, port_security_enabled=True, project_id=a7913f694a0c456794b7dd9ed628cb12, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58375, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1036, status=ACTIVE, subnets=['a7068720-9432-437a-b41d-71d3341bbf2b'], tags=[], tenant_id=a7913f694a0c456794b7dd9ed628cb12, updated_at=2025-11-28T10:02:32Z, vlan_transparent=None, network_id=02080985-b864-4a8c-99f6-15cd1e3b9bee, port_security_enabled=False, project_id=a7913f694a0c456794b7dd9ed628cb12, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1064, status=DOWN, tags=[], tenant_id=a7913f694a0c456794b7dd9ed628cb12, updated_at=2025-11-28T10:02:36Z on network 02080985-b864-4a8c-99f6-15cd1e3b9bee
Nov 28 10:02:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:37.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:37.265 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:36Z, description=, device_id=bebca4d6-d3c9-48a5-be17-ed38f336aa97, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce6a7e80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce566c40>], id=c72e44f2-e17b-4cb9-b759-8ca328e1cca9, ip_allocation=immediate, mac_address=fa:16:3e:55:6d:55, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1065, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:36Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:37 np0005538515.localdomain dnsmasq[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/addn_hosts - 1 addresses
Nov 28 10:02:37 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/host
Nov 28 10:02:37 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/opts
Nov 28 10:02:37 np0005538515.localdomain podman[310422]: 2025-11-28 10:02:37.279644351 +0000 UTC m=+0.059207065 container kill 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:02:37 np0005538515.localdomain systemd[1]: tmp-crun.JNnQIF.mount: Deactivated successfully.
Nov 28 10:02:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:37.327 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:37 np0005538515.localdomain ceph-mon[301134]: pgmap v168: 177 pgs: 177 active+clean; 225 MiB data, 891 MiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 36 KiB/s wr, 195 op/s
Nov 28 10:02:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:37.475 261346 INFO neutron.agent.dhcp.agent [None req-04bbd525-e3f1-4019-9d68-71e5b77be9e2 - - - - - -] DHCP configuration for ports {'a5d73d1f-506a-456b-84b3-47f3eca586f5'} is completed
Nov 28 10:02:37 np0005538515.localdomain podman[310460]: 2025-11-28 10:02:37.524440225 +0000 UTC m=+0.063584510 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:02:37 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:02:37 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:37 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v169: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 138 KiB/s rd, 32 KiB/s wr, 191 op/s
Nov 28 10:02:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:37.719 261346 INFO neutron.agent.dhcp.agent [None req-f526a273-7a95-43fd-8624-cfbb0f92b601 - - - - - -] DHCP configuration for ports {'c72e44f2-e17b-4cb9-b759-8ca328e1cca9'} is completed
Nov 28 10:02:38 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:38.232 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:36Z, description=, device_id=bc18d52b-79ab-4649-9d2c-e95822b972e6, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ceec5cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ceea3ee0>], id=a5d73d1f-506a-456b-84b3-47f3eca586f5, ip_allocation=immediate, mac_address=fa:16:3e:c0:bf:e4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:31Z, description=, dns_domain=, id=02080985-b864-4a8c-99f6-15cd1e3b9bee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-806689888-network, port_security_enabled=True, project_id=a7913f694a0c456794b7dd9ed628cb12, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58375, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1036, status=ACTIVE, subnets=['a7068720-9432-437a-b41d-71d3341bbf2b'], tags=[], tenant_id=a7913f694a0c456794b7dd9ed628cb12, updated_at=2025-11-28T10:02:32Z, vlan_transparent=None, network_id=02080985-b864-4a8c-99f6-15cd1e3b9bee, port_security_enabled=False, project_id=a7913f694a0c456794b7dd9ed628cb12, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1064, status=DOWN, tags=[], tenant_id=a7913f694a0c456794b7dd9ed628cb12, updated_at=2025-11-28T10:02:36Z on network 02080985-b864-4a8c-99f6-15cd1e3b9bee
Nov 28 10:02:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:38.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:38.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:02:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:38.359 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:38 np0005538515.localdomain dnsmasq[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/addn_hosts - 1 addresses
Nov 28 10:02:38 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/host
Nov 28 10:02:38 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/opts
Nov 28 10:02:38 np0005538515.localdomain podman[310497]: 2025-11-28 10:02:38.446159994 +0000 UTC m=+0.046551218 container kill 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:02:38 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:38.669 261346 INFO neutron.agent.dhcp.agent [None req-646c4b4c-f7bf-470f-aa91-4c5339d4c197 - - - - - -] DHCP configuration for ports {'a5d73d1f-506a-456b-84b3-47f3eca586f5'} is completed
Nov 28 10:02:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:39.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:39 np0005538515.localdomain ceph-mon[301134]: pgmap v169: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 138 KiB/s rd, 32 KiB/s wr, 191 op/s
Nov 28 10:02:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v170: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 1023 B/s wr, 20 op/s
Nov 28 10:02:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e123 e123: 6 total, 6 up, 6 in
Nov 28 10:02:39 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:39.946 2 INFO neutron.agent.securitygroups_rpc [None req-c410e527-579f-4d7d-bb14-04bb4c79dd9f b97430f38d544448bcb1f84d60affd50 f23b7feb8db740db9eea6302444ed3a8 - - default default] Security group member updated ['84bc6ad8-56a1-4678-950f-738b55ff6708']
Nov 28 10:02:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:40.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:40 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:02:40 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:40 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:40 np0005538515.localdomain podman[310535]: 2025-11-28 10:02:40.371340027 +0000 UTC m=+0.059057321 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:02:40 np0005538515.localdomain ceph-mon[301134]: pgmap v170: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 1023 B/s wr, 20 op/s
Nov 28 10:02:40 np0005538515.localdomain ceph-mon[301134]: osdmap e123: 6 total, 6 up, 6 in
Nov 28 10:02:41 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:41.325 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:40Z, description=, device_id=459f84b2-a937-4822-a081-b6da2fd06fcc, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61b6a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61b130>], id=469c6d02-2f81-4843-aa65-1cc5bd3e1c08, ip_allocation=immediate, mac_address=fa:16:3e:d2:2a:94, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1079, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:41Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:41 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:02:41 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:41 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:41 np0005538515.localdomain podman[310572]: 2025-11-28 10:02:41.519649172 +0000 UTC m=+0.041193574 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v172: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 6.5 KiB/s wr, 20 op/s
Nov 28 10:02:41 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:41.739 261346 INFO neutron.agent.dhcp.agent [None req-6c39faa6-5988-436a-ac10-23a551979913 - - - - - -] DHCP configuration for ports {'469c6d02-2f81-4843-aa65-1cc5bd3e1c08'} is completed
Nov 28 10:02:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:42.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:42.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:02:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:42.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:02:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:42.256 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:02:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:42.330 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:42 np0005538515.localdomain dnsmasq[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/addn_hosts - 0 addresses
Nov 28 10:02:42 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/host
Nov 28 10:02:42 np0005538515.localdomain podman[310611]: 2025-11-28 10:02:42.881890663 +0000 UTC m=+0.069131349 container kill 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:02:42 np0005538515.localdomain dnsmasq-dhcp[310272]: read /var/lib/neutron/dhcp/02080985-b864-4a8c-99f6-15cd1e3b9bee/opts
Nov 28 10:02:42 np0005538515.localdomain ceph-mon[301134]: pgmap v172: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 6.5 KiB/s wr, 20 op/s
Nov 28 10:02:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e124 e124: 6 total, 6 up, 6 in
Nov 28 10:02:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:43.094 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:43Z|00077|binding|INFO|Releasing lport 51f612f0-6f47-40c4-b14b-9819c35d81b4 from this chassis (sb_readonly=0)
Nov 28 10:02:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:43Z|00078|binding|INFO|Setting lport 51f612f0-6f47-40c4-b14b-9819c35d81b4 down in Southbound
Nov 28 10:02:43 np0005538515.localdomain kernel: device tap51f612f0-6f left promiscuous mode
Nov 28 10:02:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:43.102 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-02080985-b864-4a8c-99f6-15cd1e3b9bee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02080985-b864-4a8c-99f6-15cd1e3b9bee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7913f694a0c456794b7dd9ed628cb12', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1193d1c9-a667-4e4b-958b-68ef2ed8f8fd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=51f612f0-6f47-40c4-b14b-9819c35d81b4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:43.103 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 51f612f0-6f47-40c4-b14b-9819c35d81b4 in datapath 02080985-b864-4a8c-99f6-15cd1e3b9bee unbound from our chassis
Nov 28 10:02:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:43.105 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02080985-b864-4a8c-99f6-15cd1e3b9bee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:02:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:43.106 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[4751d0ac-3dbe-453a-b0a9-1df86d35f5ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:43.116 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:43.233 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:43.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:43.362 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v174: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 6.5 KiB/s wr, 20 op/s
Nov 28 10:02:43 np0005538515.localdomain ceph-mon[301134]: osdmap e124: 6 total, 6 up, 6 in
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.072 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:44 np0005538515.localdomain kernel: device tap8af1236c-20 left promiscuous mode
Nov 28 10:02:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:44Z|00079|binding|INFO|Releasing lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf from this chassis (sb_readonly=0)
Nov 28 10:02:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:44Z|00080|binding|INFO|Setting lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf down in Southbound
Nov 28 10:02:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:44.084 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-887157f9-a765-40c0-8be5-1fba3ddea8f8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-887157f9-a765-40c0-8be5-1fba3ddea8f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5520a81-bbe1-4feb-9859-6165eafc855d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8af1236c-205e-4af9-a882-ccde7f9d3ecf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:44.087 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8af1236c-205e-4af9-a882-ccde7f9d3ecf in datapath 887157f9-a765-40c0-8be5-1fba3ddea8f8 unbound from our chassis
Nov 28 10:02:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:44.090 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 887157f9-a765-40c0-8be5-1fba3ddea8f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:02:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:44.091 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a8764c-2854-4d09-b4bb-fa2e06cdb968]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.103 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.233 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.260 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.261 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.559 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:44.560 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:44.561 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:02:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:44 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2317602557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:44.782 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:44 np0005538515.localdomain ceph-mon[301134]: pgmap v174: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 6.5 KiB/s wr, 20 op/s
Nov 28 10:02:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3290241926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2317602557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.025 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.027 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11626MB free_disk=41.700096130371094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.028 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.028 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.105 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.106 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.150 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.271 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:44Z, description=, device_id=284309e9-e4cc-4725-a39f-0b5b94312aa1, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e44f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e4c10>], id=a523cd52-834b-474e-925b-8a0b6c6f8679, ip_allocation=immediate, mac_address=fa:16:3e:dd:0e:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1095, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:44Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:45 np0005538515.localdomain dnsmasq[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 6 addresses
Nov 28 10:02:45 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:45 np0005538515.localdomain podman[310697]: 2025-11-28 10:02:45.500508209 +0000 UTC m=+0.069238593 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:02:45 np0005538515.localdomain dnsmasq-dhcp[261709]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent [None req-8508e813-9116-48df-98bd-6595903dd5ab - - - - - -] Unable to reload_allocations dhcp for 887157f9-a765-40c0-8be5-1fba3ddea8f8.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap8af1236c-20 not found in namespace qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8.
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap8af1236c-20 not found in namespace qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8.
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.524 261346 ERROR neutron.agent.dhcp.agent 
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.529 261346 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Nov 28 10:02:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:02:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3168695736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.592 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.599 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.617 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.633 261346 INFO neutron.agent.dhcp.agent [None req-21e967cf-1575-474e-9a24-c502ec0a76a5 - - - - - -] DHCP configuration for ports {'a523cd52-834b-474e-925b-8a0b6c6f8679'} is completed
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.649 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:02:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:45.650 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v175: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 5.5 KiB/s wr, 0 op/s
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.798 261346 INFO neutron.agent.dhcp.agent [None req-694fd91a-974f-4128-82f1-92e341d4f45d - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:02:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:45.799 261346 INFO neutron.agent.dhcp.agent [-] Starting network 887157f9-a765-40c0-8be5-1fba3ddea8f8 dhcp configuration
Nov 28 10:02:45 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:45.853 2 INFO neutron.agent.securitygroups_rpc [None req-7370f7f5-c105-405f-816d-670eb41986b4 b97430f38d544448bcb1f84d60affd50 f23b7feb8db740db9eea6302444ed3a8 - - default default] Security group member updated ['84bc6ad8-56a1-4678-950f-738b55ff6708']
Nov 28 10:02:45 np0005538515.localdomain dnsmasq[261709]: exiting on receipt of SIGTERM
Nov 28 10:02:45 np0005538515.localdomain podman[310728]: 2025-11-28 10:02:45.976182838 +0000 UTC m=+0.059120333 container kill dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 10:02:45 np0005538515.localdomain systemd[1]: libpod-dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439.scope: Deactivated successfully.
Nov 28 10:02:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1391681796' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3089492664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3168695736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:46 np0005538515.localdomain podman[310742]: 2025-11-28 10:02:46.045514843 +0000 UTC m=+0.051689106 container died dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:46 np0005538515.localdomain podman[310742]: 2025-11-28 10:02:46.076336217 +0000 UTC m=+0.082510460 container cleanup dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:02:46 np0005538515.localdomain systemd[1]: libpod-conmon-dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439.scope: Deactivated successfully.
Nov 28 10:02:46 np0005538515.localdomain podman[310743]: 2025-11-28 10:02:46.133483408 +0000 UTC m=+0.134775241 container remove dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:02:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:46.173 261346 INFO neutron.agent.linux.ip_lib [-] Device tap8af1236c-20 cannot be used as it has no MAC address
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.193 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain kernel: device tap8af1236c-20 entered promiscuous mode
Nov 28 10:02:46 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:46Z|00081|binding|INFO|Claiming lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf for this chassis.
Nov 28 10:02:46 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324166.2004] manager: (tap8af1236c-20): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.200 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:46Z|00082|binding|INFO|8af1236c-205e-4af9-a882-ccde7f9d3ecf: Claiming unknown
Nov 28 10:02:46 np0005538515.localdomain systemd-udevd[310777]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:02:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:46.209 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-887157f9-a765-40c0-8be5-1fba3ddea8f8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-887157f9-a765-40c0-8be5-1fba3ddea8f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5520a81-bbe1-4feb-9859-6165eafc855d, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8af1236c-205e-4af9-a882-ccde7f9d3ecf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:02:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:46.211 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8af1236c-205e-4af9-a882-ccde7f9d3ecf in datapath 887157f9-a765-40c0-8be5-1fba3ddea8f8 bound to our chassis
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.213 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:46Z|00083|binding|INFO|Setting lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf up in Southbound
Nov 28 10:02:46 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:02:46Z|00084|binding|INFO|Setting lport 8af1236c-205e-4af9-a882-ccde7f9d3ecf ovn-installed in OVS
Nov 28 10:02:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:46.214 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port 443f831a-83a9-4df5-adbb-6fdf4d706460 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:02:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:46.214 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 887157f9-a765-40c0-8be5-1fba3ddea8f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:02:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:46.215 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c14fe3-1a44-497a-9369-16a7068ec51c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.231 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8af1236c-20: No such device
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.267 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.297 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f7cff75508476df411b334ef64aedbb65646b8067d2b7c094a8dcb894216f571-merged.mount: Deactivated successfully.
Nov 28 10:02:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dff024c0a8ef5b75fbea3dbf531de7255c21fd5f44b33b8a799f8e3ce0ffd439-userdata-shm.mount: Deactivated successfully.
Nov 28 10:02:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:46.563 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:02:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:46.586 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:47.019 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:47 np0005538515.localdomain ceph-mon[301134]: pgmap v175: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 5.5 KiB/s wr, 0 op/s
Nov 28 10:02:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/438558575' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:47 np0005538515.localdomain podman[310844]: 2025-11-28 10:02:47.083607638 +0000 UTC m=+0.090839955 container create d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:02:47 np0005538515.localdomain systemd[1]: Started libpod-conmon-d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62.scope.
Nov 28 10:02:47 np0005538515.localdomain systemd[1]: tmp-crun.KFYzr7.mount: Deactivated successfully.
Nov 28 10:02:47 np0005538515.localdomain podman[310844]: 2025-11-28 10:02:47.043801088 +0000 UTC m=+0.051033435 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:02:47 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:02:47 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a641b960bfac5f099829c6c40c021109cd7c4b369b639859fdabca2fc43676f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:02:47 np0005538515.localdomain podman[310844]: 2025-11-28 10:02:47.163490217 +0000 UTC m=+0.170722524 container init d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:02:47 np0005538515.localdomain podman[310844]: 2025-11-28 10:02:47.172563004 +0000 UTC m=+0.179795311 container start d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:02:47 np0005538515.localdomain dnsmasq[310862]: started, version 2.85 cachesize 150
Nov 28 10:02:47 np0005538515.localdomain dnsmasq[310862]: DNS service limited to local subnets
Nov 28 10:02:47 np0005538515.localdomain dnsmasq[310862]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:02:47 np0005538515.localdomain dnsmasq[310862]: warning: no upstream servers configured
Nov 28 10:02:47 np0005538515.localdomain dnsmasq-dhcp[310862]: DHCP, static leases only on 192.168.122.0, lease time 1d
Nov 28 10:02:47 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 6 addresses
Nov 28 10:02:47 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:47 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.236 261346 INFO neutron.agent.dhcp.agent [None req-60eaec6b-f7eb-4ede-8f00-8df3698f4ff6 - - - - - -] Finished network 887157f9-a765-40c0-8be5-1fba3ddea8f8 dhcp configuration
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.237 261346 INFO neutron.agent.dhcp.agent [None req-694fd91a-974f-4128-82f1-92e341d4f45d - - - - - -] Synchronizing state complete
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.241 261346 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmp6a4n70_d/privsep.sock']
Nov 28 10:02:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:47.333 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:47 np0005538515.localdomain dnsmasq[310272]: exiting on receipt of SIGTERM
Nov 28 10:02:47 np0005538515.localdomain podman[310882]: 2025-11-28 10:02:47.504012402 +0000 UTC m=+0.056554924 container kill 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:02:47 np0005538515.localdomain systemd[1]: libpod-2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f.scope: Deactivated successfully.
Nov 28 10:02:47 np0005538515.localdomain podman[310896]: 2025-11-28 10:02:47.553104837 +0000 UTC m=+0.040107220 container died 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:02:47 np0005538515.localdomain podman[310896]: 2025-11-28 10:02:47.57538466 +0000 UTC m=+0.062387013 container cleanup 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:02:47 np0005538515.localdomain systemd[1]: libpod-conmon-2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f.scope: Deactivated successfully.
Nov 28 10:02:47 np0005538515.localdomain podman[310898]: 2025-11-28 10:02:47.63704604 +0000 UTC m=+0.119165774 container remove 2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02080985-b864-4a8c-99f6-15cd1e3b9bee, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:02:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:47.651 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:02:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v176: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 9.0 KiB/s wr, 19 op/s
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.726 261346 INFO neutron.agent.dhcp.agent [None req-3562f759-3584-4414-aa3a-1f377c05b5cf - - - - - -] DHCP configuration for ports {'50fa6f67-abd9-48d7-aedb-8ca08cff0a66', '4f09ff74-6c86-4b4e-b350-59898c763592', 'c11672ac-31d9-4e35-992c-9c2cc8fbd9ff', '4a0a3326-6d12-4d57-91f4-2bd267c644b1', '57c70dff-855f-436b-a33c-5f3b79153011', '962ab66e-6f28-460c-ba48-6f8d97c72fc1', '4943dbb9-4fdb-4880-be61-1585f95a0a04', '8af1236c-205e-4af9-a882-ccde7f9d3ecf', 'a523cd52-834b-474e-925b-8a0b6c6f8679', '89beab0b-910c-4e5d-abc6-3023f325656d', '469c6d02-2f81-4843-aa65-1cc5bd3e1c08'} is completed
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.902 261346 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.801 310924 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.806 310924 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.810 310924 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 10:02:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:47.811 310924 INFO oslo.privsep.daemon [-] privsep daemon running as pid 310924
Nov 28 10:02:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e125 e125: 6 total, 6 up, 6 in
Nov 28 10:02:48 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:48.058 261346 INFO neutron.agent.dhcp.agent [None req-a5aa7db5-0eef-457b-b419-306f44d27e99 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:48 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:48.059 261346 INFO neutron.agent.dhcp.agent [None req-a5aa7db5-0eef-457b-b419-306f44d27e99 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:48 np0005538515.localdomain dnsmasq-dhcp[310862]: DHCPRELEASE(tap8af1236c-20) 192.168.122.189 fa:16:3e:a9:f5:f2
Nov 28 10:02:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:48.365 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:48 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:48.391 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:02:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-9cb36eecf92e193033a7d68ca6fa47ce64881bf085c7d0c1c8fd7557fceaaf3f-merged.mount: Deactivated successfully.
Nov 28 10:02:48 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2811a8561213d32d0b58d9ae5adfddbd0eddc356f936ada681729b09146dc38f-userdata-shm.mount: Deactivated successfully.
Nov 28 10:02:48 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d02080985\x2db864\x2d4a8c\x2d99f6\x2d15cd1e3b9bee.mount: Deactivated successfully.
Nov 28 10:02:48 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:02:48 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:48 np0005538515.localdomain podman[310946]: 2025-11-28 10:02:48.665910013 +0000 UTC m=+0.059672670 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:02:48 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:49 np0005538515.localdomain ceph-mon[301134]: pgmap v176: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 9.0 KiB/s wr, 19 op/s
Nov 28 10:02:49 np0005538515.localdomain ceph-mon[301134]: osdmap e125: 6 total, 6 up, 6 in
Nov 28 10:02:49 np0005538515.localdomain dnsmasq-dhcp[310862]: DHCPRELEASE(tap8af1236c-20) 192.168.122.232 fa:16:3e:dd:0e:89
Nov 28 10:02:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:49.480 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:49 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:02:49 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:49 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:49 np0005538515.localdomain podman[310985]: 2025-11-28 10:02:49.676352671 +0000 UTC m=+0.068454890 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v178: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 3.5 KiB/s wr, 18 op/s
Nov 28 10:02:50 np0005538515.localdomain dnsmasq-dhcp[310862]: DHCPRELEASE(tap8af1236c-20) 192.168.122.249 fa:16:3e:d2:2a:94
Nov 28 10:02:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:50.425 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:50 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:50 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:50 np0005538515.localdomain podman[311025]: 2025-11-28 10:02:50.683478877 +0000 UTC m=+0.054067448 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:02:50 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:50.846 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:02:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:50.847 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:02:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:02:50.847 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:02:50 np0005538515.localdomain dnsmasq-dhcp[310862]: DHCPRELEASE(tap8af1236c-20) 192.168.122.181 fa:16:3e:be:72:98
Nov 28 10:02:51 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:51.017 2 INFO neutron.agent.securitygroups_rpc [req-bb7f0ac8-504e-4783-80de-f00563b1098a req-aad0b688-0986-452a-b92d-7d53ff4d1361 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group member updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']
Nov 28 10:02:51 np0005538515.localdomain ceph-mon[301134]: pgmap v178: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 3.5 KiB/s wr, 18 op/s
Nov 28 10:02:51 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:02:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:51 np0005538515.localdomain podman[311063]: 2025-11-28 10:02:51.414167821 +0000 UTC m=+0.059115392 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:02:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:02:51 np0005538515.localdomain podman[311077]: 2025-11-28 10:02:51.542547206 +0000 UTC m=+0.093901979 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:02:51 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:51.580 261346 INFO neutron.agent.dhcp.agent [None req-4a726320-3439-44e2-ba79-4c07dcfdd571 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:50Z, description=, device_id=3d98c3ed-07f5-41b6-82be-a697b1dc8144, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce572cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce572580>], id=6633904c-1896-441a-b503-d755272315ca, ip_allocation=immediate, mac_address=fa:16:3e:64:dd:c0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1135, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:50Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:51 np0005538515.localdomain podman[311077]: 2025-11-28 10:02:51.588472473 +0000 UTC m=+0.139827216 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6)
Nov 28 10:02:51 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:02:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 6.4 KiB/s wr, 84 op/s
Nov 28 10:02:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:51 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:51 np0005538515.localdomain podman[311119]: 2025-11-28 10:02:51.802296937 +0000 UTC m=+0.062124826 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:02:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2372670749' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:02:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:52.126 261346 INFO neutron.agent.dhcp.agent [None req-323955ca-bd6e-47cb-b147-a3567fe93a13 - - - - - -] DHCP configuration for ports {'6633904c-1896-441a-b503-d755272315ca'} is completed
Nov 28 10:02:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:52.336 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:53 np0005538515.localdomain ceph-mon[301134]: pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 6.4 KiB/s wr, 84 op/s
Nov 28 10:02:53 np0005538515.localdomain systemd[1]: tmp-crun.BYlQaP.mount: Deactivated successfully.
Nov 28 10:02:53 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:53 np0005538515.localdomain podman[311156]: 2025-11-28 10:02:53.248343726 +0000 UTC m=+0.068318716 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:02:53 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:53 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:53.370 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 5.6 KiB/s wr, 73 op/s
Nov 28 10:02:53 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:53.895 2 INFO neutron.agent.securitygroups_rpc [None req-ca5b8c5c-4a7b-4773-b7e8-8e9eb8c79737 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:53 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:53 np0005538515.localdomain podman[311193]: 2025-11-28 10:02:53.986289802 +0000 UTC m=+0.058702840 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:02:53 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:53 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:54 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:02:54 np0005538515.localdomain podman[311232]: 2025-11-28 10:02:54.881468168 +0000 UTC m=+0.062990802 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:02:54 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:54 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:54 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 e126: 6 total, 6 up, 6 in
Nov 28 10:02:55 np0005538515.localdomain ceph-mon[301134]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 5.6 KiB/s wr, 73 op/s
Nov 28 10:02:55 np0005538515.localdomain ceph-mon[301134]: osdmap e126: 6 total, 6 up, 6 in
Nov 28 10:02:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v182: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 3.5 KiB/s wr, 73 op/s
Nov 28 10:02:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:55.687 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:55Z, description=, device_id=96675921-219e-4a6b-80f7-35d4cc5cbdb6, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e0580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e04f0>], id=4a9f17de-1903-4782-9fbb-eb87966cc66d, ip_allocation=immediate, mac_address=fa:16:3e:2e:27:e5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1176, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:55Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:55 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:55.796 2 INFO neutron.agent.securitygroups_rpc [None req-0a1122e3-48a9-4fdd-9791-f33fb613b799 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:55 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:55 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:55 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:55 np0005538515.localdomain podman[311270]: 2025-11-28 10:02:55.967488642 +0000 UTC m=+0.061152164 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:02:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:56.215 261346 INFO neutron.agent.dhcp.agent [None req-b42314b3-6a3d-443c-a4d3-31c6d689860a - - - - - -] DHCP configuration for ports {'4a9f17de-1903-4782-9fbb-eb87966cc66d'} is completed
Nov 28 10:02:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:02:57 np0005538515.localdomain ceph-mon[301134]: pgmap v182: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 3.5 KiB/s wr, 73 op/s
Nov 28 10:02:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:57.338 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:02:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:02:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:02:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 2.9 KiB/s wr, 60 op/s
Nov 28 10:02:57 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:57.786 2 INFO neutron.agent.securitygroups_rpc [None req-2d11ad2b-bc0b-4803-8bd7-bbf5b227318c 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:02:58.372 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:02:58 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:02:58 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:58 np0005538515.localdomain podman[311307]: 2025-11-28 10:02:58.694228893 +0000 UTC m=+0.062940281 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:02:58 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:02:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:02:58 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:02:58.916 2 INFO neutron.agent.securitygroups_rpc [None req-15306174-a853-47d1-9333-4213f5fad357 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:02:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:02:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:02:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:02:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19206 "" "Go-http-client/1.1"
Nov 28 10:02:59 np0005538515.localdomain ceph-mon[301134]: pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 2.9 KiB/s wr, 60 op/s
Nov 28 10:02:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v184: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.8 KiB/s wr, 58 op/s
Nov 28 10:02:59 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:02:59.717 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:59Z, description=, device_id=6d6f5f2b-73ff-4ed3-b208-e62261a8f605, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bf70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bd90>], id=374a4d89-2998-47b5-9fe9-05108f11754c, ip_allocation=immediate, mac_address=fa:16:3e:1c:ee:6c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1182, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:02:59Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:02:59 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:02:59 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:02:59 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:02:59 np0005538515.localdomain podman[311345]: 2025-11-28 10:02:59.936821806 +0000 UTC m=+0.064963032 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:03:00 np0005538515.localdomain dnsmasq-dhcp[310862]: DHCPRELEASE(tap8af1236c-20) 192.168.122.179 fa:16:3e:58:c4:db
Nov 28 10:03:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:00.186 261346 INFO neutron.agent.dhcp.agent [None req-cb78e981-3191-4efd-82ce-484e1a5825f8 - - - - - -] DHCP configuration for ports {'374a4d89-2998-47b5-9fe9-05108f11754c'} is completed
Nov 28 10:03:00 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:00.212 2 INFO neutron.agent.securitygroups_rpc [None req-f1e38bd4-3201-4ca6-aca5-e6cf8d3e47ff 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:00.509 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:00 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:03:00 np0005538515.localdomain podman[311383]: 2025-11-28 10:03:00.60063203 +0000 UTC m=+0.064684244 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:00.782 261346 INFO neutron.agent.dhcp.agent [None req-ff55e36f-a28c-42c2-9334-23beac7ec5c5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:00Z, description=, device_id=b85a61c5-6e26-4d8b-a681-62813c171222, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce688ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce688eb0>], id=11686e4b-1bc0-4947-bdac-2b1663b1d9cc, ip_allocation=immediate, mac_address=fa:16:3e:14:a7:e4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1183, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:03:00Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:03:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:00.974 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:00 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:03:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:00 np0005538515.localdomain podman[311420]: 2025-11-28 10:03:00.996441462 +0000 UTC m=+0.056443602 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:03:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:01 np0005538515.localdomain ceph-mon[301134]: pgmap v184: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.8 KiB/s wr, 58 op/s
Nov 28 10:03:01 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:01.248 261346 INFO neutron.agent.dhcp.agent [None req-f24cbd90-34a2-4a4a-9310-3de1de928be0 - - - - - -] DHCP configuration for ports {'11686e4b-1bc0-4947-bdac-2b1663b1d9cc'} is completed
Nov 28 10:03:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:01.474 2 INFO neutron.agent.securitygroups_rpc [None req-06213f27-8bbf-4f60-8df9-0ce6274952ed 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v185: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:03:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:03:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:03:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:03:01 np0005538515.localdomain systemd[1]: tmp-crun.BSO3gv.mount: Deactivated successfully.
Nov 28 10:03:01 np0005538515.localdomain podman[311442]: 2025-11-28 10:03:01.993493168 +0000 UTC m=+0.092768574 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:03:02 np0005538515.localdomain podman[311442]: 2025-11-28 10:03:02.007052174 +0000 UTC m=+0.106327570 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:03:02 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:03:02 np0005538515.localdomain podman[311444]: 2025-11-28 10:03:02.062286627 +0000 UTC m=+0.148237154 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 10:03:02 np0005538515.localdomain systemd[1]: tmp-crun.3gvKF6.mount: Deactivated successfully.
Nov 28 10:03:02 np0005538515.localdomain podman[311450]: 2025-11-28 10:03:02.092554115 +0000 UTC m=+0.178992757 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:03:02 np0005538515.localdomain podman[311450]: 2025-11-28 10:03:02.107423731 +0000 UTC m=+0.193862413 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:03:02 np0005538515.localdomain podman[311444]: 2025-11-28 10:03:02.120410609 +0000 UTC m=+0.206361136 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:02 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:03:02 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:03:02 np0005538515.localdomain podman[311443]: 2025-11-28 10:03:02.190330152 +0000 UTC m=+0.284666686 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:03:02 np0005538515.localdomain podman[311443]: 2025-11-28 10:03:02.247917126 +0000 UTC m=+0.342253690 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:03:02 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:03:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:02.371 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:03 np0005538515.localdomain ceph-mon[301134]: pgmap v185: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:03.314 2 INFO neutron.agent.securitygroups_rpc [None req-58637b77-ae6c-405f-99c5-e20fa41f4923 f6a5516f43fb48ebaa16e4040dd82b84 cd7c5d213d924d3d9d4428db9d286082 - - default default] Security group member updated ['9f37640b-8d78-40f1-9b7c-3ec3fef04776']
Nov 28 10:03:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:03.375 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:03:03 np0005538515.localdomain podman[311527]: 2025-11-28 10:03:03.973702988 +0000 UTC m=+0.084842171 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:03:04 np0005538515.localdomain podman[311527]: 2025-11-28 10:03:04.01259046 +0000 UTC m=+0.123729583 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:03:04 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:03:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:04.510 2 INFO neutron.agent.securitygroups_rpc [None req-9701a6f5-02eb-46da-bd51-76f4153e4e2b db6b2950549b47a2a2693ecffb5083c4 cd9b97e6d04840f3a546b260a8ee9b24 - - default default] Security group member updated ['7c5e1d73-494f-47ff-9f16-a2cff6e79638']
Nov 28 10:03:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:04.772 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f0c10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5f0670>], id=5f13681b-a9b0-4110-8479-3b0babad0289, ip_allocation=immediate, mac_address=fa:16:3e:72:1c:49, name=tempest-RoutersAdminNegativeTest-942817881, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=True, project_id=cd9b97e6d04840f3a546b260a8ee9b24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7c5e1d73-494f-47ff-9f16-a2cff6e79638'], standard_attr_id=1197, status=DOWN, tags=[], tenant_id=cd9b97e6d04840f3a546b260a8ee9b24, updated_at=2025-11-28T10:03:03Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:03:05 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:03:05 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:05 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:05 np0005538515.localdomain podman[311568]: 2025-11-28 10:03:05.250208732 +0000 UTC m=+0.051515370 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:05 np0005538515.localdomain ceph-mon[301134]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:05 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:05.414 261346 INFO neutron.agent.dhcp.agent [None req-a8982697-18de-44b5-bb9e-b065c2789480 - - - - - -] DHCP configuration for ports {'5f13681b-a9b0-4110-8479-3b0babad0289'} is completed
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:03:05
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['images', 'volumes', '.mgr', 'vms', 'manila_metadata', 'backups', 'manila_data']
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v187: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.001953125 quantized to 16 (current 16)
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:03:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:03:06 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:06.153 2 INFO neutron.agent.securitygroups_rpc [None req-42b0499f-37f4-4061-a4df-d49e7a70a2c4 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:06.689 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:06 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:03:06 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:06 np0005538515.localdomain podman[311607]: 2025-11-28 10:03:06.768437673 +0000 UTC m=+0.063626621 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:03:06 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:07 np0005538515.localdomain ceph-mon[301134]: pgmap v187: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:07 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:07.282 2 INFO neutron.agent.securitygroups_rpc [None req-374ec1da-a6ee-43ec-aeb4-2a3037224eb2 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:07.374 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:07 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:07.575 2 INFO neutron.agent.securitygroups_rpc [None req-5f1d0dc7-c78c-4e13-8de3-56bbcc932539 db6b2950549b47a2a2693ecffb5083c4 cd9b97e6d04840f3a546b260a8ee9b24 - - default default] Security group member updated ['7c5e1d73-494f-47ff-9f16-a2cff6e79638']
Nov 28 10:03:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:07 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:03:07 np0005538515.localdomain podman[311646]: 2025-11-28 10:03:07.784165574 +0000 UTC m=+0.047475486 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:03:07 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:07 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:03:07 np0005538515.localdomain systemd[1]: tmp-crun.4t67ti.mount: Deactivated successfully.
Nov 28 10:03:07 np0005538515.localdomain podman[311659]: 2025-11-28 10:03:07.921140601 +0000 UTC m=+0.104706330 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:03:07 np0005538515.localdomain podman[311659]: 2025-11-28 10:03:07.961486008 +0000 UTC m=+0.145051727 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:07 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:03:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:08.378 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:08.756 2 INFO neutron.agent.securitygroups_rpc [None req-be0492e2-ff74-4faa-8249-9d4640988efe f6a5516f43fb48ebaa16e4040dd82b84 cd7c5d213d924d3d9d4428db9d286082 - - default default] Security group member updated ['9f37640b-8d78-40f1-9b7c-3ec3fef04776']
Nov 28 10:03:08 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:08.786 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:08Z, description=, device_id=39e4d76e-c3ed-4f2c-ab49-82a582f6ea5b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61b850>], id=2a338f3b-cc3c-43b0-9669-c423ab6a0eb7, ip_allocation=immediate, mac_address=fa:16:3e:98:e0:c3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1228, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:03:08Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:03:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:08.968 2 INFO neutron.agent.securitygroups_rpc [None req-f639edd5-343d-4ae3-8fa2-2054bebb498d 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:09 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:03:09 np0005538515.localdomain podman[311703]: 2025-11-28 10:03:09.001353868 +0000 UTC m=+0.048921710 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:09 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:09 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:09 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:09.286 261346 INFO neutron.agent.dhcp.agent [None req-9a8ca310-e034-4fca-bbb3-96535be556ed - - - - - -] DHCP configuration for ports {'2a338f3b-cc3c-43b0-9669-c423ab6a0eb7'} is completed
Nov 28 10:03:09 np0005538515.localdomain ceph-mon[301134]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v189: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:11 np0005538515.localdomain ceph-mon[301134]: pgmap v189: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:11.438 2 INFO neutron.agent.securitygroups_rpc [None req-cc15aeb8-86ce-4ade-b16e-7c5f404511cd 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:12 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:12.075 2 INFO neutron.agent.securitygroups_rpc [None req-93eb68a4-7d7e-4f26-af38-fff447267025 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:12.378 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:12 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:03:12 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:12 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:12 np0005538515.localdomain podman[311743]: 2025-11-28 10:03:12.427914606 +0000 UTC m=+0.067990634 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:03:12 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:12.509 2 INFO neutron.agent.securitygroups_rpc [None req-0547c360-35fd-496e-9dbb-6212e2de25bb 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']
Nov 28 10:03:13 np0005538515.localdomain ceph-mon[301134]: pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:13.383 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v191: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:13 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:13.989 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:13Z, description=, device_id=69c61a5b-ec94-4bbf-a961-0b942c346de5, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5c5fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5c59d0>], id=e282245c-a634-4e4d-b9c8-31a4620b1751, ip_allocation=immediate, mac_address=fa:16:3e:cc:a1:00, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1258, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:03:13Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:03:14 np0005538515.localdomain systemd[1]: tmp-crun.qkAkIm.mount: Deactivated successfully.
Nov 28 10:03:14 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:03:14 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:14 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:14 np0005538515.localdomain podman[311780]: 2025-11-28 10:03:14.275668297 +0000 UTC m=+0.068857082 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:03:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4000469581' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4000469581' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:14 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:14.699 261346 INFO neutron.agent.dhcp.agent [None req-d5fdbe78-a1e0-4b36-84b6-e17319936665 - - - - - -] DHCP configuration for ports {'e282245c-a634-4e4d-b9c8-31a4620b1751'} is completed
Nov 28 10:03:15 np0005538515.localdomain ceph-mon[301134]: pgmap v191: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:15 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:15.461 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:14Z, description=, device_id=eabd79f3-7436-4767-9b03-0bae1c4f5088, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5a4c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5c58e0>], id=9e71845f-d23d-4701-82b5-febedf9d3c44, ip_allocation=immediate, mac_address=fa:16:3e:06:b4:45, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1265, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:03:14Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:03:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:15 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:03:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:15 np0005538515.localdomain podman[311817]: 2025-11-28 10:03:15.714426052 +0000 UTC m=+0.067116738 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:03:16 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:16.011 261346 INFO neutron.agent.dhcp.agent [None req-b9372eed-0d6a-498a-9bb3-5cd54850ec5c - - - - - -] DHCP configuration for ports {'9e71845f-d23d-4701-82b5-febedf9d3c44'} is completed
Nov 28 10:03:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:17.379 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:17 np0005538515.localdomain ceph-mon[301134]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:17.678 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v193: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:18 np0005538515.localdomain sudo[311839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:03:18 np0005538515.localdomain sudo[311839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:18 np0005538515.localdomain sudo[311839]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:18 np0005538515.localdomain sudo[311857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 10:03:18 np0005538515.localdomain sudo[311857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:18.413 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:18 np0005538515.localdomain sudo[311857]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:03:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:03:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:03:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:03:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:03:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:03:18 np0005538515.localdomain sudo[311895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:03:18 np0005538515.localdomain sudo[311895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:18 np0005538515.localdomain sudo[311895]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:18 np0005538515.localdomain sudo[311913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:03:18 np0005538515.localdomain sudo[311913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:19 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:19.306 2 INFO neutron.agent.securitygroups_rpc [None req-0bd438b8-b072-41d3-bddf-9588300a9670 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:19 np0005538515.localdomain sudo[311913]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: pgmap v193: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:19 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:03:19 np0005538515.localdomain podman[311978]: 2025-11-28 10:03:19.587379211 +0000 UTC m=+0.060271949 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:03:19 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:19 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:03:19 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 6cf45a1b-c23e-4bbd-a248-a5d51e2e61e5 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:03:19 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 6cf45a1b-c23e-4bbd-a248-a5d51e2e61e5 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:03:19 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 6cf45a1b-c23e-4bbd-a248-a5d51e2e61e5 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:03:19 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:03:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v194: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:19 np0005538515.localdomain sudo[312013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:03:19 np0005538515.localdomain sudo[312013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:03:19 np0005538515.localdomain podman[312017]: 2025-11-28 10:03:19.93257657 +0000 UTC m=+0.060219896 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:03:19 np0005538515.localdomain sudo[312013]: pam_unix(sudo:session): session closed for user root
Nov 28 10:03:19 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:03:19 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:19 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:19.969 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:03:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:03:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:03:20 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:03:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:03:20 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:20.923 2 INFO neutron.agent.securitygroups_rpc [None req-cc447c81-1a1f-4f5d-aa14-abdbefdf4620 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:21 np0005538515.localdomain ceph-mon[301134]: pgmap v194: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:03:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v195: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:03:21 np0005538515.localdomain podman[312054]: 2025-11-28 10:03:21.978947399 +0000 UTC m=+0.085802501 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41)
Nov 28 10:03:22 np0005538515.localdomain podman[312054]: 2025-11-28 10:03:22.01785818 +0000 UTC m=+0.124713312 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 28 10:03:22 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:03:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:22.382 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.417 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.439 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain ceph-mon[301134]: pgmap v195: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v196: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:23.774 261346 INFO neutron.agent.linux.ip_lib [None req-9ba95e0d-ea20-4a29-9f4e-0fb1eb5c67d2 - - - - - -] Device tap6c0b5ff2-df cannot be used as it has no MAC address
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.798 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain kernel: device tap6c0b5ff2-df entered promiscuous mode
Nov 28 10:03:23 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324203.8090] manager: (tap6c0b5ff2-df): new Generic device (/org/freedesktop/NetworkManager/Devices/23)
Nov 28 10:03:23 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:03:23Z|00085|binding|INFO|Claiming lport 6c0b5ff2-df89-4fe3-9e64-c927114a583e for this chassis.
Nov 28 10:03:23 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:03:23Z|00086|binding|INFO|6c0b5ff2-df89-4fe3-9e64-c927114a583e: Claiming unknown
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.813 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain systemd-udevd[312084]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:03:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:23.823 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e1d43d-b7b9-4cde-a1cb-52c6e6527f88, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=6c0b5ff2-df89-4fe3-9e64-c927114a583e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:23.824 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 6c0b5ff2-df89-4fe3-9e64-c927114a583e in datapath 6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694 bound to our chassis
Nov 28 10:03:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:23.825 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:23.826 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[4f2f28f8-e204-4c09-add3-cf93528d8693]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:03:23Z|00087|binding|INFO|Setting lport 6c0b5ff2-df89-4fe3-9e64-c927114a583e ovn-installed in OVS
Nov 28 10:03:23 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:03:23Z|00088|binding|INFO|Setting lport 6c0b5ff2-df89-4fe3-9e64-c927114a583e up in Southbound
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.856 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.856 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6c0b5ff2-df: No such device
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.899 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:23.932 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:23 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:23.950 2 INFO neutron.agent.securitygroups_rpc [None req-8c468440-8245-4890-91bf-66327309dae3 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:24 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:24.625 2 INFO neutron.agent.securitygroups_rpc [None req-6563d2b7-ae08-45e8-8b76-40044d8bfa2e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:24 np0005538515.localdomain podman[312155]: 
Nov 28 10:03:24 np0005538515.localdomain podman[312155]: 2025-11-28 10:03:24.855991265 +0000 UTC m=+0.101570635 container create 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:03:24 np0005538515.localdomain podman[312155]: 2025-11-28 10:03:24.804211127 +0000 UTC m=+0.049790547 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:03:24 np0005538515.localdomain systemd[1]: Started libpod-conmon-68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889.scope.
Nov 28 10:03:24 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:03:24 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ea26693fddbc198b895040c73bb855dbac4baf54a7312418b5e6c595b6259ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:03:24 np0005538515.localdomain podman[312155]: 2025-11-28 10:03:24.941147664 +0000 UTC m=+0.186727064 container init 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:03:24 np0005538515.localdomain podman[312155]: 2025-11-28 10:03:24.955062311 +0000 UTC m=+0.200641681 container start 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 10:03:24 np0005538515.localdomain dnsmasq[312173]: started, version 2.85 cachesize 150
Nov 28 10:03:24 np0005538515.localdomain dnsmasq[312173]: DNS service limited to local subnets
Nov 28 10:03:24 np0005538515.localdomain dnsmasq[312173]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:03:24 np0005538515.localdomain dnsmasq[312173]: warning: no upstream servers configured
Nov 28 10:03:24 np0005538515.localdomain dnsmasq-dhcp[312173]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Nov 28 10:03:24 np0005538515.localdomain dnsmasq[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/addn_hosts - 0 addresses
Nov 28 10:03:24 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/host
Nov 28 10:03:24 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/opts
Nov 28 10:03:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:25.160 261346 INFO neutron.agent.dhcp.agent [None req-23033da1-cc7d-4b74-afae-6e37e0f29c88 - - - - - -] DHCP configuration for ports {'eb2477c9-7d94-49d1-83d4-9980e36bbbef'} is completed
Nov 28 10:03:25 np0005538515.localdomain ceph-mon[301134]: pgmap v196: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:25 np0005538515.localdomain systemd[1]: tmp-crun.yJGL9j.mount: Deactivated successfully.
Nov 28 10:03:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e127 e127: 6 total, 6 up, 6 in
Nov 28 10:03:26 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:26.730 2 INFO neutron.agent.securitygroups_rpc [None req-a6c40294-bcff-4fbb-89ad-bea0e8a1937c 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:26.922 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:26Z, description=, device_id=2c4fdf00-3fa5-4a24-a05d-8ff5ec980545, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ceebdca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5c8c70>], id=0fbe6bd7-f895-4e05-ab45-c6c2712e010d, ip_allocation=immediate, mac_address=fa:16:3e:da:00:e8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:19Z, description=, dns_domain=, id=6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1313739008, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21540, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1295, status=ACTIVE, subnets=['63402f8f-09ed-4fd7-94cd-e097b6e23efa'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:22Z, vlan_transparent=None, network_id=6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1334, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:26Z on network 6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694
Nov 28 10:03:27 np0005538515.localdomain ceph-mon[301134]: pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:03:27 np0005538515.localdomain ceph-mon[301134]: osdmap e127: 6 total, 6 up, 6 in
Nov 28 10:03:27 np0005538515.localdomain dnsmasq[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/addn_hosts - 1 addresses
Nov 28 10:03:27 np0005538515.localdomain podman[312191]: 2025-11-28 10:03:27.133477485 +0000 UTC m=+0.068895042 container kill 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:03:27 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/host
Nov 28 10:03:27 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/opts
Nov 28 10:03:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:27.385 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:27.448 261346 INFO neutron.agent.dhcp.agent [None req-281fd3a1-7453-483c-8f48-a3f681ad2cf6 - - - - - -] DHCP configuration for ports {'0fbe6bd7-f895-4e05-ab45-c6c2712e010d'} is completed
Nov 28 10:03:27 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:27.497 2 INFO neutron.agent.securitygroups_rpc [None req-9bac993d-08a2-4a7a-9741-5f6e8a523396 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:03:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:03:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 1.6 KiB/s wr, 18 op/s
Nov 28 10:03:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e128 e128: 6 total, 6 up, 6 in
Nov 28 10:03:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:28.422 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:28 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:28.825 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:26Z, description=, device_id=2c4fdf00-3fa5-4a24-a05d-8ff5ec980545, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bdc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61b0a0>], id=0fbe6bd7-f895-4e05-ab45-c6c2712e010d, ip_allocation=immediate, mac_address=fa:16:3e:da:00:e8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:19Z, description=, dns_domain=, id=6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1313739008, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21540, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1295, status=ACTIVE, subnets=['63402f8f-09ed-4fd7-94cd-e097b6e23efa'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:22Z, vlan_transparent=None, network_id=6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1334, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:26Z on network 6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694
Nov 28 10:03:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:03:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:03:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:03:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158149 "" "Go-http-client/1.1"
Nov 28 10:03:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:03:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19672 "" "Go-http-client/1.1"
Nov 28 10:03:29 np0005538515.localdomain ceph-mon[301134]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 1.6 KiB/s wr, 18 op/s
Nov 28 10:03:29 np0005538515.localdomain ceph-mon[301134]: osdmap e128: 6 total, 6 up, 6 in
Nov 28 10:03:29 np0005538515.localdomain dnsmasq[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/addn_hosts - 1 addresses
Nov 28 10:03:29 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/host
Nov 28 10:03:29 np0005538515.localdomain podman[312229]: 2025-11-28 10:03:29.075253507 +0000 UTC m=+0.062432895 container kill 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:29 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/opts
Nov 28 10:03:29 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:29.309 261346 INFO neutron.agent.dhcp.agent [None req-d0ad3e84-584e-4664-8c47-46a23eb9be73 - - - - - -] DHCP configuration for ports {'0fbe6bd7-f895-4e05-ab45-c6c2712e010d'} is completed
Nov 28 10:03:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.0 KiB/s wr, 23 op/s
Nov 28 10:03:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e129 e129: 6 total, 6 up, 6 in
Nov 28 10:03:30 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:30.078 2 INFO neutron.agent.securitygroups_rpc [None req-a538bd0d-c0aa-4d14-8c4b-26de5d170843 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:30 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:03:30 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:30 np0005538515.localdomain podman[312267]: 2025-11-28 10:03:30.653334392 +0000 UTC m=+0.066661254 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:03:30 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:31 np0005538515.localdomain ceph-mon[301134]: pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.0 KiB/s wr, 23 op/s
Nov 28 10:03:31 np0005538515.localdomain ceph-mon[301134]: osdmap e129: 6 total, 6 up, 6 in
Nov 28 10:03:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v203: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 8.7 KiB/s wr, 132 op/s
Nov 28 10:03:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:31.897 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e130 e130: 6 total, 6 up, 6 in
Nov 28 10:03:32 np0005538515.localdomain dnsmasq[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/addn_hosts - 0 addresses
Nov 28 10:03:32 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/host
Nov 28 10:03:32 np0005538515.localdomain dnsmasq-dhcp[312173]: read /var/lib/neutron/dhcp/6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694/opts
Nov 28 10:03:32 np0005538515.localdomain podman[312306]: 2025-11-28 10:03:32.190620097 +0000 UTC m=+0.066952213 container kill 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: tmp-crun.cAdyah.mount: Deactivated successfully.
Nov 28 10:03:32 np0005538515.localdomain podman[312323]: 2025-11-28 10:03:32.323917742 +0000 UTC m=+0.093357362 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:03:32 np0005538515.localdomain podman[312323]: 2025-11-28 10:03:32.33363357 +0000 UTC m=+0.103073190 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:03:32 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:32.356 2 INFO neutron.agent.securitygroups_rpc [None req-23b7a4db-87e9-4c7d-8b9d-380815f2adcd 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:32.387 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:32 np0005538515.localdomain podman[312361]: 2025-11-28 10:03:32.396391163 +0000 UTC m=+0.070424149 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:03:32 np0005538515.localdomain podman[312322]: 2025-11-28 10:03:32.419217843 +0000 UTC m=+0.192466109 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 10:03:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:32.466 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:32 np0005538515.localdomain kernel: device tap6c0b5ff2-df left promiscuous mode
Nov 28 10:03:32 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:03:32Z|00089|binding|INFO|Releasing lport 6c0b5ff2-df89-4fe3-9e64-c927114a583e from this chassis (sb_readonly=0)
Nov 28 10:03:32 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:03:32Z|00090|binding|INFO|Setting lport 6c0b5ff2-df89-4fe3-9e64-c927114a583e down in Southbound
Nov 28 10:03:32 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:32.474 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94e1d43d-b7b9-4cde-a1cb-52c6e6527f88, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=6c0b5ff2-df89-4fe3-9e64-c927114a583e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:32 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:32.475 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 6c0b5ff2-df89-4fe3-9e64-c927114a583e in datapath 6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694 unbound from our chassis
Nov 28 10:03:32 np0005538515.localdomain podman[312320]: 2025-11-28 10:03:32.475429015 +0000 UTC m=+0.251467807 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:03:32 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:32.476 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:03:32 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:32.477 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[901670d2-e895-4c44-9182-201ffe8c8039]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:03:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:32.486 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:32 np0005538515.localdomain podman[312322]: 2025-11-28 10:03:32.498350339 +0000 UTC m=+0.271598545 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 10:03:32 np0005538515.localdomain podman[312320]: 2025-11-28 10:03:32.51046753 +0000 UTC m=+0.286506392 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:03:32 np0005538515.localdomain podman[312361]: 2025-11-28 10:03:32.529526594 +0000 UTC m=+0.203559610 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:03:32 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:03:33 np0005538515.localdomain ceph-mon[301134]: pgmap v203: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 8.7 KiB/s wr, 132 op/s
Nov 28 10:03:33 np0005538515.localdomain ceph-mon[301134]: osdmap e130: 6 total, 6 up, 6 in
Nov 28 10:03:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:33.423 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:33.442 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:32Z, description=, device_id=6044fed0-a964-4a28-ad00-c53ecd1a7643, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e9ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e9460>], id=06c8fced-7e68-4f3b-b898-de91e39611ac, ip_allocation=immediate, mac_address=fa:16:3e:bf:b3:13, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1368, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:03:33Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:03:33 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:03:33 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:33 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:33 np0005538515.localdomain podman[312431]: 2025-11-28 10:03:33.660134735 +0000 UTC m=+0.055610426 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:03:33 np0005538515.localdomain systemd[1]: tmp-crun.9YNW19.mount: Deactivated successfully.
Nov 28 10:03:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 6.0 KiB/s wr, 101 op/s
Nov 28 10:03:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:33.895 261346 INFO neutron.agent.dhcp.agent [None req-9ae4ce4e-240a-49d4-a88e-f444041aed3f - - - - - -] DHCP configuration for ports {'06c8fced-7e68-4f3b-b898-de91e39611ac'} is completed
Nov 28 10:03:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:03:34 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:34.885 2 INFO neutron.agent.securitygroups_rpc [None req-a1e00e91-b063-4693-9d8b-b7a005d16694 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:34 np0005538515.localdomain podman[312452]: 2025-11-28 10:03:34.977175419 +0000 UTC m=+0.085544822 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:03:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e131 e131: 6 total, 6 up, 6 in
Nov 28 10:03:34 np0005538515.localdomain podman[312452]: 2025-11-28 10:03:34.989415405 +0000 UTC m=+0.097784838 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:03:35 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:03:35 np0005538515.localdomain ceph-mon[301134]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 6.0 KiB/s wr, 101 op/s
Nov 28 10:03:35 np0005538515.localdomain ceph-mon[301134]: osdmap e131: 6 total, 6 up, 6 in
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:03:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:35.699 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 6.0 KiB/s wr, 101 op/s
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:03:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:03:35 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:35.856 2 INFO neutron.agent.securitygroups_rpc [None req-57c1bb59-2e0b-4157-baad-e850337ecf12 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e132 e132: 6 total, 6 up, 6 in
Nov 28 10:03:36 np0005538515.localdomain dnsmasq[312173]: exiting on receipt of SIGTERM
Nov 28 10:03:36 np0005538515.localdomain podman[312489]: 2025-11-28 10:03:36.600664056 +0000 UTC m=+0.044258757 container kill 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:03:36 np0005538515.localdomain systemd[1]: libpod-68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889.scope: Deactivated successfully.
Nov 28 10:03:36 np0005538515.localdomain podman[312503]: 2025-11-28 10:03:36.674787508 +0000 UTC m=+0.058318808 container died 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:36 np0005538515.localdomain systemd[1]: tmp-crun.KIRFvb.mount: Deactivated successfully.
Nov 28 10:03:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889-userdata-shm.mount: Deactivated successfully.
Nov 28 10:03:36 np0005538515.localdomain podman[312503]: 2025-11-28 10:03:36.706380287 +0000 UTC m=+0.089911547 container cleanup 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:03:36 np0005538515.localdomain systemd[1]: libpod-conmon-68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889.scope: Deactivated successfully.
Nov 28 10:03:36 np0005538515.localdomain podman[312504]: 2025-11-28 10:03:36.756975977 +0000 UTC m=+0.134944516 container remove 68800f0a433642a83dde4b2a60ebb2444b0baed6eef390dd962b7b9126a9b889 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e7f0ee3-3d47-47d5-a9ba-c47f46ffc694, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:03:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:37.135 261346 INFO neutron.agent.dhcp.agent [None req-0ed3a335-a0b9-411f-8b12-710d52ffc5de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:37 np0005538515.localdomain ceph-mon[301134]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 6.0 KiB/s wr, 101 op/s
Nov 28 10:03:37 np0005538515.localdomain ceph-mon[301134]: osdmap e132: 6 total, 6 up, 6 in
Nov 28 10:03:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e133 e133: 6 total, 6 up, 6 in
Nov 28 10:03:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:37.262 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:37.390 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:37 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:37.429 2 INFO neutron.agent.securitygroups_rpc [None req-69a76c47-354e-40d8-9c7a-4acd924cbac4 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:03:37.593 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:03:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8ea26693fddbc198b895040c73bb855dbac4baf54a7312418b5e6c595b6259ad-merged.mount: Deactivated successfully.
Nov 28 10:03:37 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d6e7f0ee3\x2d3d47\x2d47d5\x2da9ba\x2dc47f46ffc694.mount: Deactivated successfully.
Nov 28 10:03:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.6 KiB/s wr, 29 op/s
Nov 28 10:03:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:37.976 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:38 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:38.192 2 INFO neutron.agent.securitygroups_rpc [None req-eef6fd1f-c62c-4fce-a6ed-f73dc25767c9 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:38 np0005538515.localdomain ceph-mon[301134]: osdmap e133: 6 total, 6 up, 6 in
Nov 28 10:03:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e134 e134: 6 total, 6 up, 6 in
Nov 28 10:03:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:38.427 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:03:38 np0005538515.localdomain podman[312531]: 2025-11-28 10:03:38.968628761 +0000 UTC m=+0.080832669 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd)
Nov 28 10:03:38 np0005538515.localdomain podman[312531]: 2025-11-28 10:03:38.982361622 +0000 UTC m=+0.094565520 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:03:38 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:03:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:39.235 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:39.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:39.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:39.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:39.264 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:03:39 np0005538515.localdomain ceph-mon[301134]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.6 KiB/s wr, 29 op/s
Nov 28 10:03:39 np0005538515.localdomain ceph-mon[301134]: osdmap e134: 6 total, 6 up, 6 in
Nov 28 10:03:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 1.9 KiB/s wr, 34 op/s
Nov 28 10:03:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e135 e135: 6 total, 6 up, 6 in
Nov 28 10:03:40 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:40.471 2 INFO neutron.agent.securitygroups_rpc [None req-3119b771-1e00-43fd-8d05-15e8a1d2219b 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:40.566 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:40 np0005538515.localdomain ceph-mon[301134]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 777 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 1.9 KiB/s wr, 34 op/s
Nov 28 10:03:40 np0005538515.localdomain ceph-mon[301134]: osdmap e135: 6 total, 6 up, 6 in
Nov 28 10:03:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:41.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:41 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:41.652 2 INFO neutron.agent.securitygroups_rpc [None req-1b0c80b5-e3aa-421f-ac6e-3cbc3bd6a095 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:03:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v214: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.9 MiB/s wr, 146 op/s
Nov 28 10:03:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:42 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:42.096 2 INFO neutron.agent.securitygroups_rpc [None req-e8cd93f0-eb23-4135-97c1-cd5750f74f24 b286c38dfd0e4889806c62c7b4b9ee98 50a1392ce96c4024bcd36a3df403ca29 - - default default] Security group member updated ['b372bb98-860c-4571-936b-bf08ecbd647d']
Nov 28 10:03:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:42.392 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:43 np0005538515.localdomain ceph-mon[301134]: pgmap v214: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.9 MiB/s wr, 146 op/s
Nov 28 10:03:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2803641897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2803641897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:43.429 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v215: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 2.5 MiB/s wr, 99 op/s
Nov 28 10:03:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:44.198 2 INFO neutron.agent.securitygroups_rpc [None req-363c9598-0bac-406f-990f-c24334dc748e 595b5cbed3764c7a95b0ab3634e5becb 8c66e098e4fb4a349dc2bb4293454135 - - default default] Security group member updated ['f75eb612-183d-4664-84c2-1d15534e163f']
Nov 28 10:03:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:44.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:44.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:03:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:44.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:03:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:44.273 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:03:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:44.957 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:44.960 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:03:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:44.961 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:03:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e136 e136: 6 total, 6 up, 6 in
Nov 28 10:03:45 np0005538515.localdomain ceph-mon[301134]: pgmap v215: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 2.5 MiB/s wr, 99 op/s
Nov 28 10:03:45 np0005538515.localdomain ceph-mon[301134]: osdmap e136: 6 total, 6 up, 6 in
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.257 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.259 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.260 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:03:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:03:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3641885735' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.699 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:03:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v217: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.2 MiB/s wr, 86 op/s
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.899 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.901 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11582MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.901 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:03:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:45.902 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.009 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.010 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.033 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:03:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2196630115' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3685964475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3641885735' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1044460926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.486 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.491 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.506 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.510 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:03:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:46.511 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:03:46 np0005538515.localdomain systemd[1]: tmp-crun.wMI6yU.mount: Deactivated successfully.
Nov 28 10:03:46 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:03:46 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:03:46 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:03:46 np0005538515.localdomain podman[312610]: 2025-11-28 10:03:46.650632633 +0000 UTC m=+0.062359043 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:03:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3438130315' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3438130315' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: pgmap v217: 177 pgs: 177 active+clean; 161 MiB data, 794 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.2 MiB/s wr, 86 op/s
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1881536497' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1619156998' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3438130315' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3438130315' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:03:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e137 e137: 6 total, 6 up, 6 in
Nov 28 10:03:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:47.138 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:47.395 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:47.507 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v219: 177 pgs: 177 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 15 MiB/s wr, 161 op/s
Nov 28 10:03:48 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:48.081 2 INFO neutron.agent.securitygroups_rpc [None req-d103e3e5-6a2c-4d52-97a2-9ed0e9f72fa6 595b5cbed3764c7a95b0ab3634e5becb 8c66e098e4fb4a349dc2bb4293454135 - - default default] Security group member updated ['f75eb612-183d-4664-84c2-1d15534e163f']
Nov 28 10:03:48 np0005538515.localdomain ceph-mon[301134]: osdmap e137: 6 total, 6 up, 6 in
Nov 28 10:03:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e138 e138: 6 total, 6 up, 6 in
Nov 28 10:03:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:48.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:03:48 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:03:48.293 2 INFO neutron.agent.securitygroups_rpc [None req-8df05b65-915d-4be4-a7dc-9f54beb052e9 b286c38dfd0e4889806c62c7b4b9ee98 50a1392ce96c4024bcd36a3df403ca29 - - default default] Security group member updated ['b372bb98-860c-4571-936b-bf08ecbd647d']
Nov 28 10:03:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:48.432 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:49 np0005538515.localdomain ceph-mon[301134]: pgmap v219: 177 pgs: 177 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 15 MiB/s wr, 161 op/s
Nov 28 10:03:49 np0005538515.localdomain ceph-mon[301134]: osdmap e138: 6 total, 6 up, 6 in
Nov 28 10:03:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v221: 177 pgs: 177 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 16 MiB/s wr, 101 op/s
Nov 28 10:03:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:49.963 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:03:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e139 e139: 6 total, 6 up, 6 in
Nov 28 10:03:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:50.847 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:03:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:50.848 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:03:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:03:50.848 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:03:51 np0005538515.localdomain ceph-mon[301134]: pgmap v221: 177 pgs: 177 active+clean; 257 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 16 MiB/s wr, 101 op/s
Nov 28 10:03:51 np0005538515.localdomain ceph-mon[301134]: osdmap e139: 6 total, 6 up, 6 in
Nov 28 10:03:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 131 KiB/s rd, 16 MiB/s wr, 185 op/s
Nov 28 10:03:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:52.397 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:52 np0005538515.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 10:03:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:03:52 np0005538515.localdomain podman[312631]: 2025-11-28 10:03:52.56278715 +0000 UTC m=+0.081202719 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7)
Nov 28 10:03:52 np0005538515.localdomain podman[312631]: 2025-11-28 10:03:52.580334818 +0000 UTC m=+0.098750357 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public)
Nov 28 10:03:52 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:03:53 np0005538515.localdomain ceph-mon[301134]: pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 131 KiB/s rd, 16 MiB/s wr, 185 op/s
Nov 28 10:03:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:53.478 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v224: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 5.0 KiB/s wr, 76 op/s
Nov 28 10:03:55 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e140 e140: 6 total, 6 up, 6 in
Nov 28 10:03:55 np0005538515.localdomain ceph-mon[301134]: pgmap v224: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 5.0 KiB/s wr, 76 op/s
Nov 28 10:03:55 np0005538515.localdomain ceph-mon[301134]: osdmap e140: 6 total, 6 up, 6 in
Nov 28 10:03:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 4.4 KiB/s wr, 67 op/s
Nov 28 10:03:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:03:57 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e141 e141: 6 total, 6 up, 6 in
Nov 28 10:03:57 np0005538515.localdomain ceph-mon[301134]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 4.4 KiB/s wr, 67 op/s
Nov 28 10:03:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:57.401 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:03:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:03:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:03:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 4.3 KiB/s wr, 70 op/s
Nov 28 10:03:58 np0005538515.localdomain ceph-mon[301134]: osdmap e141: 6 total, 6 up, 6 in
Nov 28 10:03:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e142 e142: 6 total, 6 up, 6 in
Nov 28 10:03:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:03:58.480 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:03:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:03:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:03:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:03:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:03:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:03:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19197 "" "Go-http-client/1.1"
Nov 28 10:03:59 np0005538515.localdomain ceph-mon[301134]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 4.3 KiB/s wr, 70 op/s
Nov 28 10:03:59 np0005538515.localdomain ceph-mon[301134]: osdmap e142: 6 total, 6 up, 6 in
Nov 28 10:03:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s rd, 0 B/s wr, 5 op/s
Nov 28 10:04:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:00.034 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:59Z, description=, device_id=d3e54e1b-bd42-4183-a535-29739c2a7728, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4f6550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4f6f70>], id=d74da118-60dc-4dc3-9894-6ab138032f68, ip_allocation=immediate, mac_address=fa:16:3e:6f:c6:cd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1486, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:03:59Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e143 e143: 6 total, 6 up, 6 in
Nov 28 10:04:00 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:04:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:00 np0005538515.localdomain podman[312669]: 2025-11-28 10:04:00.2477203 +0000 UTC m=+0.044371911 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:00 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:00.484 261346 INFO neutron.agent.dhcp.agent [None req-9459765b-c1a8-41c2-959b-593c843623f6 - - - - - -] DHCP configuration for ports {'d74da118-60dc-4dc3-9894-6ab138032f68'} is completed
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.626 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:04:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:04:01 np0005538515.localdomain ceph-mon[301134]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s rd, 0 B/s wr, 5 op/s
Nov 28 10:04:01 np0005538515.localdomain ceph-mon[301134]: osdmap e143: 6 total, 6 up, 6 in
Nov 28 10:04:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e144 e144: 6 total, 6 up, 6 in
Nov 28 10:04:01 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:01.492 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:00Z, description=, device_id=2783a834-568f-4ded-99ee-ece5c2dbdd83, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bf70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce61bd30>], id=35238ed1-dbed-4cd4-89e7-df96fbeaaa2f, ip_allocation=immediate, mac_address=fa:16:3e:34:e6:75, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1490, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:01Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:01 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:04:01 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:01 np0005538515.localdomain podman[312706]: 2025-11-28 10:04:01.716722993 +0000 UTC m=+0.064550079 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:04:01 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 4.3 KiB/s wr, 74 op/s
Nov 28 10:04:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:01 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:01.927 261346 INFO neutron.agent.dhcp.agent [None req-e26cf6be-c645-4a00-83aa-f00344c570b6 - - - - - -] DHCP configuration for ports {'35238ed1-dbed-4cd4-89e7-df96fbeaaa2f'} is completed
Nov 28 10:04:02 np0005538515.localdomain ceph-mon[301134]: osdmap e144: 6 total, 6 up, 6 in
Nov 28 10:04:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:02.403 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:04:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:04:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:04:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:04:02 np0005538515.localdomain systemd[1]: tmp-crun.EUjUqz.mount: Deactivated successfully.
Nov 28 10:04:02 np0005538515.localdomain podman[312727]: 2025-11-28 10:04:02.998705823 +0000 UTC m=+0.094520018 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:04:03 np0005538515.localdomain podman[312726]: 2025-11-28 10:04:03.045440636 +0000 UTC m=+0.146538252 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:04:03 np0005538515.localdomain podman[312727]: 2025-11-28 10:04:03.060530058 +0000 UTC m=+0.156344213 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 28 10:04:03 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:04:03 np0005538515.localdomain podman[312726]: 2025-11-28 10:04:03.082542363 +0000 UTC m=+0.183639969 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:03 np0005538515.localdomain podman[312728]: 2025-11-28 10:04:03.103446864 +0000 UTC m=+0.195614836 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:03.144 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:03 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:04:03 np0005538515.localdomain podman[312735]: 2025-11-28 10:04:03.151838467 +0000 UTC m=+0.237924263 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:04:03 np0005538515.localdomain podman[312735]: 2025-11-28 10:04:03.162451192 +0000 UTC m=+0.248536988 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:04:03 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:04:03 np0005538515.localdomain podman[312728]: 2025-11-28 10:04:03.188094528 +0000 UTC m=+0.280262510 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:04:03 np0005538515.localdomain ceph-mon[301134]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 4.3 KiB/s wr, 74 op/s
Nov 28 10:04:03 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:04:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:04:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2743022339' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:04:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2743022339' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:03.481 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Nov 28 10:04:04 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2743022339' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:04 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2743022339' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:04.765 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:03Z, description=, device_id=e1d6ee65-6055-40c3-8545-a5afe51532af, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ceeea070>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ceeea100>], id=c00eeda0-afa4-4e3d-8ef8-f30a2bfd8639, ip_allocation=immediate, mac_address=fa:16:3e:4f:12:57, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1503, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:04Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:04 np0005538515.localdomain systemd[1]: tmp-crun.GrKOxL.mount: Deactivated successfully.
Nov 28 10:04:05 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:05.066 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:05 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:05.069 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:05 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:05.071 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:05 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:05.072 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[a09484e7-5083-45b5-b603-7d1076454c2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:05 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:04:05 np0005538515.localdomain podman[312825]: 2025-11-28 10:04:05.294995041 +0000 UTC m=+0.371985552 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:05 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:05 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e145 e145: 6 total, 6 up, 6 in
Nov 28 10:04:05 np0005538515.localdomain ceph-mon[301134]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Nov 28 10:04:05 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:05.537 261346 INFO neutron.agent.dhcp.agent [None req-ad746a3d-cc91-4303-885c-6820f784f0e3 - - - - - -] DHCP configuration for ports {'c00eeda0-afa4-4e3d-8ef8-f30a2bfd8639'} is completed
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:04:05
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['images', 'volumes', 'backups', '.mgr', 'vms', 'manila_data', 'manila_metadata']
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.3631525683975433e-06 of space, bias 1.0, pg target 0.0002721761294900428 quantized to 32 (current 32)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.001953125 quantized to 16 (current 16)
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:04:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:04:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:04:05 np0005538515.localdomain podman[312845]: 2025-11-28 10:04:05.98899071 +0000 UTC m=+0.092235888 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:04:06 np0005538515.localdomain podman[312845]: 2025-11-28 10:04:06.025589852 +0000 UTC m=+0.128834990 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:04:06 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:04:06 np0005538515.localdomain ceph-mon[301134]: osdmap e145: 6 total, 6 up, 6 in
Nov 28 10:04:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:07 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:04:07 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:07 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:07 np0005538515.localdomain podman[312884]: 2025-11-28 10:04:07.103245139 +0000 UTC m=+0.060943238 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e146 e146: 6 total, 6 up, 6 in
Nov 28 10:04:07 np0005538515.localdomain ceph-mon[301134]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Nov 28 10:04:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:07.407 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:07 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:07.431 2 INFO neutron.agent.securitygroups_rpc [None req-f8d4b801-af07-4edf-8fd0-12384366c126 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 4.6 KiB/s wr, 103 op/s
Nov 28 10:04:08 np0005538515.localdomain ceph-mon[301134]: osdmap e146: 6 total, 6 up, 6 in
Nov 28 10:04:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:08.485 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:09 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:09.211 2 INFO neutron.agent.securitygroups_rpc [None req-f248108c-11ce-43fd-804f-455d486d1048 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:09 np0005538515.localdomain ceph-mon[301134]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 4.6 KiB/s wr, 103 op/s
Nov 28 10:04:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e147 e147: 6 total, 6 up, 6 in
Nov 28 10:04:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 1.7 KiB/s wr, 55 op/s
Nov 28 10:04:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:04:09 np0005538515.localdomain podman[312906]: 2025-11-28 10:04:09.980619865 +0000 UTC m=+0.082727736 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:04:09 np0005538515.localdomain podman[312906]: 2025-11-28 10:04:09.998993648 +0000 UTC m=+0.101101519 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:04:10 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:04:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:10.182 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:10 np0005538515.localdomain ceph-mon[301134]: osdmap e147: 6 total, 6 up, 6 in
Nov 28 10:04:10 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:10 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:10 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:04:10 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:10 np0005538515.localdomain podman[312941]: 2025-11-28 10:04:10.682175417 +0000 UTC m=+0.057534105 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:10 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:11 np0005538515.localdomain ceph-mon[301134]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 1.7 KiB/s wr, 55 op/s
Nov 28 10:04:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v241: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 3.6 KiB/s wr, 106 op/s
Nov 28 10:04:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:12.411 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2186460897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2186460897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: pgmap v241: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 3.6 KiB/s wr, 106 op/s
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2186460897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2186460897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:13.514 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v242: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 2.9 KiB/s wr, 85 op/s
Nov 28 10:04:13 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:13.946 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:13 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:13.949 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:13 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:13.951 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:13 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:13.952 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[ac807c90-3bbd-4357-8875-0873d0af70bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:14 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:14.149 261346 INFO neutron.agent.linux.ip_lib [None req-777cdef2-7418-4a12-a456-c55ffec90362 - - - - - -] Device tap31ecb95a-12 cannot be used as it has no MAC address
Nov 28 10:04:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:14.170 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538515.localdomain kernel: device tap31ecb95a-12 entered promiscuous mode
Nov 28 10:04:14 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:14Z|00091|binding|INFO|Claiming lport 31ecb95a-127f-4dbe-a0de-1dce5207aadf for this chassis.
Nov 28 10:04:14 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:14Z|00092|binding|INFO|31ecb95a-127f-4dbe-a0de-1dce5207aadf: Claiming unknown
Nov 28 10:04:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:14.177 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324254.1784] manager: (tap31ecb95a-12): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Nov 28 10:04:14 np0005538515.localdomain systemd-udevd[312972]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:14 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:14.198 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-3b7330b6-a04d-491d-86c4-bd4c5d42920c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b7330b6-a04d-491d-86c4-bd4c5d42920c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a4e44b4-7eb2-420d-aa93-9cf50a2ed56e, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=31ecb95a-127f-4dbe-a0de-1dce5207aadf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:14 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:14.200 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 31ecb95a-127f-4dbe-a0de-1dce5207aadf in datapath 3b7330b6-a04d-491d-86c4-bd4c5d42920c bound to our chassis
Nov 28 10:04:14 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:14.202 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3b7330b6-a04d-491d-86c4-bd4c5d42920c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:14 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:14.203 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[de1e1271-5989-49ea-840a-a572d1d40447]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:14Z|00093|binding|INFO|Setting lport 31ecb95a-127f-4dbe-a0de-1dce5207aadf ovn-installed in OVS
Nov 28 10:04:14 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:14Z|00094|binding|INFO|Setting lport 31ecb95a-127f-4dbe-a0de-1dce5207aadf up in Southbound
Nov 28 10:04:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:14.215 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap31ecb95a-12: No such device
Nov 28 10:04:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:14.250 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:14.282 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:14.846 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:14 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:04:14 np0005538515.localdomain podman[313039]: 2025-11-28 10:04:14.876665201 +0000 UTC m=+0.059855725 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:04:14 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:14 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 e148: 6 total, 6 up, 6 in
Nov 28 10:04:15 np0005538515.localdomain podman[313081]: 
Nov 28 10:04:15 np0005538515.localdomain podman[313081]: 2025-11-28 10:04:15.237406777 +0000 UTC m=+0.089010969 container create a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:15 np0005538515.localdomain systemd[1]: Started libpod-conmon-a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b.scope.
Nov 28 10:04:15 np0005538515.localdomain podman[313081]: 2025-11-28 10:04:15.19345305 +0000 UTC m=+0.045057262 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:15 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:15 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55e14a9eee3790940011a5f6fec7b205b35f5a18bfb035012dcc3d555698e72f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:15 np0005538515.localdomain podman[313081]: 2025-11-28 10:04:15.316402238 +0000 UTC m=+0.168006420 container init a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:15 np0005538515.localdomain podman[313081]: 2025-11-28 10:04:15.325837828 +0000 UTC m=+0.177441990 container start a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:04:15 np0005538515.localdomain dnsmasq[313099]: started, version 2.85 cachesize 150
Nov 28 10:04:15 np0005538515.localdomain dnsmasq[313099]: DNS service limited to local subnets
Nov 28 10:04:15 np0005538515.localdomain dnsmasq[313099]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:15 np0005538515.localdomain dnsmasq[313099]: warning: no upstream servers configured
Nov 28 10:04:15 np0005538515.localdomain dnsmasq-dhcp[313099]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:04:15 np0005538515.localdomain dnsmasq[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/addn_hosts - 0 addresses
Nov 28 10:04:15 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/host
Nov 28 10:04:15 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/opts
Nov 28 10:04:15 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:15.500 261346 INFO neutron.agent.dhcp.agent [None req-6dde239a-04da-4ef2-9edd-b49e404f417d - - - - - -] DHCP configuration for ports {'25b5af18-6a6e-4029-a01f-7975d7d01d4f'} is completed
Nov 28 10:04:15 np0005538515.localdomain ceph-mon[301134]: pgmap v242: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 2.9 KiB/s wr, 85 op/s
Nov 28 10:04:15 np0005538515.localdomain ceph-mon[301134]: osdmap e148: 6 total, 6 up, 6 in
Nov 28 10:04:15 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:15.719 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:15Z, description=, device_id=a11f5dcb-dffb-46ad-be99-9a47466c1a1b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5976a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce597430>], id=a4c2e21e-851b-46c3-b929-7611c9033400, ip_allocation=immediate, mac_address=fa:16:3e:7e:eb:ef, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1571, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:15Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v244: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.6 KiB/s wr, 43 op/s
Nov 28 10:04:15 np0005538515.localdomain podman[313116]: 2025-11-28 10:04:15.935957447 +0000 UTC m=+0.052846961 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:04:15 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:04:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:16 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:16.170 261346 INFO neutron.agent.dhcp.agent [None req-31d574d1-422c-4246-9779-02f1ccf2b076 - - - - - -] DHCP configuration for ports {'a4c2e21e-851b-46c3-b929-7611c9033400'} is completed
Nov 28 10:04:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:17 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:17.268 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:17 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:17.270 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:17 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:17.272 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:17 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:17.273 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[e9da2519-8502-496e-a1f7-cbdc569c6136]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:17.403 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:17Z, description=, device_id=a16605f7-3b57-4ca7-98cd-dfd3dddf9b38, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce56b2b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce56b250>], id=55f31bf4-45be-4c85-bb79-87c25ca80e96, ip_allocation=immediate, mac_address=fa:16:3e:5d:96:ee, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:10Z, description=, dns_domain=, id=3b7330b6-a04d-491d-86c4-bd4c5d42920c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2131176874, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9850, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1534, status=ACTIVE, subnets=['4ebd8429-0bf9-4a3b-9202-8b21c367bbb8'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:12Z, vlan_transparent=None, network_id=3b7330b6-a04d-491d-86c4-bd4c5d42920c, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1578, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:17Z on network 3b7330b6-a04d-491d-86c4-bd4c5d42920c
Nov 28 10:04:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:17.414 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:17 np0005538515.localdomain ceph-mon[301134]: pgmap v244: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.6 KiB/s wr, 43 op/s
Nov 28 10:04:17 np0005538515.localdomain dnsmasq[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/addn_hosts - 1 addresses
Nov 28 10:04:17 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/host
Nov 28 10:04:17 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/opts
Nov 28 10:04:17 np0005538515.localdomain podman[313152]: 2025-11-28 10:04:17.629678137 +0000 UTC m=+0.059218526 container kill a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v245: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 1.6 KiB/s wr, 42 op/s
Nov 28 10:04:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:17.855 261346 INFO neutron.agent.dhcp.agent [None req-88aab17e-875d-47cc-8987-2d6a9cb3f633 - - - - - -] DHCP configuration for ports {'55f31bf4-45be-4c85-bb79-87c25ca80e96'} is completed
Nov 28 10:04:18 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:18.318 2 INFO neutron.agent.securitygroups_rpc [None req-5939fc78-1573-4689-b1d7-9426dbeeb10b 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:18 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:18.431 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:17Z, description=, device_id=a16605f7-3b57-4ca7-98cd-dfd3dddf9b38, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce627d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce627fd0>], id=55f31bf4-45be-4c85-bb79-87c25ca80e96, ip_allocation=immediate, mac_address=fa:16:3e:5d:96:ee, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:10Z, description=, dns_domain=, id=3b7330b6-a04d-491d-86c4-bd4c5d42920c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2131176874, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9850, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1534, status=ACTIVE, subnets=['4ebd8429-0bf9-4a3b-9202-8b21c367bbb8'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:12Z, vlan_transparent=None, network_id=3b7330b6-a04d-491d-86c4-bd4c5d42920c, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1578, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:17Z on network 3b7330b6-a04d-491d-86c4-bd4c5d42920c
Nov 28 10:04:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:18.517 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:18 np0005538515.localdomain ceph-mon[301134]: pgmap v245: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 1.6 KiB/s wr, 42 op/s
Nov 28 10:04:18 np0005538515.localdomain dnsmasq[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/addn_hosts - 1 addresses
Nov 28 10:04:18 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/host
Nov 28 10:04:18 np0005538515.localdomain podman[313188]: 2025-11-28 10:04:18.765717513 +0000 UTC m=+0.064140226 container kill a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:18 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/opts
Nov 28 10:04:19 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:19.020 261346 INFO neutron.agent.dhcp.agent [None req-cd745ce8-7227-41a9-8592-32d9d2cb63e1 - - - - - -] DHCP configuration for ports {'55f31bf4-45be-4c85-bb79-87c25ca80e96'} is completed
Nov 28 10:04:19 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:19.194 2 INFO neutron.agent.securitygroups_rpc [None req-582a65ec-d5d3-451f-981d-4b6bb2c1b94e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.3 KiB/s wr, 35 op/s
Nov 28 10:04:20 np0005538515.localdomain sudo[313209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:04:20 np0005538515.localdomain sudo[313209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:04:20 np0005538515.localdomain sudo[313209]: pam_unix(sudo:session): session closed for user root
Nov 28 10:04:20 np0005538515.localdomain sudo[313227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:04:20 np0005538515.localdomain sudo[313227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:04:20 np0005538515.localdomain sudo[313227]: pam_unix(sudo:session): session closed for user root
Nov 28 10:04:20 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:20.891 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:20 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:20.893 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:20 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:20.895 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:20 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:20.896 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[5f91a866-546e-482b-9c29-63bfafd861e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:04:20 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:04:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:04:20 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:04:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:04:20 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 1b5136cd-8db2-4cef-82dc-2b15d47f84f4 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:04:20 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 1b5136cd-8db2-4cef-82dc-2b15d47f84f4 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:04:20 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 1b5136cd-8db2-4cef-82dc-2b15d47f84f4 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.3 KiB/s wr, 35 op/s
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:04:21 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:21.275 2 INFO neutron.agent.securitygroups_rpc [None req-2bb5c88d-1a1c-4245-b22f-59b37a9a0aaf 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:21 np0005538515.localdomain sudo[313290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:04:21 np0005538515.localdomain sudo[313290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:04:21 np0005538515.localdomain sudo[313290]: pam_unix(sudo:session): session closed for user root
Nov 28 10:04:21 np0005538515.localdomain dnsmasq[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/addn_hosts - 0 addresses
Nov 28 10:04:21 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/host
Nov 28 10:04:21 np0005538515.localdomain systemd[1]: tmp-crun.Lav7sM.mount: Deactivated successfully.
Nov 28 10:04:21 np0005538515.localdomain dnsmasq-dhcp[313099]: read /var/lib/neutron/dhcp/3b7330b6-a04d-491d-86c4-bd4c5d42920c/opts
Nov 28 10:04:21 np0005538515.localdomain podman[313311]: 2025-11-28 10:04:21.352588407 +0000 UTC m=+0.065013654 container kill a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:04:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:21Z|00095|binding|INFO|Releasing lport 31ecb95a-127f-4dbe-a0de-1dce5207aadf from this chassis (sb_readonly=0)
Nov 28 10:04:21 np0005538515.localdomain kernel: device tap31ecb95a-12 left promiscuous mode
Nov 28 10:04:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:21.517 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:21Z|00096|binding|INFO|Setting lport 31ecb95a-127f-4dbe-a0de-1dce5207aadf down in Southbound
Nov 28 10:04:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:21.527 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-3b7330b6-a04d-491d-86c4-bd4c5d42920c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b7330b6-a04d-491d-86c4-bd4c5d42920c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a4e44b4-7eb2-420d-aa93-9cf50a2ed56e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=31ecb95a-127f-4dbe-a0de-1dce5207aadf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:21.529 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 31ecb95a-127f-4dbe-a0de-1dce5207aadf in datapath 3b7330b6-a04d-491d-86c4-bd4c5d42920c unbound from our chassis
Nov 28 10:04:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:21.531 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3b7330b6-a04d-491d-86c4-bd4c5d42920c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:21.532 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[616eb776-6531-4bff-96ca-98a6f31155aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:21.548 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v247: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:22.416 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:22 np0005538515.localdomain dnsmasq[313099]: exiting on receipt of SIGTERM
Nov 28 10:04:22 np0005538515.localdomain podman[313350]: 2025-11-28 10:04:22.459201532 +0000 UTC m=+0.071047308 container kill a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:04:22 np0005538515.localdomain systemd[1]: libpod-a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b.scope: Deactivated successfully.
Nov 28 10:04:22 np0005538515.localdomain podman[313362]: 2025-11-28 10:04:22.524310918 +0000 UTC m=+0.053916124 container died a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:22 np0005538515.localdomain podman[313362]: 2025-11-28 10:04:22.559749814 +0000 UTC m=+0.089354910 container cleanup a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:22 np0005538515.localdomain systemd[1]: libpod-conmon-a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b.scope: Deactivated successfully.
Nov 28 10:04:22 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:22.619 2 INFO neutron.agent.securitygroups_rpc [None req-038c15bb-00b9-42a6-bcc5-a72acb379335 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:22 np0005538515.localdomain podman[313369]: 2025-11-28 10:04:22.620387572 +0000 UTC m=+0.135543615 container remove a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3b7330b6-a04d-491d-86c4-bd4c5d42920c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:04:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:22.666 261346 INFO neutron.agent.dhcp.agent [None req-8c614c6e-ad9e-421f-861d-62c291826668 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:22.667 261346 INFO neutron.agent.dhcp.agent [None req-8c614c6e-ad9e-421f-861d-62c291826668 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:04:22 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:04:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:22.979 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:22 np0005538515.localdomain podman[313391]: 2025-11-28 10:04:22.982571333 +0000 UTC m=+0.089428252 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Nov 28 10:04:23 np0005538515.localdomain podman[313391]: 2025-11-28 10:04:23.004497575 +0000 UTC m=+0.111354464 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Nov 28 10:04:23 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:04:23 np0005538515.localdomain ceph-mon[301134]: pgmap v247: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:23.086 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:23.088 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:23.090 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:23.090 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5618c0-3b02-4734-844f-deae7f75386f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-55e14a9eee3790940011a5f6fec7b205b35f5a18bfb035012dcc3d555698e72f-merged.mount: Deactivated successfully.
Nov 28 10:04:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2fba2f1895c02415423b3a2d1514bc7c1d3e2d21a355465422aa21680a66b5b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:04:23 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d3b7330b6\x2da04d\x2d491d\x2d86c4\x2dbd4c5d42920c.mount: Deactivated successfully.
Nov 28 10:04:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:23.520 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:24 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:24.859 2 INFO neutron.agent.securitygroups_rpc [None req-5be6ae81-d286-424b-afd6-2b6865c77664 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:25 np0005538515.localdomain ceph-mon[301134]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v249: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:25 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:04:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:04:26 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:26.169 2 INFO neutron.agent.securitygroups_rpc [None req-f516662f-fbee-4914-9fae-83fcd2f7d639 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:26 np0005538515.localdomain ceph-mon[301134]: pgmap v249: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:04:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:27.419 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:04:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:04:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v250: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:28 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:28.467 2 INFO neutron.agent.securitygroups_rpc [None req-35c0af25-6cf6-4373-be02-f6ff138ff337 e7c9d49fbf5f41059b8d32426e1740a6 517e4cc7e34e4fe7b49313300e5db635 - - default default] Security group member updated ['d7bc25b0-2d77-4fba-a003-707609b573d6']
Nov 28 10:04:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:28.548 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:28 np0005538515.localdomain ceph-mon[301134]: pgmap v250: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:04:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:04:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:04:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:04:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:04:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19199 "" "Go-http-client/1.1"
Nov 28 10:04:29 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:29.317 2 INFO neutron.agent.securitygroups_rpc [None req-535fb80e-1678-409f-9e3d-b2eaa82a20b5 e7c9d49fbf5f41059b8d32426e1740a6 517e4cc7e34e4fe7b49313300e5db635 - - default default] Security group member updated ['d7bc25b0-2d77-4fba-a003-707609b573d6']
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.610 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.611 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.614 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.615 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[0e921f9a-6bf2-4a4c-9ba0-70834c11d331]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4123920278' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:04:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4123920278' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:04:29 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:29.916 261346 INFO neutron.agent.linux.ip_lib [None req-a547b6d6-8a29-4867-b7b0-701bdc0462de - - - - - -] Device tap48dc601b-9d cannot be used as it has no MAC address
Nov 28 10:04:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:29.940 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:29 np0005538515.localdomain kernel: device tap48dc601b-9d entered promiscuous mode
Nov 28 10:04:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:29Z|00097|binding|INFO|Claiming lport 48dc601b-9dc3-45c9-9b98-4c07536959fd for this chassis.
Nov 28 10:04:29 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324269.9498] manager: (tap48dc601b-9d): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Nov 28 10:04:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:29.949 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:29Z|00098|binding|INFO|48dc601b-9dc3-45c9-9b98-4c07536959fd: Claiming unknown
Nov 28 10:04:29 np0005538515.localdomain systemd-udevd[313424]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.960 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-d81fff26-c58f-4d58-a4c3-379fa25c0b56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d81fff26-c58f-4d58-a4c3-379fa25c0b56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c8780a7-df1b-4611-af44-f100aaf1ce7e, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=48dc601b-9dc3-45c9-9b98-4c07536959fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.962 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 48dc601b-9dc3-45c9-9b98-4c07536959fd in datapath d81fff26-c58f-4d58-a4c3-379fa25c0b56 bound to our chassis
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.963 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d81fff26-c58f-4d58-a4c3-379fa25c0b56 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:29.964 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e8850e-d148-473e-8eb6-15ccd6d7a09c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:29 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:29Z|00099|binding|INFO|Setting lport 48dc601b-9dc3-45c9-9b98-4c07536959fd ovn-installed in OVS
Nov 28 10:04:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:29Z|00100|binding|INFO|Setting lport 48dc601b-9dc3-45c9-9b98-4c07536959fd up in Southbound
Nov 28 10:04:29 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:29.992 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:29 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:30 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:30 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:30 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:30 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:30 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap48dc601b-9d: No such device
Nov 28 10:04:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:30.029 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:30.061 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:30 np0005538515.localdomain podman[313495]: 
Nov 28 10:04:30 np0005538515.localdomain podman[313495]: 2025-11-28 10:04:30.926876532 +0000 UTC m=+0.078280930 container create db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:04:30 np0005538515.localdomain systemd[1]: Started libpod-conmon-db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24.scope.
Nov 28 10:04:30 np0005538515.localdomain podman[313495]: 2025-11-28 10:04:30.883287216 +0000 UTC m=+0.034691614 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:30 np0005538515.localdomain systemd[1]: tmp-crun.eOnXfx.mount: Deactivated successfully.
Nov 28 10:04:30 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:30 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54a4e230fd4a13b47ad5b4e150539a4b42d7ec5ed223aa986059699bf0cd00cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:31 np0005538515.localdomain podman[313495]: 2025-11-28 10:04:31.003588163 +0000 UTC m=+0.154992551 container init db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:04:31 np0005538515.localdomain podman[313495]: 2025-11-28 10:04:31.010556306 +0000 UTC m=+0.161960694 container start db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:04:31 np0005538515.localdomain dnsmasq[313513]: started, version 2.85 cachesize 150
Nov 28 10:04:31 np0005538515.localdomain dnsmasq[313513]: DNS service limited to local subnets
Nov 28 10:04:31 np0005538515.localdomain dnsmasq[313513]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:31 np0005538515.localdomain dnsmasq[313513]: warning: no upstream servers configured
Nov 28 10:04:31 np0005538515.localdomain dnsmasq-dhcp[313513]: DHCP, static leases only on 10.101.0.0, lease time 1d
Nov 28 10:04:31 np0005538515.localdomain dnsmasq[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/addn_hosts - 0 addresses
Nov 28 10:04:31 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/host
Nov 28 10:04:31 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/opts
Nov 28 10:04:31 np0005538515.localdomain ceph-mon[301134]: pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:31 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:31.153 261346 INFO neutron.agent.dhcp.agent [None req-2ace9c18-2446-4f1e-a298-37ba949f937e - - - - - -] DHCP configuration for ports {'a2ca468b-152f-4318-90e7-ce780e265076'} is completed
Nov 28 10:04:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:32.422 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:32 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:32.901 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:31Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e0d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e0c70>], id=8d7574a9-ec3d-435a-81c6-c889e68005e5, ip_allocation=immediate, mac_address=fa:16:3e:3c:60:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:27Z, description=, dns_domain=, id=d81fff26-c58f-4d58-a4c3-379fa25c0b56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1315557878, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16174, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1641, status=ACTIVE, subnets=['1ede51d8-16b3-470f-a7f1-54260e168ed4'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:28Z, vlan_transparent=None, network_id=d81fff26-c58f-4d58-a4c3-379fa25c0b56, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1666, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:32Z on network d81fff26-c58f-4d58-a4c3-379fa25c0b56
Nov 28 10:04:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:33.055 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:33.057 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:33.061 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:33 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:33.062 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[7baaed11-b6a6-4a18-ae03-b8bcba1964e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:33 np0005538515.localdomain ceph-mon[301134]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:33 np0005538515.localdomain dnsmasq[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/addn_hosts - 1 addresses
Nov 28 10:04:33 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/host
Nov 28 10:04:33 np0005538515.localdomain podman[313532]: 2025-11-28 10:04:33.123881026 +0000 UTC m=+0.056305176 container kill db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:33 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/opts
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: tmp-crun.ulCl0w.mount: Deactivated successfully.
Nov 28 10:04:33 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:33.175 2 INFO neutron.agent.securitygroups_rpc [None req-e7bb8635-ea1b-4f3c-951e-d95d847ad39e 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:33 np0005538515.localdomain podman[313547]: 2025-11-28 10:04:33.250686502 +0000 UTC m=+0.098563891 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:04:33 np0005538515.localdomain podman[313572]: 2025-11-28 10:04:33.361696265 +0000 UTC m=+0.088385951 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:04:33 np0005538515.localdomain podman[313547]: 2025-11-28 10:04:33.365354287 +0000 UTC m=+0.213231656 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:04:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:33.412 261346 INFO neutron.agent.dhcp.agent [None req-8536f9ea-c6b8-4d24-a7fc-eb3b957f7c1a - - - - - -] DHCP configuration for ports {'8d7574a9-ec3d-435a-81c6-c889e68005e5'} is completed
Nov 28 10:04:33 np0005538515.localdomain podman[313576]: 2025-11-28 10:04:33.420012782 +0000 UTC m=+0.142169248 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:04:33 np0005538515.localdomain podman[313576]: 2025-11-28 10:04:33.425365215 +0000 UTC m=+0.147521671 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:04:33 np0005538515.localdomain podman[313574]: 2025-11-28 10:04:33.339568137 +0000 UTC m=+0.064282192 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:04:33 np0005538515.localdomain podman[313572]: 2025-11-28 10:04:33.450126585 +0000 UTC m=+0.176816321 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:04:33 np0005538515.localdomain podman[313574]: 2025-11-28 10:04:33.471350225 +0000 UTC m=+0.196064210 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:33 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:04:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:33.550 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:34 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:34.528 2 INFO neutron.agent.securitygroups_rpc [None req-406cfd0d-88dd-4d36-9649-40665b36b8d2 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:35 np0005538515.localdomain ceph-mon[301134]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:35.340 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:31Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce566b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b1e50>], id=8d7574a9-ec3d-435a-81c6-c889e68005e5, ip_allocation=immediate, mac_address=fa:16:3e:3c:60:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:27Z, description=, dns_domain=, id=d81fff26-c58f-4d58-a4c3-379fa25c0b56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1315557878, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16174, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1641, status=ACTIVE, subnets=['1ede51d8-16b3-470f-a7f1-54260e168ed4'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:28Z, vlan_transparent=None, network_id=d81fff26-c58f-4d58-a4c3-379fa25c0b56, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1666, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:32Z on network d81fff26-c58f-4d58-a4c3-379fa25c0b56
Nov 28 10:04:35 np0005538515.localdomain podman[313653]: 2025-11-28 10:04:35.563256199 +0000 UTC m=+0.071969247 container kill db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:04:35 np0005538515.localdomain dnsmasq[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/addn_hosts - 1 addresses
Nov 28 10:04:35 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/host
Nov 28 10:04:35 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/opts
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v254: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:04:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:04:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:35.878 261346 INFO neutron.agent.dhcp.agent [None req-db1f7d68-a318-43ec-9aed-8d8bd4c4ca23 - - - - - -] DHCP configuration for ports {'8d7574a9-ec3d-435a-81c6-c889e68005e5'} is completed
Nov 28 10:04:35 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:35.899 2 INFO neutron.agent.securitygroups_rpc [None req-e5e44e33-1445-4a9a-aa0f-e3e5f136c603 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:36.024 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:36 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:36.675 2 INFO neutron.agent.securitygroups_rpc [None req-260c5785-8a89-47b1-924c-143d50af86e5 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:04:36 np0005538515.localdomain podman[313675]: 2025-11-28 10:04:36.967597129 +0000 UTC m=+0.074603088 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:04:36 np0005538515.localdomain podman[313675]: 2025-11-28 10:04:36.982444063 +0000 UTC m=+0.089449992 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:04:36 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:04:37 np0005538515.localdomain ceph-mon[301134]: pgmap v254: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:37.283 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:37Z, description=, device_id=996ff932-5706-4ef7-9ffc-88689082f05e, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce49b640>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce49b3d0>], id=05a4d4ad-a16f-428e-8f1a-957893a76c09, ip_allocation=immediate, mac_address=fa:16:3e:f8:dd:39, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1713, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:37Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:37.425 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:37 np0005538515.localdomain systemd[1]: tmp-crun.PWIq15.mount: Deactivated successfully.
Nov 28 10:04:37 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:04:37 np0005538515.localdomain podman[313715]: 2025-11-28 10:04:37.520217145 +0000 UTC m=+0.076615089 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:04:37 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:37 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:37.795 261346 INFO neutron.agent.dhcp.agent [None req-4f4dae4a-22fc-4aac-b2fa-f71365457492 - - - - - -] DHCP configuration for ports {'05a4d4ad-a16f-428e-8f1a-957893a76c09'} is completed
Nov 28 10:04:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:38.596 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:38 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:38.899 261346 INFO neutron.agent.linux.ip_lib [None req-aa8a08ff-b3b7-4c94-aa7b-45efc7bbc092 - - - - - -] Device tap0dd1eafc-23 cannot be used as it has no MAC address
Nov 28 10:04:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:38.925 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:38 np0005538515.localdomain kernel: device tap0dd1eafc-23 entered promiscuous mode
Nov 28 10:04:38 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324278.9335] manager: (tap0dd1eafc-23): new Generic device (/org/freedesktop/NetworkManager/Devices/26)
Nov 28 10:04:38 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:38Z|00101|binding|INFO|Claiming lport 0dd1eafc-23dc-4156-9d9f-2142e93fc855 for this chassis.
Nov 28 10:04:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:38.933 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:38 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:38Z|00102|binding|INFO|0dd1eafc-23dc-4156-9d9f-2142e93fc855: Claiming unknown
Nov 28 10:04:38 np0005538515.localdomain systemd-udevd[313746]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:04:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:38.958 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-857a46da-ae2c-48a0-8bc8-b100174874d8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-857a46da-ae2c-48a0-8bc8-b100174874d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6731419-dcc4-4fe4-a3f7-f974e0596a7c, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=0dd1eafc-23dc-4156-9d9f-2142e93fc855) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:38.961 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 0dd1eafc-23dc-4156-9d9f-2142e93fc855 in datapath 857a46da-ae2c-48a0-8bc8-b100174874d8 bound to our chassis
Nov 28 10:04:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:38.963 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 857a46da-ae2c-48a0-8bc8-b100174874d8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:38.965 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6b2330-5104-4a07-a1cf-89fe682852bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:38 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:38Z|00103|binding|INFO|Setting lport 0dd1eafc-23dc-4156-9d9f-2142e93fc855 ovn-installed in OVS
Nov 28 10:04:38 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:04:38Z|00104|binding|INFO|Setting lport 0dd1eafc-23dc-4156-9d9f-2142e93fc855 up in Southbound
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:38.969 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:38 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:39 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0dd1eafc-23: No such device
Nov 28 10:04:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:39.013 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:39.040 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:39 np0005538515.localdomain ceph-mon[301134]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:39 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:39.540 2 INFO neutron.agent.securitygroups_rpc [None req-cb662c54-c616-44d2-81c7-5ec9bf360652 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v256: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:39 np0005538515.localdomain podman[313818]: 
Nov 28 10:04:39 np0005538515.localdomain podman[313818]: 2025-11-28 10:04:39.891948295 +0000 UTC m=+0.098566792 container create 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:04:39 np0005538515.localdomain systemd[1]: Started libpod-conmon-7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618.scope.
Nov 28 10:04:39 np0005538515.localdomain podman[313818]: 2025-11-28 10:04:39.845719117 +0000 UTC m=+0.052337644 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:04:39 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:04:39 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33481160fabcc385d997dc82bdb19e1c1a0e4fb16542a4ec52812ba2f61cc168/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:04:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:40.003 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:40.006 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:04:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:40.010 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:04:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:40.011 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbcb7a2-0a60-4393-ba70-e8013dd6fa77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:04:40 np0005538515.localdomain podman[313818]: 2025-11-28 10:04:40.028289333 +0000 UTC m=+0.234907830 container init 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:04:40 np0005538515.localdomain podman[313818]: 2025-11-28 10:04:40.04124381 +0000 UTC m=+0.247862307 container start 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:40 np0005538515.localdomain dnsmasq[313863]: started, version 2.85 cachesize 150
Nov 28 10:04:40 np0005538515.localdomain dnsmasq[313863]: DNS service limited to local subnets
Nov 28 10:04:40 np0005538515.localdomain dnsmasq[313863]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:04:40 np0005538515.localdomain dnsmasq[313863]: warning: no upstream servers configured
Nov 28 10:04:40 np0005538515.localdomain dnsmasq-dhcp[313863]: DHCP, static leases only on 10.102.0.0, lease time 1d
Nov 28 10:04:40 np0005538515.localdomain dnsmasq[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/addn_hosts - 0 addresses
Nov 28 10:04:40 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/host
Nov 28 10:04:40 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/opts
Nov 28 10:04:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:04:40 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:04:40 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:40 np0005538515.localdomain podman[313852]: 2025-11-28 10:04:40.082979259 +0000 UTC m=+0.073038779 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:04:40 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:40 np0005538515.localdomain podman[313864]: 2025-11-28 10:04:40.175456324 +0000 UTC m=+0.110436996 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:04:40 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:40.181 261346 INFO neutron.agent.dhcp.agent [None req-df0315a2-b00a-4270-9961-99c10f4b5cf1 - - - - - -] DHCP configuration for ports {'56d719ab-3654-408e-baed-b0b1d788646e'} is completed
Nov 28 10:04:40 np0005538515.localdomain podman[313864]: 2025-11-28 10:04:40.21549345 +0000 UTC m=+0.150474082 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:04:40 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:04:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:40.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:40 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:40.908 2 INFO neutron.agent.securitygroups_rpc [None req-019d9e71-83ce-440c-bb58-c6c7df87e29f 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:41 np0005538515.localdomain ceph-mon[301134]: pgmap v256: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:41.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:41.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:41.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:41.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:04:41 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:41.396 2 INFO neutron.agent.securitygroups_rpc [None req-bb6bd82f-3718-4e2c-b707-6af77a5385f7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:42.452 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:42 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:42.453 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:41Z, description=, device_id=e8aba12c-c649-4ffe-b451-637a93bb0e29, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b1a90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b13d0>], id=e30196bb-cb63-4cd5-aac4-f2363fa006be, ip_allocation=immediate, mac_address=fa:16:3e:65:74:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1747, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:41Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:42 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:04:42 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:42 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:42 np0005538515.localdomain podman[313907]: 2025-11-28 10:04:42.692677563 +0000 UTC m=+0.068671366 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:04:42 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:42.719 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:41Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4c0d30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4c0df0>], id=aba80ccc-f5c0-4dbd-b771-fb9b4d07071b, ip_allocation=immediate, mac_address=fa:16:3e:65:bc:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:36Z, description=, dns_domain=, id=857a46da-ae2c-48a0-8bc8-b100174874d8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1918864759, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1538, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['f694aabc-d0eb-4a4e-a738-daa5c1e20a97'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:37Z, vlan_transparent=None, network_id=857a46da-ae2c-48a0-8bc8-b100174874d8, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1753, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:42Z on network 857a46da-ae2c-48a0-8bc8-b100174874d8
Nov 28 10:04:42 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:42.950 261346 INFO neutron.agent.dhcp.agent [None req-e30d9dfa-29fa-48ca-887c-40f4421d6216 - - - - - -] DHCP configuration for ports {'e30196bb-cb63-4cd5-aac4-f2363fa006be'} is completed
Nov 28 10:04:43 np0005538515.localdomain dnsmasq[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/addn_hosts - 1 addresses
Nov 28 10:04:43 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/host
Nov 28 10:04:43 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/opts
Nov 28 10:04:43 np0005538515.localdomain podman[313944]: 2025-11-28 10:04:43.053629195 +0000 UTC m=+0.067107538 container kill 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:04:43 np0005538515.localdomain ceph-mon[301134]: pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Nov 28 10:04:43 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:43.322 261346 INFO neutron.agent.dhcp.agent [None req-7867baef-f20a-431a-97d3-bea38254697a - - - - - -] DHCP configuration for ports {'aba80ccc-f5c0-4dbd-b771-fb9b4d07071b'} is completed
Nov 28 10:04:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:43.625 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v258: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:44.219 2 INFO neutron.agent.securitygroups_rpc [None req-364c5261-76b8-4bbe-a1a4-6cba1a17e718 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:04:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e149 e149: 6 total, 6 up, 6 in
Nov 28 10:04:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:44.660 2 INFO neutron.agent.securitygroups_rpc [None req-32328951-8e5a-4539-abdc-e00522344c39 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:45.159 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:45.203 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:45.202 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:04:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:45.204 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:04:45 np0005538515.localdomain ceph-mon[301134]: pgmap v258: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:45 np0005538515.localdomain ceph-mon[301134]: osdmap e149: 6 total, 6 up, 6 in
Nov 28 10:04:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:45.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:45.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:04:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:45.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:04:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e150 e150: 6 total, 6 up, 6 in
Nov 28 10:04:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:45.257 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:04:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:04:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2309871379' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v261: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:45 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:45.752 2 INFO neutron.agent.securitygroups_rpc [None req-bac2ff8c-e0f4-436a-a2ef-9ffa5731fa74 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:46 np0005538515.localdomain ceph-mon[301134]: osdmap e150: 6 total, 6 up, 6 in
Nov 28 10:04:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2309871379' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1654213529' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1907177804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:46.280 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:41Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce647d60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce647eb0>], id=aba80ccc-f5c0-4dbd-b771-fb9b4d07071b, ip_allocation=immediate, mac_address=fa:16:3e:65:bc:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:36Z, description=, dns_domain=, id=857a46da-ae2c-48a0-8bc8-b100174874d8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1918864759, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1538, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['f694aabc-d0eb-4a4e-a738-daa5c1e20a97'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:37Z, vlan_transparent=None, network_id=857a46da-ae2c-48a0-8bc8-b100174874d8, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1753, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:42Z on network 857a46da-ae2c-48a0-8bc8-b100174874d8
Nov 28 10:04:46 np0005538515.localdomain dnsmasq[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/addn_hosts - 1 addresses
Nov 28 10:04:46 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/host
Nov 28 10:04:46 np0005538515.localdomain podman[313982]: 2025-11-28 10:04:46.493271125 +0000 UTC m=+0.056983808 container kill 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:46 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/opts
Nov 28 10:04:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:46.813 261346 INFO neutron.agent.dhcp.agent [None req-78bf5c70-70d7-43b1-bd7c-bdc110a4b4a3 - - - - - -] DHCP configuration for ports {'aba80ccc-f5c0-4dbd-b771-fb9b4d07071b'} is completed
Nov 28 10:04:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:47.032 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:46Z, description=, device_id=5b709ff1-f495-452b-91d4-f8f4d4d33b79, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce49b700>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce700940>], id=d6b610c6-cfc0-4922-be3e-a3490957a77c, ip_allocation=immediate, mac_address=fa:16:3e:c5:8a:e8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1763, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:46Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:47 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:04:47 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:47 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:47 np0005538515.localdomain podman[314022]: 2025-11-28 10:04:47.257909799 +0000 UTC m=+0.063398553 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.264 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.265 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.266 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.266 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.267 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: pgmap v261: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2706611035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.309351) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287309428, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2642, "num_deletes": 268, "total_data_size": 3875778, "memory_usage": 3940944, "flush_reason": "Manual Compaction"}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287330802, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 2510994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18412, "largest_seqno": 21049, "table_properties": {"data_size": 2500880, "index_size": 6491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 22084, "raw_average_key_size": 21, "raw_value_size": 2480323, "raw_average_value_size": 2455, "num_data_blocks": 275, "num_entries": 1010, "num_filter_entries": 1010, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324137, "oldest_key_time": 1764324137, "file_creation_time": 1764324287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 21531 microseconds, and 7615 cpu microseconds.
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.330881) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 2510994 bytes OK
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.330916) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.333757) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.333792) EVENT_LOG_v1 {"time_micros": 1764324287333782, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.333823) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 3863916, prev total WAL file size 3863916, number of live WAL files 2.
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.335399) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(2452KB)], [27(15MB)]
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287335450, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18631635, "oldest_snapshot_seqno": -1}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12509 keys, 16635286 bytes, temperature: kUnknown
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287444836, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 16635286, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16564092, "index_size": 38837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31301, "raw_key_size": 334644, "raw_average_key_size": 26, "raw_value_size": 16351314, "raw_average_value_size": 1307, "num_data_blocks": 1476, "num_entries": 12509, "num_filter_entries": 12509, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.445200) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 16635286 bytes
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.446832) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 170.2 rd, 152.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 15.4 +0.0 blob) out(15.9 +0.0 blob), read-write-amplify(14.0) write-amplify(6.6) OK, records in: 13056, records dropped: 547 output_compression: NoCompression
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.446855) EVENT_LOG_v1 {"time_micros": 1764324287446844, "job": 14, "event": "compaction_finished", "compaction_time_micros": 109477, "compaction_time_cpu_micros": 40511, "output_level": 6, "num_output_files": 1, "total_output_size": 16635286, "num_input_records": 13056, "num_output_records": 12509, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287447361, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287449259, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.335299) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.449303) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.449308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.449310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.449311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:04:47.449313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.490 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:47.596 261346 INFO neutron.agent.dhcp.agent [None req-26936482-318a-44eb-8e5b-97362f40d947 - - - - - -] DHCP configuration for ports {'d6b610c6-cfc0-4922-be3e-a3490957a77c'} is completed
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:04:47 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1032672972' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v262: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.772 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.992 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.994 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11558MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.994 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:04:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:47.995 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.251 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.252 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.279 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:04:48 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1032672972' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.651 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:04:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2551675878' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.731 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.737 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.751 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.755 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:04:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:48.756 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:04:49 np0005538515.localdomain ceph-mon[301134]: pgmap v262: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Nov 28 10:04:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2551675878' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:04:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:49.581 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Nov 28 10:04:50 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 e151: 6 total, 6 up, 6 in
Nov 28 10:04:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:50.756 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:04:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:50.848 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:04:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:50.848 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:04:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:50.849 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:04:51 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:04:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:51 np0005538515.localdomain podman[314103]: 2025-11-28 10:04:51.029697559 +0000 UTC m=+0.058560376 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:04:51 np0005538515.localdomain ceph-mon[301134]: pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Nov 28 10:04:51 np0005538515.localdomain ceph-mon[301134]: osdmap e151: 6 total, 6 up, 6 in
Nov 28 10:04:51 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:51.141 2 INFO neutron.agent.securitygroups_rpc [None req-54cf6d1c-df90-4434-9ef1-afe91707ca30 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:51.437 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 4.1 KiB/s wr, 53 op/s
Nov 28 10:04:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:51 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:51.915 2 INFO neutron.agent.securitygroups_rpc [None req-263d211e-775e-47b3-9274-70437dee437e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:04:52.205 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:04:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:52.493 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:52 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:04:52 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:52 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:52 np0005538515.localdomain podman[314141]: 2025-11-28 10:04:52.690790558 +0000 UTC m=+0.056203724 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:52.728 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:53 np0005538515.localdomain ceph-mon[301134]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 4.1 KiB/s wr, 53 op/s
Nov 28 10:04:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:53.310 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:52Z, description=, device_id=0f485132-8e02-47a3-b2f5-564423c3ef9b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b1a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b1b50>], id=4e2523a4-d733-4b06-b5b8-e4949ed3c622, ip_allocation=immediate, mac_address=fa:16:3e:9b:d7:82, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1811, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:52Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:53 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:04:53 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:53 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:53 np0005538515.localdomain podman[314179]: 2025-11-28 10:04:53.502168055 +0000 UTC m=+0.052425078 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:04:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:04:53 np0005538515.localdomain podman[314194]: 2025-11-28 10:04:53.605580064 +0000 UTC m=+0.076704051 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, 
vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 28 10:04:53 np0005538515.localdomain podman[314194]: 2025-11-28 10:04:53.616582712 +0000 UTC m=+0.087706719 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Nov 28 10:04:53 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:04:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:53.653 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v266: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 3.6 KiB/s wr, 47 op/s
Nov 28 10:04:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:53.797 261346 INFO neutron.agent.dhcp.agent [None req-3441817a-fc5b-4e4a-8786-ee6f4841c459 - - - - - -] DHCP configuration for ports {'4e2523a4-d733-4b06-b5b8-e4949ed3c622'} is completed
Nov 28 10:04:54 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:54.286 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:53Z, description=, device_id=0cdeddab-1ad3-4978-9351-14136638bcd9, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce56b760>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce56baf0>], id=024a63c9-8d6e-42ba-ba05-3bcff8cb8032, ip_allocation=immediate, mac_address=fa:16:3e:f7:75:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1817, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:53Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:54 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:04:54 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:54 np0005538515.localdomain podman[314237]: 2025-11-28 10:04:54.514384317 +0000 UTC m=+0.044642518 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:04:54 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:54 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:54.767 261346 INFO neutron.agent.dhcp.agent [None req-37f8069a-5896-439b-9707-5a9e9576d5a7 - - - - - -] DHCP configuration for ports {'024a63c9-8d6e-42ba-ba05-3bcff8cb8032'} is completed
Nov 28 10:04:55 np0005538515.localdomain ceph-mon[301134]: pgmap v266: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 3.6 KiB/s wr, 47 op/s
Nov 28 10:04:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.1 KiB/s wr, 39 op/s
Nov 28 10:04:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:55.811 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:56.094 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:55Z, description=, device_id=ffb38a66-64a8-419b-8d9e-757a81603106, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b1910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4b1a90>], id=d7fd03cb-eae4-46ed-aec3-7c90878a5f67, ip_allocation=immediate, mac_address=fa:16:3e:67:e1:81, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1826, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:04:55Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:04:56 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:04:56 np0005538515.localdomain podman[314273]: 2025-11-28 10:04:56.377196519 +0000 UTC m=+0.057806432 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:04:56 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:04:56 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:04:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:04:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:56.957 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:04:57.051 261346 INFO neutron.agent.dhcp.agent [None req-b756c8ea-e8e5-484d-9a70-ff54e365566f - - - - - -] DHCP configuration for ports {'d7fd03cb-eae4-46ed-aec3-7c90878a5f67'} is completed
Nov 28 10:04:57 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:57.062 2 INFO neutron.agent.securitygroups_rpc [None req-1b122d45-69b6-44ae-9f44-6255649c2a99 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:57 np0005538515.localdomain ceph-mon[301134]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.1 KiB/s wr, 39 op/s
Nov 28 10:04:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:57.496 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:04:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:04:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:04:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:04:58.656 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:04:58 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:04:58.759 2 INFO neutron.agent.securitygroups_rpc [None req-d1d37070-b21e-47b1-9333-9a0acdf29e79 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:04:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:04:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:04:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:04:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159978 "" "Go-http-client/1.1"
Nov 28 10:04:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:04:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20153 "" "Go-http-client/1.1"
Nov 28 10:04:59 np0005538515.localdomain ceph-mon[301134]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:04:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:00 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:00.096 2 INFO neutron.agent.securitygroups_rpc [None req-f8f0dbe9-4862-4719-841c-e92cc8d478e0 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:00 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:05:00 np0005538515.localdomain podman[314310]: 2025-11-28 10:05:00.715330076 +0000 UTC m=+0.067295034 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:05:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:00 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:01 np0005538515.localdomain ceph-mon[301134]: pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:01.223 2 INFO neutron.agent.securitygroups_rpc [None req-8e254d5a-cb3d-4d3c-8c34-ca57b16025b1 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:01.970 2 INFO neutron.agent.securitygroups_rpc [None req-70ec53ac-b8b3-4633-a977-c5aaaab920ff 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:02.357 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:02.498 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:02.751 2 INFO neutron.agent.securitygroups_rpc [None req-472369ad-637e-463f-8142-3dff7a706106 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:02 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:02.951 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:02Z, description=, device_id=77059af7-1071-4399-bb20-a4a44fadd0d4, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce50e760>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce59d9a0>], id=50da1b3b-e236-4c33-b4c3-cea5ee9d8fff, ip_allocation=immediate, mac_address=fa:16:3e:48:f0:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1847, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:05:02Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:05:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:02.973 2 INFO neutron.agent.securitygroups_rpc [None req-a8642c38-ef8d-4c54-aca2-47d1d691b2fe 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:03 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:05:03 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:03 np0005538515.localdomain podman[314350]: 2025-11-28 10:05:03.174128644 +0000 UTC m=+0.067470108 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:03 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:03 np0005538515.localdomain ceph-mon[301134]: pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:03.429 261346 INFO neutron.agent.dhcp.agent [None req-259efdca-8dd3-4612-a024-f75b0f6f029b - - - - - -] DHCP configuration for ports {'50da1b3b-e236-4c33-b4c3-cea5ee9d8fff'} is completed
Nov 28 10:05:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:03.659 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:03.772 2 INFO neutron.agent.securitygroups_rpc [None req-551a0a6f-26c0-4f59-8a71-37fd214c141c 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']
Nov 28 10:05:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:05:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:05:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:05:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:05:03 np0005538515.localdomain podman[314372]: 2025-11-28 10:05:03.977431293 +0000 UTC m=+0.079167927 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:04 np0005538515.localdomain podman[314372]: 2025-11-28 10:05:04.013442197 +0000 UTC m=+0.115178831 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:05:04 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:05:04 np0005538515.localdomain podman[314373]: 2025-11-28 10:05:04.032714518 +0000 UTC m=+0.130926064 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 10:05:04 np0005538515.localdomain podman[314373]: 2025-11-28 10:05:04.06149526 +0000 UTC m=+0.159706786 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:05:04 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:05:04 np0005538515.localdomain podman[314377]: 2025-11-28 10:05:04.1541461 +0000 UTC m=+0.247377593 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:05:04 np0005538515.localdomain podman[314377]: 2025-11-28 10:05:04.16395598 +0000 UTC m=+0.257187463 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:05:04 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:05:04 np0005538515.localdomain systemd[1]: tmp-crun.g63zQj.mount: Deactivated successfully.
Nov 28 10:05:04 np0005538515.localdomain podman[314371]: 2025-11-28 10:05:04.261254663 +0000 UTC m=+0.365522974 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm)
Nov 28 10:05:04 np0005538515.localdomain podman[314371]: 2025-11-28 10:05:04.299851686 +0000 UTC m=+0.404119947 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:05:04 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:05:05 np0005538515.localdomain ceph-mon[301134]: pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:05:05
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_metadata', 'vms', 'backups', 'images', '.mgr', 'manila_data', 'volumes']
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:05:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:05.717 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v272: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.443522589800856e-05 quantized to 32 (current 32)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.001953125 quantized to 16 (current 16)
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:05:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:05:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:07 np0005538515.localdomain ceph-mon[301134]: pgmap v272: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:07 np0005538515.localdomain dnsmasq[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/addn_hosts - 0 addresses
Nov 28 10:05:07 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/host
Nov 28 10:05:07 np0005538515.localdomain dnsmasq-dhcp[313863]: read /var/lib/neutron/dhcp/857a46da-ae2c-48a0-8bc8-b100174874d8/opts
Nov 28 10:05:07 np0005538515.localdomain podman[314471]: 2025-11-28 10:05:07.422202449 +0000 UTC m=+0.063528918 container kill 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:05:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:05:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:07.501 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538515.localdomain systemd[1]: tmp-crun.w7YAX4.mount: Deactivated successfully.
Nov 28 10:05:07 np0005538515.localdomain podman[314484]: 2025-11-28 10:05:07.533275514 +0000 UTC m=+0.086928705 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:05:07 np0005538515.localdomain podman[314484]: 2025-11-28 10:05:07.5478593 +0000 UTC m=+0.101512491 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:05:07 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:05:07 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:07Z|00105|binding|INFO|Releasing lport 0dd1eafc-23dc-4156-9d9f-2142e93fc855 from this chassis (sb_readonly=0)
Nov 28 10:05:07 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:07Z|00106|binding|INFO|Setting lport 0dd1eafc-23dc-4156-9d9f-2142e93fc855 down in Southbound
Nov 28 10:05:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:07.639 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538515.localdomain kernel: device tap0dd1eafc-23 left promiscuous mode
Nov 28 10:05:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:07.648 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:07.648 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-857a46da-ae2c-48a0-8bc8-b100174874d8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-857a46da-ae2c-48a0-8bc8-b100174874d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6731419-dcc4-4fe4-a3f7-f974e0596a7c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=0dd1eafc-23dc-4156-9d9f-2142e93fc855) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:07 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:07.651 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 0dd1eafc-23dc-4156-9d9f-2142e93fc855 in datapath 857a46da-ae2c-48a0-8bc8-b100174874d8 unbound from our chassis
Nov 28 10:05:07 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:07.654 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 857a46da-ae2c-48a0-8bc8-b100174874d8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:07 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:07.655 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[74bf3411-3699-461f-85ba-a80bd4dbb800]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:07.663 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:07 np0005538515.localdomain dnsmasq[313863]: exiting on receipt of SIGTERM
Nov 28 10:05:07 np0005538515.localdomain podman[314532]: 2025-11-28 10:05:07.889171961 +0000 UTC m=+0.057007829 container kill 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:07 np0005538515.localdomain systemd[1]: libpod-7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618.scope: Deactivated successfully.
Nov 28 10:05:07 np0005538515.localdomain podman[314545]: 2025-11-28 10:05:07.962458147 +0000 UTC m=+0.057193353 container died 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:05:08 np0005538515.localdomain podman[314545]: 2025-11-28 10:05:08.042715917 +0000 UTC m=+0.137451043 container cleanup 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:05:08 np0005538515.localdomain systemd[1]: libpod-conmon-7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618.scope: Deactivated successfully.
Nov 28 10:05:08 np0005538515.localdomain podman[314546]: 2025-11-28 10:05:08.067729354 +0000 UTC m=+0.152323140 container remove 7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-857a46da-ae2c-48a0-8bc8-b100174874d8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:05:08 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:08.095 261346 INFO neutron.agent.dhcp.agent [None req-e6c3cbae-7101-4302-a706-0f6eaeffc841 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:08 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:08.096 261346 INFO neutron.agent.dhcp.agent [None req-e6c3cbae-7101-4302-a706-0f6eaeffc841 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:08.136 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:08.240 2 INFO neutron.agent.securitygroups_rpc [None req-687de3fc-aa2f-4499-970c-3ba0a56c0388 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-33481160fabcc385d997dc82bdb19e1c1a0e4fb16542a4ec52812ba2f61cc168-merged.mount: Deactivated successfully.
Nov 28 10:05:08 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d9055d2c408a209194a629fb271361b17d5a881d5c8d65e14bd567976a0b618-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:08 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d857a46da\x2dae2c\x2d48a0\x2d8bc8\x2db100174874d8.mount: Deactivated successfully.
Nov 28 10:05:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:08.686 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:09 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:05:09 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:09 np0005538515.localdomain systemd[1]: tmp-crun.4nqT4t.mount: Deactivated successfully.
Nov 28 10:05:09 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:09 np0005538515.localdomain podman[314602]: 2025-11-28 10:05:09.017223623 +0000 UTC m=+0.118852723 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:05:09 np0005538515.localdomain dnsmasq[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/addn_hosts - 0 addresses
Nov 28 10:05:09 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/host
Nov 28 10:05:09 np0005538515.localdomain dnsmasq-dhcp[313513]: read /var/lib/neutron/dhcp/d81fff26-c58f-4d58-a4c3-379fa25c0b56/opts
Nov 28 10:05:09 np0005538515.localdomain podman[314614]: 2025-11-28 10:05:09.066485914 +0000 UTC m=+0.102637947 container kill db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:05:09 np0005538515.localdomain ceph-mon[301134]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:09.304 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:09 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:09Z|00107|binding|INFO|Releasing lport 48dc601b-9dc3-45c9-9b98-4c07536959fd from this chassis (sb_readonly=0)
Nov 28 10:05:09 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:09Z|00108|binding|INFO|Setting lport 48dc601b-9dc3-45c9-9b98-4c07536959fd down in Southbound
Nov 28 10:05:09 np0005538515.localdomain kernel: device tap48dc601b-9d left promiscuous mode
Nov 28 10:05:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:09.324 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-d81fff26-c58f-4d58-a4c3-379fa25c0b56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d81fff26-c58f-4d58-a4c3-379fa25c0b56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c8780a7-df1b-4611-af44-f100aaf1ce7e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=48dc601b-9dc3-45c9-9b98-4c07536959fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:09.326 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 48dc601b-9dc3-45c9-9b98-4c07536959fd in datapath d81fff26-c58f-4d58-a4c3-379fa25c0b56 unbound from our chassis
Nov 28 10:05:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:09.329 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d81fff26-c58f-4d58-a4c3-379fa25c0b56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:09.330 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:09.331 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c9930ccb-3087-4ef5-b59e-e78b2ac12929]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:10.362 2 INFO neutron.agent.securitygroups_rpc [None req-20bc108e-ba14-415c-a7d5-38bda2943a27 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:10 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:10.449 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:09Z, description=, device_id=2e7ac6be-2b29-43ff-882f-d19806f73241, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4cd670>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4cdd00>], id=0e9b8c70-8289-4af3-939d-c2588ce1fe4f, ip_allocation=immediate, mac_address=fa:16:3e:98:1d:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1877, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:05:10Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:05:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:10.460 2 INFO neutron.agent.securitygroups_rpc [None req-dee3de84-6e3e-4d2c-b4d4-a44f07424a62 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:10 np0005538515.localdomain dnsmasq[313513]: exiting on receipt of SIGTERM
Nov 28 10:05:10 np0005538515.localdomain podman[314662]: 2025-11-28 10:05:10.468265695 +0000 UTC m=+0.066487488 container kill db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:05:10 np0005538515.localdomain systemd[1]: libpod-db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24.scope: Deactivated successfully.
Nov 28 10:05:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:05:10 np0005538515.localdomain podman[314676]: 2025-11-28 10:05:10.555388376 +0000 UTC m=+0.064825568 container died db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:10 np0005538515.localdomain systemd[1]: tmp-crun.ooNQSQ.mount: Deactivated successfully.
Nov 28 10:05:10 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:10 np0005538515.localdomain podman[314676]: 2025-11-28 10:05:10.609635709 +0000 UTC m=+0.119072831 container remove db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d81fff26-c58f-4d58-a4c3-379fa25c0b56, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:10 np0005538515.localdomain systemd[1]: libpod-conmon-db4f1e73011f5afa73a2974ecc8d11b3610a5c42e770d1cf01bae7f0e9edad24.scope: Deactivated successfully.
Nov 28 10:05:10 np0005538515.localdomain podman[314682]: 2025-11-28 10:05:10.687084923 +0000 UTC m=+0.189598342 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:05:10 np0005538515.localdomain podman[314682]: 2025-11-28 10:05:10.702303789 +0000 UTC m=+0.204817188 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:05:10 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:05:10 np0005538515.localdomain podman[314726]: 2025-11-28 10:05:10.755512989 +0000 UTC m=+0.076801074 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:10 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:05:10 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:10 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:10 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:10.892 261346 INFO neutron.agent.dhcp.agent [None req-fc46b6ff-7bfc-4f5f-9579-0a2a8e3afd29 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:10 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:10.893 261346 INFO neutron.agent.dhcp.agent [None req-fc46b6ff-7bfc-4f5f-9579-0a2a8e3afd29 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:11 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:11.026 261346 INFO neutron.agent.dhcp.agent [None req-f065876f-5691-4382-ba6a-aa358a5474a4 - - - - - -] DHCP configuration for ports {'0e9b8c70-8289-4af3-939d-c2588ce1fe4f'} is completed
Nov 28 10:05:11 np0005538515.localdomain ceph-mon[301134]: pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:11.365 2 INFO neutron.agent.securitygroups_rpc [None req-dada4afd-3538-47ee-91fc-3d547e9c2d44 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:11 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-54a4e230fd4a13b47ad5b4e150539a4b42d7ec5ed223aa986059699bf0cd00cc-merged.mount: Deactivated successfully.
Nov 28 10:05:11 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2dd81fff26\x2dc58f\x2d4d58\x2da4c3\x2d379fa25c0b56.mount: Deactivated successfully.
Nov 28 10:05:11 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:11.569 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:10Z, description=, device_id=382e1b3d-fdff-4aa5-8e88-248b9691853e, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce7c1bb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce7c1a00>], id=8c329718-f2f6-443f-b03f-4395241be6de, ip_allocation=immediate, mac_address=fa:16:3e:d9:c8:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1878, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:05:11Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:05:11 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:11.574 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:11.652 2 INFO neutron.agent.securitygroups_rpc [None req-3c61a2a4-46f5-45d3-8f70-86c422c843ed 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:11.708 2 INFO neutron.agent.securitygroups_rpc [None req-bb3cde64-53ac-43b6-992c-b149fee8302c 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:11 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 6 addresses
Nov 28 10:05:11 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:11 np0005538515.localdomain systemd[1]: tmp-crun.pjQic2.mount: Deactivated successfully.
Nov 28 10:05:11 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:11 np0005538515.localdomain podman[314769]: 2025-11-28 10:05:11.766104903 +0000 UTC m=+0.061091714 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:05:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:11.794 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:11.974 2 INFO neutron.agent.securitygroups_rpc [None req-2c04ab47-4041-48cd-a577-3ad3edc1e57b a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:11 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:11.984 261346 INFO neutron.agent.dhcp.agent [None req-5976b58a-662c-4a63-8160-641d43c9a1cf - - - - - -] DHCP configuration for ports {'8c329718-f2f6-443f-b03f-4395241be6de'} is completed
Nov 28 10:05:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:12.047 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:12.504 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:13.292 2 INFO neutron.agent.securitygroups_rpc [None req-865841f9-78fc-4d59-bed2-9c90df3d7ecb a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:13 np0005538515.localdomain ceph-mon[301134]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2371469984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2371469984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:13.729 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2371469984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2371469984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:15 np0005538515.localdomain ceph-mon[301134]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:15 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:15.480 2 INFO neutron.agent.securitygroups_rpc [None req-95a3885e-d25b-4b35-9841-10ba9cd222bb a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:15 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:05:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:15 np0005538515.localdomain podman[314806]: 2025-11-28 10:05:15.882920795 +0000 UTC m=+0.063396333 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:16 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:05:16 np0005538515.localdomain podman[314846]: 2025-11-28 10:05:16.603206831 +0000 UTC m=+0.061198216 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:05:16 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:16 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:16 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:16.734 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:16 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:16.737 2 INFO neutron.agent.securitygroups_rpc [None req-8e314c79-26fe-4adc-939f-e587c4e62ad2 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:17 np0005538515.localdomain ceph-mon[301134]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:17.508 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:17.846 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:17Z, description=, device_id=304ed48c-45db-4273-b511-259f5369a67b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce613520>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce6133a0>], id=2b7a5158-ba0d-40a0-a879-ecb41201e794, ip_allocation=immediate, mac_address=fa:16:3e:58:33:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1906, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:05:17Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:05:18 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:05:18 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:18 np0005538515.localdomain podman[314883]: 2025-11-28 10:05:18.076545677 +0000 UTC m=+0.062102814 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:18 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:18 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:18.435 261346 INFO neutron.agent.dhcp.agent [None req-385723ac-76ee-47d3-8fce-105c83b23e4a - - - - - -] DHCP configuration for ports {'2b7a5158-ba0d-40a0-a879-ecb41201e794'} is completed
Nov 28 10:05:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:18.734 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:18.832 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:19 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:19.187 2 INFO neutron.agent.securitygroups_rpc [None req-8fd6935d-936a-4ee4-abda-f9675ff09d60 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:19 np0005538515.localdomain ceph-mon[301134]: pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:19 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:19.804 2 INFO neutron.agent.securitygroups_rpc [None req-5f6fc136-bb5c-4f44-93cc-8b8331d1b8c7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:20 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:20.450 2 INFO neutron.agent.securitygroups_rpc [None req-08674235-5e89-42a7-aefb-bb66d7c4db90 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:20 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:20.937 2 INFO neutron.agent.securitygroups_rpc [None req-ff8acaea-6e1e-4ede-81e1-a7305639837a 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:21 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:21.296 2 INFO neutron.agent.securitygroups_rpc [None req-f5c2db0c-7ccc-47c0-8119-58c735104780 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:21 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:21.323 2 INFO neutron.agent.securitygroups_rpc [None req-baf56349-7def-4de1-b8e1-b12f76677166 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:21 np0005538515.localdomain sudo[314905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:05:21 np0005538515.localdomain sudo[314905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:05:21 np0005538515.localdomain sudo[314905]: pam_unix(sudo:session): session closed for user root
Nov 28 10:05:21 np0005538515.localdomain ceph-mon[301134]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:05:21 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:21.426 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc7eeb-b258-4bda-bcd4-9d4ef92a7b1f 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:21 np0005538515.localdomain sudo[314923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:05:21 np0005538515.localdomain sudo[314923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:05:21 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:21.537 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc7eeb-b258-4bda-bcd4-9d4ef92a7b1f 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:22 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:05:22 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:22 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:22 np0005538515.localdomain podman[314973]: 2025-11-28 10:05:22.035147131 +0000 UTC m=+0.058906636 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:22 np0005538515.localdomain sudo[314923]: pam_unix(sudo:session): session closed for user root
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:05:22 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 97fca34c-0541-418d-89cd-2d2423e80b02 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:05:22 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 97fca34c-0541-418d-89cd-2d2423e80b02 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:05:22 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 97fca34c-0541-418d-89cd-2d2423e80b02 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:22.275 2 INFO neutron.agent.securitygroups_rpc [None req-a89ecc9f-78c7-44a1-98c4-c8700080c0c0 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:05:22 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:05:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:22.456 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:22Z, description=, device_id=db8491a4-73bc-4598-975c-a7b74ee65365, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce613400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce613640>], id=b37d9acc-d566-4328-b5c8-2033650f3d16, ip_allocation=immediate, mac_address=fa:16:3e:35:f9:75, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1922, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:05:22Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:05:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:22.509 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:22 np0005538515.localdomain sudo[315008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:05:22 np0005538515.localdomain sudo[315008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:05:22 np0005538515.localdomain sudo[315008]: pam_unix(sudo:session): session closed for user root
Nov 28 10:05:22 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:22.556 2 INFO neutron.agent.securitygroups_rpc [None req-d94caea8-60f5-4f9f-b771-ee8db26744a5 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:22 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 5 addresses
Nov 28 10:05:22 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:22 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:22 np0005538515.localdomain podman[315044]: 2025-11-28 10:05:22.69062034 +0000 UTC m=+0.058948687 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:05:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:22.932 261346 INFO neutron.agent.dhcp.agent [None req-6f9eb19a-9e92-4ddc-91ef-da613ff77b77 - - - - - -] DHCP configuration for ports {'b37d9acc-d566-4328-b5c8-2033650f3d16'} is completed
Nov 28 10:05:22 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:22.954 2 INFO neutron.agent.securitygroups_rpc [None req-e4fadf30-c514-41d2-a5af-51c101b5ad33 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:23 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:23.176 2 INFO neutron.agent.securitygroups_rpc [None req-e4e98f94-fcd2-4011-84de-9dbf12942472 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']
Nov 28 10:05:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:23.213 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:23 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:23.416 2 INFO neutron.agent.securitygroups_rpc [None req-ab356ba5-c3b5-409b-9ecc-7c688fb94166 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:23 np0005538515.localdomain ceph-mon[301134]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:23.763 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:23 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:23.806 2 INFO neutron.agent.securitygroups_rpc [None req-237dd594-9b00-4d69-9190-5464bb1ce820 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']
Nov 28 10:05:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:05:23 np0005538515.localdomain podman[315064]: 2025-11-28 10:05:23.998360631 +0000 UTC m=+0.095571420 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Nov 28 10:05:24 np0005538515.localdomain podman[315064]: 2025-11-28 10:05:24.011815793 +0000 UTC m=+0.109026552 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:05:24 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:05:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:24.441 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8:0:1:f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:24.443 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated
Nov 28 10:05:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:24.445 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:24.446 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[a4102194-aa3e-407c-adac-55234b6b0d34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:25 np0005538515.localdomain ceph-mon[301134]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:25 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:25.479 2 INFO neutron.agent.securitygroups_rpc [None req-1ee68db6-3f9b-4e96-8e85-23a114c9c60e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:25 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:05:25 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:25 np0005538515.localdomain podman[315102]: 2025-11-28 10:05:25.718189721 +0000 UTC m=+0.063504978 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:05:25 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:25 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:05:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:05:26 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:05:26 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:26 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:26 np0005538515.localdomain podman[315139]: 2025-11-28 10:05:26.417597767 +0000 UTC m=+0.055739460 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:05:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:26 np0005538515.localdomain ceph-mon[301134]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 783 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 596 B/s wr, 14 op/s
Nov 28 10:05:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:05:26 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:26 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:26 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:26.906 2 INFO neutron.agent.securitygroups_rpc [None req-b7b686b0-df55-421e-8492-2a2961409c1a 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:27.512 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:05:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:05:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Nov 28 10:05:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:27.907 261346 INFO neutron.agent.linux.ip_lib [None req-b14b2469-1a83-47d7-a3c4-c4f8422d291d - - - - - -] Device tap263dd990-bb cannot be used as it has no MAC address
Nov 28 10:05:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:27.930 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:27 np0005538515.localdomain kernel: device tap263dd990-bb entered promiscuous mode
Nov 28 10:05:27 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324327.9399] manager: (tap263dd990-bb): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Nov 28 10:05:27 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:27Z|00109|binding|INFO|Claiming lport 263dd990-bba8-43be-a704-af8089f8d063 for this chassis.
Nov 28 10:05:27 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:27Z|00110|binding|INFO|263dd990-bba8-43be-a704-af8089f8d063: Claiming unknown
Nov 28 10:05:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:27.939 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:27 np0005538515.localdomain systemd-udevd[315170]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:27.949 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-76551b5f-5d3c-486b-8256-6697e6d961af', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76551b5f-5d3c-486b-8256-6697e6d961af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79c7da76f5894b10864b69d1961b95ab', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e093bf79-07f4-4a53-b10e-ba79c6e89219, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=263dd990-bba8-43be-a704-af8089f8d063) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:27.951 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 263dd990-bba8-43be-a704-af8089f8d063 in datapath 76551b5f-5d3c-486b-8256-6697e6d961af bound to our chassis
Nov 28 10:05:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:27.952 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 76551b5f-5d3c-486b-8256-6697e6d961af or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:27.953 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[d54fba90-9f1b-4dac-a58b-db288154cd34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:27 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:27 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:27Z|00111|binding|INFO|Setting lport 263dd990-bba8-43be-a704-af8089f8d063 ovn-installed in OVS
Nov 28 10:05:27 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:27Z|00112|binding|INFO|Setting lport 263dd990-bba8-43be-a704-af8089f8d063 up in Southbound
Nov 28 10:05:27 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:27.983 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:27 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:27 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:27 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:28 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:28 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:28 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap263dd990-bb: No such device
Nov 28 10:05:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:28.024 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:28.050 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:28.766 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:28 np0005538515.localdomain podman[315242]: 
Nov 28 10:05:28 np0005538515.localdomain podman[315242]: 2025-11-28 10:05:28.882151091 +0000 UTC m=+0.084163831 container create b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:05:28 np0005538515.localdomain ceph-mon[301134]: pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Nov 28 10:05:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:05:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:05:28 np0005538515.localdomain systemd[1]: Started libpod-conmon-b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4.scope.
Nov 28 10:05:28 np0005538515.localdomain podman[315242]: 2025-11-28 10:05:28.841127943 +0000 UTC m=+0.043140683 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:28 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:28 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f596bc5e7c28d7a22b37b15d00f7baa6c61a1b94d6dc8c55a4dae31f2f97828d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:28 np0005538515.localdomain podman[315242]: 2025-11-28 10:05:28.959272864 +0000 UTC m=+0.161285564 container init b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:28 np0005538515.localdomain podman[315242]: 2025-11-28 10:05:28.968448325 +0000 UTC m=+0.170461025 container start b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:05:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:05:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158143 "" "Go-http-client/1.1"
Nov 28 10:05:28 np0005538515.localdomain dnsmasq[315260]: started, version 2.85 cachesize 150
Nov 28 10:05:28 np0005538515.localdomain dnsmasq[315260]: DNS service limited to local subnets
Nov 28 10:05:28 np0005538515.localdomain dnsmasq[315260]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:28 np0005538515.localdomain dnsmasq[315260]: warning: no upstream servers configured
Nov 28 10:05:28 np0005538515.localdomain dnsmasq-dhcp[315260]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:28 np0005538515.localdomain dnsmasq[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/addn_hosts - 0 addresses
Nov 28 10:05:28 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/host
Nov 28 10:05:28 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/opts
Nov 28 10:05:29 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:05:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19672 "" "Go-http-client/1.1"
Nov 28 10:05:29 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:29.214 261346 INFO neutron.agent.dhcp.agent [None req-4eb272e4-06b1-4cdc-acc8-b7176b6e35dc - - - - - -] DHCP configuration for ports {'0a750d38-4e6c-43e8-93df-caac4798b0d3'} is completed
Nov 28 10:05:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Nov 28 10:05:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:30 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:30.550 2 INFO neutron.agent.securitygroups_rpc [None req-3bea7d0a-d883-4475-b5cf-e7689fbb3c41 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:31 np0005538515.localdomain ceph-mon[301134]: pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 1.7 KiB/s wr, 42 op/s
Nov 28 10:05:31 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:31.134 261346 INFO neutron.agent.linux.ip_lib [None req-52264fc1-85c6-49e7-bf5f-d4bc4fa204b0 - - - - - -] Device tap6e1d8ae7-e5 cannot be used as it has no MAC address
Nov 28 10:05:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:31.158 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:31 np0005538515.localdomain kernel: device tap6e1d8ae7-e5 entered promiscuous mode
Nov 28 10:05:31 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324331.1663] manager: (tap6e1d8ae7-e5): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Nov 28 10:05:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:31.169 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:31 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:31Z|00113|binding|INFO|Claiming lport 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 for this chassis.
Nov 28 10:05:31 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:31Z|00114|binding|INFO|6e1d8ae7-e505-45e5-a475-e8b27c1cbd76: Claiming unknown
Nov 28 10:05:31 np0005538515.localdomain systemd-udevd[315271]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:31.178 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-a9e7bee2-adb4-4094-93fb-3cf35621d144', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a9e7bee2-adb4-4094-93fb-3cf35621d144', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79c7da76f5894b10864b69d1961b95ab', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b44c93-fed4-4a4f-bdce-cdd61f440a29, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=6e1d8ae7-e505-45e5-a475-e8b27c1cbd76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:31.180 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 in datapath a9e7bee2-adb4-4094-93fb-3cf35621d144 bound to our chassis
Nov 28 10:05:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:31.182 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a9e7bee2-adb4-4094-93fb-3cf35621d144 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:31.182 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[65397a8c-f46d-453e-b6ff-6eed8ff6cf8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:31.197 2 INFO neutron.agent.securitygroups_rpc [None req-de2b1d93-259b-466a-9f3c-b35e66480eb7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:31Z|00115|binding|INFO|Setting lport 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 ovn-installed in OVS
Nov 28 10:05:31 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:31Z|00116|binding|INFO|Setting lport 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 up in Southbound
Nov 28 10:05:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:31.200 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap6e1d8ae7-e5: No such device
Nov 28 10:05:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:31.242 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:31.273 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.0 KiB/s wr, 59 op/s
Nov 28 10:05:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:32 np0005538515.localdomain podman[315342]: 2025-11-28 10:05:32.105850721 +0000 UTC m=+0.090567566 container create d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:32 np0005538515.localdomain systemd[1]: Started libpod-conmon-d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4.scope.
Nov 28 10:05:32 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:32 np0005538515.localdomain podman[315342]: 2025-11-28 10:05:32.061343637 +0000 UTC m=+0.046060522 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:32 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b06168b56e6b283e5d79caea478a11e9296a7b7a714e7c8158e200f6a95c8f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:32 np0005538515.localdomain podman[315342]: 2025-11-28 10:05:32.17335136 +0000 UTC m=+0.158068205 container init d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:05:32 np0005538515.localdomain podman[315342]: 2025-11-28 10:05:32.187204105 +0000 UTC m=+0.171920960 container start d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:32 np0005538515.localdomain dnsmasq[315360]: started, version 2.85 cachesize 150
Nov 28 10:05:32 np0005538515.localdomain dnsmasq[315360]: DNS service limited to local subnets
Nov 28 10:05:32 np0005538515.localdomain dnsmasq[315360]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:32 np0005538515.localdomain dnsmasq[315360]: warning: no upstream servers configured
Nov 28 10:05:32 np0005538515.localdomain dnsmasq-dhcp[315360]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:32 np0005538515.localdomain dnsmasq[315360]: read /var/lib/neutron/dhcp/a9e7bee2-adb4-4094-93fb-3cf35621d144/addn_hosts - 0 addresses
Nov 28 10:05:32 np0005538515.localdomain dnsmasq-dhcp[315360]: read /var/lib/neutron/dhcp/a9e7bee2-adb4-4094-93fb-3cf35621d144/host
Nov 28 10:05:32 np0005538515.localdomain dnsmasq-dhcp[315360]: read /var/lib/neutron/dhcp/a9e7bee2-adb4-4094-93fb-3cf35621d144/opts
Nov 28 10:05:32 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:32.369 261346 INFO neutron.agent.dhcp.agent [None req-42100a45-6df4-4646-912d-569842e8287a - - - - - -] DHCP configuration for ports {'c0212f63-6473-42a7-8ffc-e5ce666be6b1'} is completed
Nov 28 10:05:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:32.550 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:33 np0005538515.localdomain ceph-mon[301134]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.0 KiB/s wr, 59 op/s
Nov 28 10:05:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:33.244 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v286: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Nov 28 10:05:33 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:33.775 2 INFO neutron.agent.securitygroups_rpc [None req-04197ece-4fb3-43df-90bb-b0e309825a8e cb58c56533984a129050414c9b160b63 de1aeac8abd545fcb83eb3ee06f16689 - - default default] Security group member updated ['805fa77a-da24-42d8-9154-db9402b01c3e']
Nov 28 10:05:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:33.799 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:33.838 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:33Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce41aa00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce41a970>], id=908c0c28-dfd3-45f3-9a9c-b54daa8aaee7, ip_allocation=immediate, mac_address=fa:16:3e:bd:c0:a8, name=tempest-RoutersAdminNegativeIpV6Test-1418326863, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=True, project_id=de1aeac8abd545fcb83eb3ee06f16689, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['805fa77a-da24-42d8-9154-db9402b01c3e'], standard_attr_id=1991, status=DOWN, tags=[], tenant_id=de1aeac8abd545fcb83eb3ee06f16689, updated_at=2025-11-28T10:05:33Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:05:34 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 4 addresses
Nov 28 10:05:34 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:34 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:34 np0005538515.localdomain podman[315379]: 2025-11-28 10:05:34.073472446 +0000 UTC m=+0.063494417 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:05:34 np0005538515.localdomain podman[315396]: 2025-11-28 10:05:34.19434403 +0000 UTC m=+0.086478121 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:05:34 np0005538515.localdomain podman[315395]: 2025-11-28 10:05:34.254636298 +0000 UTC m=+0.144958643 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 10:05:34 np0005538515.localdomain podman[315396]: 2025-11-28 10:05:34.273539657 +0000 UTC m=+0.165673708 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:05:34 np0005538515.localdomain podman[315429]: 2025-11-28 10:05:34.308700725 +0000 UTC m=+0.088339038 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:05:34 np0005538515.localdomain podman[315395]: 2025-11-28 10:05:34.309552201 +0000 UTC m=+0.199874526 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:05:34 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:34.343 261346 INFO neutron.agent.dhcp.agent [None req-fe96075c-60cc-4c18-a44e-fcb71d2e90bb - - - - - -] DHCP configuration for ports {'908c0c28-dfd3-45f3-9a9c-b54daa8aaee7'} is completed
Nov 28 10:05:34 np0005538515.localdomain podman[315429]: 2025-11-28 10:05:34.39368268 +0000 UTC m=+0.173320993 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:05:34 np0005538515.localdomain podman[315463]: 2025-11-28 10:05:34.416156048 +0000 UTC m=+0.072643897 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 10:05:34 np0005538515.localdomain podman[315463]: 2025-11-28 10:05:34.426858707 +0000 UTC m=+0.083346546 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 28 10:05:34 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:05:35 np0005538515.localdomain systemd[1]: tmp-crun.q1a27R.mount: Deactivated successfully.
Nov 28 10:05:35 np0005538515.localdomain ceph-mon[301134]: pgmap v286: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:05:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Nov 28 10:05:35 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:35.883 2 INFO neutron.agent.securitygroups_rpc [None req-9c55fb28-c01b-40d6-8fec-099f9b722777 cb58c56533984a129050414c9b160b63 de1aeac8abd545fcb83eb3ee06f16689 - - default default] Security group member updated ['805fa77a-da24-42d8-9154-db9402b01c3e']
Nov 28 10:05:36 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:05:36 np0005538515.localdomain podman[315502]: 2025-11-28 10:05:36.106233546 +0000 UTC m=+0.057763610 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:36 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:36 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e152 e152: 6 total, 6 up, 6 in
Nov 28 10:05:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:37 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:05:37 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:37 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:37 np0005538515.localdomain podman[315540]: 2025-11-28 10:05:37.056698767 +0000 UTC m=+0.067067496 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.098 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain ceph-mon[301134]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Nov 28 10:05:37 np0005538515.localdomain ceph-mon[301134]: osdmap e152: 6 total, 6 up, 6 in
Nov 28 10:05:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:37.144 261346 INFO neutron.agent.linux.ip_lib [None req-4ac1aed3-fa52-4606-9efc-ba3ad05f7bfc - - - - - -] Device tap7525b617-bc cannot be used as it has no MAC address
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.199 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain kernel: device tap7525b617-bc entered promiscuous mode
Nov 28 10:05:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:37Z|00117|binding|INFO|Claiming lport 7525b617-bc2f-464e-a494-d304810bfb3d for this chassis.
Nov 28 10:05:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:37Z|00118|binding|INFO|7525b617-bc2f-464e-a494-d304810bfb3d: Claiming unknown
Nov 28 10:05:37 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324337.2080] manager: (tap7525b617-bc): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.208 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain systemd-udevd[315573]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:37.218 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-a168791a-0c5a-4a6f-ab00-175cc6c1bb37', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a168791a-0c5a-4a6f-ab00-175cc6c1bb37', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79c7da76f5894b10864b69d1961b95ab', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abd1fad4-7cc2-47af-bcf3-845289d3423e, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=7525b617-bc2f-464e-a494-d304810bfb3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:37.219 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 7525b617-bc2f-464e-a494-d304810bfb3d in datapath a168791a-0c5a-4a6f-ab00-175cc6c1bb37 bound to our chassis
Nov 28 10:05:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:37.221 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a168791a-0c5a-4a6f-ab00-175cc6c1bb37 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:37.222 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[b5fc9f31-7d5e-4b46-8118-afbd48211107]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:37Z|00119|binding|INFO|Setting lport 7525b617-bc2f-464e-a494-d304810bfb3d ovn-installed in OVS
Nov 28 10:05:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:37Z|00120|binding|INFO|Setting lport 7525b617-bc2f-464e-a494-d304810bfb3d up in Southbound
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.246 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.248 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap7525b617-bc: No such device
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.277 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.299 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:37.552 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.5 KiB/s wr, 76 op/s
Nov 28 10:05:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:05:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:37.935 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:37Z, description=, device_id=773237be-aa27-4adb-a0c5-c56bdc56f175, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5fbc70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5fb1f0>], id=fdf90593-78c6-4222-ac7b-fcd71032b0ab, ip_allocation=immediate, mac_address=fa:16:3e:09:d3:12, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:05:25Z, description=, dns_domain=, id=76551b5f-5d3c-486b-8256-6697e6d961af, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1822665596, port_security_enabled=True, project_id=79c7da76f5894b10864b69d1961b95ab, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34453, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1947, status=ACTIVE, subnets=['3c53e558-3975-44a5-a65d-1cc78bc25b73'], tags=[], tenant_id=79c7da76f5894b10864b69d1961b95ab, updated_at=2025-11-28T10:05:26Z, vlan_transparent=None, network_id=76551b5f-5d3c-486b-8256-6697e6d961af, port_security_enabled=False, project_id=79c7da76f5894b10864b69d1961b95ab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2014, status=DOWN, tags=[], tenant_id=79c7da76f5894b10864b69d1961b95ab, updated_at=2025-11-28T10:05:37Z on network 76551b5f-5d3c-486b-8256-6697e6d961af
Nov 28 10:05:38 np0005538515.localdomain podman[315623]: 2025-11-28 10:05:38.020149035 +0000 UTC m=+0.131394318 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:05:38 np0005538515.localdomain podman[315623]: 2025-11-28 10:05:38.032487853 +0000 UTC m=+0.143733186 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:05:38 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:05:38 np0005538515.localdomain podman[315661]: 
Nov 28 10:05:38 np0005538515.localdomain podman[315661]: 2025-11-28 10:05:38.085200509 +0000 UTC m=+0.098202321 container create 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:05:38 np0005538515.localdomain systemd[1]: Started libpod-conmon-8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a.scope.
Nov 28 10:05:38 np0005538515.localdomain podman[315661]: 2025-11-28 10:05:38.032870795 +0000 UTC m=+0.045872587 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:38 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:38 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24636b78bd77589d490104c67446063eaf3f979c2e97bf5584acb9bfe1dedc30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:38 np0005538515.localdomain dnsmasq[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/addn_hosts - 1 addresses
Nov 28 10:05:38 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/host
Nov 28 10:05:38 np0005538515.localdomain podman[315696]: 2025-11-28 10:05:38.167417458 +0000 UTC m=+0.050750466 container kill b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:05:38 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/opts
Nov 28 10:05:38 np0005538515.localdomain podman[315661]: 2025-11-28 10:05:38.208990953 +0000 UTC m=+0.221992715 container init 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:38 np0005538515.localdomain podman[315661]: 2025-11-28 10:05:38.215915374 +0000 UTC m=+0.228917166 container start 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:38 np0005538515.localdomain dnsmasq[315714]: started, version 2.85 cachesize 150
Nov 28 10:05:38 np0005538515.localdomain dnsmasq[315714]: DNS service limited to local subnets
Nov 28 10:05:38 np0005538515.localdomain dnsmasq[315714]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:38 np0005538515.localdomain dnsmasq[315714]: warning: no upstream servers configured
Nov 28 10:05:38 np0005538515.localdomain dnsmasq-dhcp[315714]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Nov 28 10:05:38 np0005538515.localdomain dnsmasq[315714]: read /var/lib/neutron/dhcp/a168791a-0c5a-4a6f-ab00-175cc6c1bb37/addn_hosts - 0 addresses
Nov 28 10:05:38 np0005538515.localdomain dnsmasq-dhcp[315714]: read /var/lib/neutron/dhcp/a168791a-0c5a-4a6f-ab00-175cc6c1bb37/host
Nov 28 10:05:38 np0005538515.localdomain dnsmasq-dhcp[315714]: read /var/lib/neutron/dhcp/a168791a-0c5a-4a6f-ab00-175cc6c1bb37/opts
Nov 28 10:05:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:38.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:38.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:05:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:38.271 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:05:38 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:38.419 261346 INFO neutron.agent.dhcp.agent [None req-e470f9a7-ccdc-4519-b34e-49ce2eb028b4 - - - - - -] DHCP configuration for ports {'fdf90593-78c6-4222-ac7b-fcd71032b0ab', '84f970fc-aab5-4a93-b320-463d8d4ba76e'} is completed
Nov 28 10:05:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:38.803 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:39 np0005538515.localdomain systemd[1]: tmp-crun.9OK8bS.mount: Deactivated successfully.
Nov 28 10:05:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:39.099 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:37Z, description=, device_id=773237be-aa27-4adb-a0c5-c56bdc56f175, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4cd4f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4cd2e0>], id=fdf90593-78c6-4222-ac7b-fcd71032b0ab, ip_allocation=immediate, mac_address=fa:16:3e:09:d3:12, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:05:25Z, description=, dns_domain=, id=76551b5f-5d3c-486b-8256-6697e6d961af, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1822665596, port_security_enabled=True, project_id=79c7da76f5894b10864b69d1961b95ab, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34453, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1947, status=ACTIVE, subnets=['3c53e558-3975-44a5-a65d-1cc78bc25b73'], tags=[], tenant_id=79c7da76f5894b10864b69d1961b95ab, updated_at=2025-11-28T10:05:26Z, vlan_transparent=None, network_id=76551b5f-5d3c-486b-8256-6697e6d961af, port_security_enabled=False, project_id=79c7da76f5894b10864b69d1961b95ab, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2014, status=DOWN, tags=[], tenant_id=79c7da76f5894b10864b69d1961b95ab, updated_at=2025-11-28T10:05:37Z on network 76551b5f-5d3c-486b-8256-6697e6d961af
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.5 KiB/s wr, 76 op/s
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/483552978' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/483552978' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:39 np0005538515.localdomain podman[315739]: 2025-11-28 10:05:39.327268726 +0000 UTC m=+0.083425308 container kill b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:05:39 np0005538515.localdomain systemd[1]: tmp-crun.44JQCJ.mount: Deactivated successfully.
Nov 28 10:05:39 np0005538515.localdomain dnsmasq[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/addn_hosts - 1 addresses
Nov 28 10:05:39 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/host
Nov 28 10:05:39 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/opts
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3980002561' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3980002561' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:39.680 261346 INFO neutron.agent.dhcp.agent [None req-bad90eee-5e0c-4cbf-a1ed-730e9f04425a - - - - - -] DHCP configuration for ports {'fdf90593-78c6-4222-ac7b-fcd71032b0ab'} is completed
Nov 28 10:05:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.5 KiB/s wr, 76 op/s
Nov 28 10:05:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3980002561' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3980002561' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:40 np0005538515.localdomain podman[315778]: 2025-11-28 10:05:40.706543607 +0000 UTC m=+0.056958856 container kill b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:40 np0005538515.localdomain dnsmasq[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/addn_hosts - 0 addresses
Nov 28 10:05:40 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/host
Nov 28 10:05:40 np0005538515.localdomain dnsmasq-dhcp[315260]: read /var/lib/neutron/dhcp/76551b5f-5d3c-486b-8256-6697e6d961af/opts
Nov 28 10:05:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:05:40 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:40Z|00121|binding|INFO|Releasing lport 263dd990-bba8-43be-a704-af8089f8d063 from this chassis (sb_readonly=0)
Nov 28 10:05:40 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:40Z|00122|binding|INFO|Setting lport 263dd990-bba8-43be-a704-af8089f8d063 down in Southbound
Nov 28 10:05:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:40.928 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:40.937 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-76551b5f-5d3c-486b-8256-6697e6d961af', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76551b5f-5d3c-486b-8256-6697e6d961af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79c7da76f5894b10864b69d1961b95ab', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e093bf79-07f4-4a53-b10e-ba79c6e89219, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=263dd990-bba8-43be-a704-af8089f8d063) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:40 np0005538515.localdomain kernel: device tap263dd990-bb left promiscuous mode
Nov 28 10:05:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:40.939 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 263dd990-bba8-43be-a704-af8089f8d063 in datapath 76551b5f-5d3c-486b-8256-6697e6d961af unbound from our chassis
Nov 28 10:05:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:40.942 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 76551b5f-5d3c-486b-8256-6697e6d961af or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:40.941 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:40 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:40.943 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cfdf71-2016-45f4-9a8e-dd91f2bb3ef8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:40.955 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:40 np0005538515.localdomain systemd[1]: tmp-crun.fgxX5K.mount: Deactivated successfully.
Nov 28 10:05:41 np0005538515.localdomain podman[315799]: 2025-11-28 10:05:40.999660451 +0000 UTC m=+0.103219394 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Nov 28 10:05:41 np0005538515.localdomain podman[315799]: 2025-11-28 10:05:41.041532455 +0000 UTC m=+0.145091418 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:41 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:05:41 np0005538515.localdomain ceph-mon[301134]: pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.5 KiB/s wr, 76 op/s
Nov 28 10:05:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:41.270 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:41.272 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:41 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:05:41 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:05:41 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:05:41 np0005538515.localdomain podman[315835]: 2025-11-28 10:05:41.307119125 +0000 UTC m=+0.054952166 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:41 np0005538515.localdomain dnsmasq[315714]: exiting on receipt of SIGTERM
Nov 28 10:05:41 np0005538515.localdomain podman[315871]: 2025-11-28 10:05:41.674147263 +0000 UTC m=+0.059407141 container kill 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:41 np0005538515.localdomain systemd[1]: libpod-8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a.scope: Deactivated successfully.
Nov 28 10:05:41 np0005538515.localdomain podman[315883]: 2025-11-28 10:05:41.743040264 +0000 UTC m=+0.058445581 container died 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:41 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-24636b78bd77589d490104c67446063eaf3f979c2e97bf5584acb9bfe1dedc30-merged.mount: Deactivated successfully.
Nov 28 10:05:41 np0005538515.localdomain podman[315883]: 2025-11-28 10:05:41.773651633 +0000 UTC m=+0.089056910 container cleanup 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:05:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v291: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:41 np0005538515.localdomain systemd[1]: libpod-conmon-8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a.scope: Deactivated successfully.
Nov 28 10:05:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:41 np0005538515.localdomain podman[315890]: 2025-11-28 10:05:41.819059615 +0000 UTC m=+0.122707562 container remove 8dd6126469f09ec4480a793fbe3fb8029b57ea0ba328dc8ed4db77ac55e2d94a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a168791a-0c5a-4a6f-ab00-175cc6c1bb37, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:05:41 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:41Z|00123|binding|INFO|Releasing lport 7525b617-bc2f-464e-a494-d304810bfb3d from this chassis (sb_readonly=0)
Nov 28 10:05:41 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:41Z|00124|binding|INFO|Setting lport 7525b617-bc2f-464e-a494-d304810bfb3d down in Southbound
Nov 28 10:05:41 np0005538515.localdomain kernel: device tap7525b617-bc left promiscuous mode
Nov 28 10:05:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:41.827 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:41.838 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-a168791a-0c5a-4a6f-ab00-175cc6c1bb37', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a168791a-0c5a-4a6f-ab00-175cc6c1bb37', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79c7da76f5894b10864b69d1961b95ab', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abd1fad4-7cc2-47af-bcf3-845289d3423e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=7525b617-bc2f-464e-a494-d304810bfb3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:41.840 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 7525b617-bc2f-464e-a494-d304810bfb3d in datapath a168791a-0c5a-4a6f-ab00-175cc6c1bb37 unbound from our chassis
Nov 28 10:05:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:41.841 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a168791a-0c5a-4a6f-ab00-175cc6c1bb37 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:41.842 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[95c89f6b-8365-4ab5-8024-d667af7571ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:41.850 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:42 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:42.135 261346 INFO neutron.agent.dhcp.agent [None req-aa8571c8-c2fb-4cb0-a85a-a2ca3cd531f5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:42.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:42 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:42.503 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:42.555 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:42 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2da168791a\x2d0c5a\x2d4a6f\x2dab00\x2d175cc6c1bb37.mount: Deactivated successfully.
Nov 28 10:05:43 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:43.079 261346 INFO neutron.agent.linux.ip_lib [None req-852ff37c-0e8a-49ad-babb-d02a979c0f09 - - - - - -] Device tapda869e93-67 cannot be used as it has no MAC address
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.106 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:43 np0005538515.localdomain kernel: device tapda869e93-67 entered promiscuous mode
Nov 28 10:05:43 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324343.1162] manager: (tapda869e93-67): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.116 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:43Z|00125|binding|INFO|Claiming lport da869e93-67dd-47bf-bfed-34a8468836ea for this chassis.
Nov 28 10:05:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:43Z|00126|binding|INFO|da869e93-67dd-47bf-bfed-34a8468836ea: Claiming unknown
Nov 28 10:05:43 np0005538515.localdomain systemd-udevd[315924]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:43.127 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-19e7252b-db63-489c-9a93-8026377ebe8c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19e7252b-db63-489c-9a93-8026377ebe8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79185418333d4a93b24c87e39a4a1847', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c54a0e-746c-4aa6-aa68-60e31c1cb041, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=da869e93-67dd-47bf-bfed-34a8468836ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:43.129 158530 INFO neutron.agent.ovn.metadata.agent [-] Port da869e93-67dd-47bf-bfed-34a8468836ea in datapath 19e7252b-db63-489c-9a93-8026377ebe8c bound to our chassis
Nov 28 10:05:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:43.130 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 19e7252b-db63-489c-9a93-8026377ebe8c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:43.131 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[18b95250-9a69-4a55-a8e6-342f71ad1d83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:43Z|00127|binding|INFO|Setting lport da869e93-67dd-47bf-bfed-34a8468836ea ovn-installed in OVS
Nov 28 10:05:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:43Z|00128|binding|INFO|Setting lport da869e93-67dd-47bf-bfed-34a8468836ea up in Southbound
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.158 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.162 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapda869e93-67: No such device
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.200 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:43 np0005538515.localdomain ceph-mon[301134]: pgmap v291: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.229 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:05:43 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:43.439 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:43 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:43.605 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:43 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:43.761 2 INFO neutron.agent.securitygroups_rpc [None req-1640c1ae-d5d6-4903-a020-69a5c84dc198 e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v292: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:43.806 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:44 np0005538515.localdomain podman[315995]: 
Nov 28 10:05:44 np0005538515.localdomain podman[315995]: 2025-11-28 10:05:44.037134414 +0000 UTC m=+0.086483651 container create fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:05:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:44.090 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:44 np0005538515.localdomain podman[315995]: 2025-11-28 10:05:44.003688959 +0000 UTC m=+0.053038306 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:44 np0005538515.localdomain systemd[1]: Started libpod-conmon-fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18.scope.
Nov 28 10:05:44 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:44 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997025a36efb1e50d9ffe44c2974a8638074eb37fef508bc2be0cc49df956a21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:44 np0005538515.localdomain podman[315995]: 2025-11-28 10:05:44.15741102 +0000 UTC m=+0.206760287 container init fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:44 np0005538515.localdomain podman[315995]: 2025-11-28 10:05:44.174100582 +0000 UTC m=+0.223449859 container start fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:05:44 np0005538515.localdomain dnsmasq[316013]: started, version 2.85 cachesize 150
Nov 28 10:05:44 np0005538515.localdomain dnsmasq[316013]: DNS service limited to local subnets
Nov 28 10:05:44 np0005538515.localdomain dnsmasq[316013]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:44 np0005538515.localdomain dnsmasq[316013]: warning: no upstream servers configured
Nov 28 10:05:44 np0005538515.localdomain dnsmasq-dhcp[316013]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:05:44 np0005538515.localdomain dnsmasq[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/addn_hosts - 0 addresses
Nov 28 10:05:44 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/host
Nov 28 10:05:44 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/opts
Nov 28 10:05:44 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:44.237 261346 INFO neutron.agent.dhcp.agent [None req-852ff37c-0e8a-49ad-babb-d02a979c0f09 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:43Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4267f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce426490>], id=2266342c-b2ca-4cdd-98a9-2015ce48326f, ip_allocation=immediate, mac_address=fa:16:3e:42:cc:c5, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1360024416, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:05:39Z, description=, dns_domain=, id=19e7252b-db63-489c-9a93-8026377ebe8c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1227717976, port_security_enabled=True, project_id=79185418333d4a93b24c87e39a4a1847, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25537, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2036, status=ACTIVE, subnets=['e7013afe-a363-4f80-93ea-b1f16b2d44b3'], tags=[], tenant_id=79185418333d4a93b24c87e39a4a1847, updated_at=2025-11-28T10:05:41Z, vlan_transparent=None, network_id=19e7252b-db63-489c-9a93-8026377ebe8c, port_security_enabled=True, project_id=79185418333d4a93b24c87e39a4a1847, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f7d47ffa-9780-427b-aaf2-f0de3a638f8a'], standard_attr_id=2065, status=DOWN, tags=[], tenant_id=79185418333d4a93b24c87e39a4a1847, updated_at=2025-11-28T10:05:43Z on network 19e7252b-db63-489c-9a93-8026377ebe8c
Nov 28 10:05:44 np0005538515.localdomain dnsmasq[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/addn_hosts - 1 addresses
Nov 28 10:05:44 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/host
Nov 28 10:05:44 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/opts
Nov 28 10:05:44 np0005538515.localdomain podman[316033]: 2025-11-28 10:05:44.387201363 +0000 UTC m=+0.051342634 container kill fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:44 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:44.388 261346 INFO neutron.agent.dhcp.agent [None req-94f8b550-a2fb-4a61-9fe3-b005871c19a5 - - - - - -] DHCP configuration for ports {'98f82d8b-5603-4803-8ab5-3c3646965543'} is completed
Nov 28 10:05:44 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:44.667 261346 INFO neutron.agent.dhcp.agent [None req-d756f844-7324-4ba2-b29e-9d22f50691db - - - - - -] DHCP configuration for ports {'2266342c-b2ca-4cdd-98a9-2015ce48326f'} is completed
Nov 28 10:05:45 np0005538515.localdomain systemd[1]: tmp-crun.uVGOZJ.mount: Deactivated successfully.
Nov 28 10:05:45 np0005538515.localdomain ceph-mon[301134]: pgmap v292: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:45 np0005538515.localdomain dnsmasq[315360]: exiting on receipt of SIGTERM
Nov 28 10:05:45 np0005538515.localdomain podman[316071]: 2025-11-28 10:05:45.231658415 +0000 UTC m=+0.064495128 container kill d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:45 np0005538515.localdomain systemd[1]: libpod-d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4.scope: Deactivated successfully.
Nov 28 10:05:45 np0005538515.localdomain podman[316084]: 2025-11-28 10:05:45.296976756 +0000 UTC m=+0.054678876 container died d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:45 np0005538515.localdomain podman[316084]: 2025-11-28 10:05:45.379209096 +0000 UTC m=+0.136911176 container cleanup d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:05:45 np0005538515.localdomain systemd[1]: libpod-conmon-d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4.scope: Deactivated successfully.
Nov 28 10:05:45 np0005538515.localdomain podman[316086]: 2025-11-28 10:05:45.398502388 +0000 UTC m=+0.144085967 container remove d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a9e7bee2-adb4-4094-93fb-3cf35621d144, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:05:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:45.413 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:45 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:45Z|00129|binding|INFO|Releasing lport 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 from this chassis (sb_readonly=0)
Nov 28 10:05:45 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:45Z|00130|binding|INFO|Setting lport 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 down in Southbound
Nov 28 10:05:45 np0005538515.localdomain kernel: device tap6e1d8ae7-e5 left promiscuous mode
Nov 28 10:05:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:45.426 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-a9e7bee2-adb4-4094-93fb-3cf35621d144', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a9e7bee2-adb4-4094-93fb-3cf35621d144', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79c7da76f5894b10864b69d1961b95ab', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b44c93-fed4-4a4f-bdce-cdd61f440a29, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=6e1d8ae7-e505-45e5-a475-e8b27c1cbd76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:45.428 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 6e1d8ae7-e505-45e5-a475-e8b27c1cbd76 in datapath a9e7bee2-adb4-4094-93fb-3cf35621d144 unbound from our chassis
Nov 28 10:05:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:45.429 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a9e7bee2-adb4-4094-93fb-3cf35621d144 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:45.431 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce95c36-0bdf-4ae1-87cb-40e83bd4a44f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:45.434 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:45 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:45.536 2 INFO neutron.agent.securitygroups_rpc [None req-6e211784-e61c-41e0-bb4d-96a8d1625a59 e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:45.618 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:44Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce412040>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4122b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f19ce412880>, <neutron.agent.linux.dhcp.DictModel object at 0x7f19ce412700>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce412250>], id=d906e92f-2a14-419f-b7c7-dfd2b9d95ebe, ip_allocation=immediate, mac_address=fa:16:3e:e4:43:f0, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1072293803, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:05:39Z, description=, dns_domain=, id=19e7252b-db63-489c-9a93-8026377ebe8c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1227717976, port_security_enabled=True, project_id=79185418333d4a93b24c87e39a4a1847, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25537, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2036, status=ACTIVE, subnets=['e7013afe-a363-4f80-93ea-b1f16b2d44b3'], tags=[], tenant_id=79185418333d4a93b24c87e39a4a1847, updated_at=2025-11-28T10:05:41Z, vlan_transparent=None, network_id=19e7252b-db63-489c-9a93-8026377ebe8c, port_security_enabled=True, project_id=79185418333d4a93b24c87e39a4a1847, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f7d47ffa-9780-427b-aaf2-f0de3a638f8a'], standard_attr_id=2079, 
status=DOWN, tags=[], tenant_id=79185418333d4a93b24c87e39a4a1847, updated_at=2025-11-28T10:05:45Z on network 19e7252b-db63-489c-9a93-8026377ebe8c
Nov 28 10:05:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:45.637 261346 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Nov 28 10:05:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:45.637 261346 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Nov 28 10:05:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:45.638 261346 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Nov 28 10:05:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:45.698 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:45.700 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:45.702 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:05:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v293: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:45.806 261346 INFO neutron.agent.dhcp.agent [None req-dc669399-b00a-4161-8a03-77bb1e8c43e0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:45 np0005538515.localdomain dnsmasq[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/addn_hosts - 2 addresses
Nov 28 10:05:45 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/host
Nov 28 10:05:45 np0005538515.localdomain podman[316130]: 2025-11-28 10:05:45.812527677 +0000 UTC m=+0.059901557 container kill fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:05:45 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/opts
Nov 28 10:05:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:45.973 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:46.026 261346 INFO neutron.agent.dhcp.agent [None req-fb96c901-fad4-4460-ae3e-474889f96a78 - - - - - -] DHCP configuration for ports {'d906e92f-2a14-419f-b7c7-dfd2b9d95ebe'} is completed
Nov 28 10:05:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-5b06168b56e6b283e5d79caea478a11e9296a7b7a714e7c8158e200f6a95c8f4-merged.mount: Deactivated successfully.
Nov 28 10:05:46 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5e1c3cb72577915b7d71f2368eb31dc1c34546039766c4ed24cd04c7c5823d4-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:46 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2da9e7bee2\x2dadb4\x2d4094\x2d93fb\x2d3cf35621d144.mount: Deactivated successfully.
Nov 28 10:05:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:46.180 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1626658574' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:46.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:46.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:05:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:46.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:05:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:46.254 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:05:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:46.684 2 INFO neutron.agent.securitygroups_rpc [None req-fda551bc-bdc3-478a-93b4-a620175c516e 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:47.122 2 INFO neutron.agent.securitygroups_rpc [None req-e42eed40-5a16-457a-8c1b-352e3dbcff3e e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:47.164 2 INFO neutron.agent.securitygroups_rpc [None req-fda551bc-bdc3-478a-93b4-a620175c516e 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e153 e153: 6 total, 6 up, 6 in
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.267 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.267 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:05:47 np0005538515.localdomain ceph-mon[301134]: pgmap v293: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 2.5 KiB/s wr, 84 op/s
Nov 28 10:05:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2969744802' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.268 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.268 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.269 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.558 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:47 np0005538515.localdomain dnsmasq[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/addn_hosts - 1 addresses
Nov 28 10:05:47 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/host
Nov 28 10:05:47 np0005538515.localdomain podman[316188]: 2025-11-28 10:05:47.666395164 +0000 UTC m=+0.070938505 container kill fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:47 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/opts
Nov 28 10:05:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:05:47 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/692503189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.736 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:05:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 921 B/s wr, 29 op/s
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.912 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.913 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11556MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.913 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:05:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:47.913 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:05:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:47.965 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:43Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce492130>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce469d00>, <neutron.agent.linux.dhcp.DictModel object at 0x7f19ce492f10>, <neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4924f0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4695b0>], id=2266342c-b2ca-4cdd-98a9-2015ce48326f, ip_allocation=immediate, mac_address=fa:16:3e:42:cc:c5, name=tempest-new-port-name-1521843568, network_id=19e7252b-db63-489c-9a93-8026377ebe8c, port_security_enabled=True, project_id=79185418333d4a93b24c87e39a4a1847, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['f7d47ffa-9780-427b-aaf2-f0de3a638f8a'], standard_attr_id=2065, status=DOWN, tags=[], tenant_id=79185418333d4a93b24c87e39a4a1847, updated_at=2025-11-28T10:05:47Z on network 19e7252b-db63-489c-9a93-8026377ebe8c
Nov 28 10:05:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:47.979 261346 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Nov 28 10:05:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:47.980 261346 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Nov 28 10:05:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:47.980 261346 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Nov 28 10:05:48 np0005538515.localdomain dnsmasq[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/addn_hosts - 1 addresses
Nov 28 10:05:48 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/host
Nov 28 10:05:48 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/opts
Nov 28 10:05:48 np0005538515.localdomain podman[316229]: 2025-11-28 10:05:48.121314646 +0000 UTC m=+0.044607568 container kill fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.190 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.191 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:05:48 np0005538515.localdomain ceph-mon[301134]: osdmap e153: 6 total, 6 up, 6 in
Nov 28 10:05:48 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/692503189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:48 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3353686040' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.287 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:05:48 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:48.349 2 INFO neutron.agent.securitygroups_rpc [None req-7a28cb1b-8d30-40dd-8ac7-551ea3c1b87c e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']
Nov 28 10:05:48 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:48.376 261346 INFO neutron.agent.dhcp.agent [None req-93a0b611-3b28-46d3-86a4-ec7d6c12c009 - - - - - -] DHCP configuration for ports {'2266342c-b2ca-4cdd-98a9-2015ce48326f'} is completed
Nov 28 10:05:48 np0005538515.localdomain dnsmasq[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/addn_hosts - 0 addresses
Nov 28 10:05:48 np0005538515.localdomain podman[316284]: 2025-11-28 10:05:48.541656709 +0000 UTC m=+0.055291665 container kill fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:05:48 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/host
Nov 28 10:05:48 np0005538515.localdomain dnsmasq-dhcp[316013]: read /var/lib/neutron/dhcp/19e7252b-db63-489c-9a93-8026377ebe8c/opts
Nov 28 10:05:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:05:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1333398748' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.740 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.747 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.762 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.765 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.765 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:05:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:48.808 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:49.000 2 INFO neutron.agent.securitygroups_rpc [None req-0fb2cb74-ebe1-4595-9145-49dcf60353ff 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:49 np0005538515.localdomain dnsmasq[316013]: exiting on receipt of SIGTERM
Nov 28 10:05:49 np0005538515.localdomain podman[316323]: 2025-11-28 10:05:49.018344678 +0000 UTC m=+0.060012280 container kill fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:05:49 np0005538515.localdomain systemd[1]: libpod-fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18.scope: Deactivated successfully.
Nov 28 10:05:49 np0005538515.localdomain podman[316335]: 2025-11-28 10:05:49.091126189 +0000 UTC m=+0.058034769 container died fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:05:49 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:49 np0005538515.localdomain podman[316335]: 2025-11-28 10:05:49.127294177 +0000 UTC m=+0.094203077 container cleanup fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:05:49 np0005538515.localdomain systemd[1]: libpod-conmon-fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18.scope: Deactivated successfully.
Nov 28 10:05:49 np0005538515.localdomain podman[316337]: 2025-11-28 10:05:49.168673496 +0000 UTC m=+0.128317904 container remove fc1c2e6c8e588cd0ca5bbf1694bcef69f472745fa13bbc4a9abc9b4ba2d19e18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-19e7252b-db63-489c-9a93-8026377ebe8c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:05:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:49.179 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:49Z|00131|binding|INFO|Releasing lport da869e93-67dd-47bf-bfed-34a8468836ea from this chassis (sb_readonly=0)
Nov 28 10:05:49 np0005538515.localdomain kernel: device tapda869e93-67 left promiscuous mode
Nov 28 10:05:49 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:49Z|00132|binding|INFO|Setting lport da869e93-67dd-47bf-bfed-34a8468836ea down in Southbound
Nov 28 10:05:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:49.191 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-19e7252b-db63-489c-9a93-8026377ebe8c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19e7252b-db63-489c-9a93-8026377ebe8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79185418333d4a93b24c87e39a4a1847', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c54a0e-746c-4aa6-aa68-60e31c1cb041, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=da869e93-67dd-47bf-bfed-34a8468836ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:49.193 158530 INFO neutron.agent.ovn.metadata.agent [-] Port da869e93-67dd-47bf-bfed-34a8468836ea in datapath 19e7252b-db63-489c-9a93-8026377ebe8c unbound from our chassis
Nov 28 10:05:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:49.194 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 19e7252b-db63-489c-9a93-8026377ebe8c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:05:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:49.196 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[620d4250-43f4-4571-8476-ee8dfd7061eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:49.200 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e154 e154: 6 total, 6 up, 6 in
Nov 28 10:05:49 np0005538515.localdomain ceph-mon[301134]: pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 921 B/s wr, 29 op/s
Nov 28 10:05:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1333398748' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:49 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:49.399 2 INFO neutron.agent.securitygroups_rpc [None req-a6f5628a-dc81-40c9-a579-68393376dce2 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']
Nov 28 10:05:49 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:49.417 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:49 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:49.643 261346 INFO neutron.agent.dhcp.agent [None req-60053e95-c58a-45b1-9c1d-d82a22fba80a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:49 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:49.644 261346 INFO neutron.agent.dhcp.agent [None req-60053e95-c58a-45b1-9c1d-d82a22fba80a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:49 np0005538515.localdomain dnsmasq[315260]: exiting on receipt of SIGTERM
Nov 28 10:05:49 np0005538515.localdomain podman[316383]: 2025-11-28 10:05:49.726262955 +0000 UTC m=+0.065495328 container kill b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:49 np0005538515.localdomain systemd[1]: libpod-b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4.scope: Deactivated successfully.
Nov 28 10:05:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:49.762 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v297: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 511 B/s wr, 1 op/s
Nov 28 10:05:49 np0005538515.localdomain podman[316397]: 2025-11-28 10:05:49.79721152 +0000 UTC m=+0.057629278 container died b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:49 np0005538515.localdomain podman[316397]: 2025-11-28 10:05:49.823304989 +0000 UTC m=+0.083722647 container cleanup b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:05:49 np0005538515.localdomain systemd[1]: libpod-conmon-b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4.scope: Deactivated successfully.
Nov 28 10:05:49 np0005538515.localdomain podman[316399]: 2025-11-28 10:05:49.881552785 +0000 UTC m=+0.132897075 container remove b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76551b5f-5d3c-486b-8256-6697e6d961af, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:05:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-997025a36efb1e50d9ffe44c2974a8638074eb37fef508bc2be0cc49df956a21-merged.mount: Deactivated successfully.
Nov 28 10:05:50 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d19e7252b\x2ddb63\x2d489c\x2d9a93\x2d8026377ebe8c.mount: Deactivated successfully.
Nov 28 10:05:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-f596bc5e7c28d7a22b37b15d00f7baa6c61a1b94d6dc8c55a4dae31f2f97828d-merged.mount: Deactivated successfully.
Nov 28 10:05:50 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1786ca741ddada675528e8264626c275f0e00eead8f1394ae47def187e20ad4-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:50 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:50.149 261346 INFO neutron.agent.dhcp.agent [None req-5670840b-f6bb-4d1d-bf62-8fafb33cdb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:50 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d76551b5f\x2d5d3c\x2d486b\x2d8256\x2d6697e6d961af.mount: Deactivated successfully.
Nov 28 10:05:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:50.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:50 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:50.293 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:50 np0005538515.localdomain ceph-mon[301134]: osdmap e154: 6 total, 6 up, 6 in
Nov 28 10:05:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3995934598' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:05:50 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:50.461 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:50.848 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:05:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:50.849 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:05:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:50.849 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:05:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:51.111 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:51 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:51.276 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:51 np0005538515.localdomain ceph-mon[301134]: pgmap v297: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 511 B/s wr, 1 op/s
Nov 28 10:05:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1247585138' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1247585138' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v298: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 KiB/s wr, 55 op/s
Nov 28 10:05:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:52.581 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:52.704 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:05:53 np0005538515.localdomain ceph-mon[301134]: pgmap v298: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 KiB/s wr, 55 op/s
Nov 28 10:05:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v299: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 KiB/s wr, 55 op/s
Nov 28 10:05:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:53.837 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:54.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:05:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:05:54 np0005538515.localdomain podman[316427]: 2025-11-28 10:05:54.985274574 +0000 UTC m=+0.081987564 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter)
Nov 28 10:05:54 np0005538515.localdomain podman[316427]: 2025-11-28 10:05:54.99884205 +0000 UTC m=+0.095555030 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Nov 28 10:05:55 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:05:55 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e155 e155: 6 total, 6 up, 6 in
Nov 28 10:05:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:55.395 261346 INFO neutron.agent.linux.ip_lib [None req-da59e66f-1e4b-44f4-a073-460cabe89314 - - - - - -] Device tap241102ed-e9 cannot be used as it has no MAC address
Nov 28 10:05:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:55.419 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538515.localdomain kernel: device tap241102ed-e9 entered promiscuous mode
Nov 28 10:05:55 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324355.4273] manager: (tap241102ed-e9): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Nov 28 10:05:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:55.429 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:55Z|00133|binding|INFO|Claiming lport 241102ed-e98f-481c-af36-a58d0b82a130 for this chassis.
Nov 28 10:05:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:55Z|00134|binding|INFO|241102ed-e98f-481c-af36-a58d0b82a130: Claiming unknown
Nov 28 10:05:55 np0005538515.localdomain systemd-udevd[316457]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:05:55 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:55.439 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a67d7f32f5e49c3aed3e09278dd6c95', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3e102bc-7fe3-462e-8d76-cae98ee17de5, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=241102ed-e98f-481c-af36-a58d0b82a130) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:55 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:55.441 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 241102ed-e98f-481c-af36-a58d0b82a130 in datapath 3a59b0d3-6d6c-43cf-8506-00d3024e1dd5 bound to our chassis
Nov 28 10:05:55 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:55.443 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port c7eb739b-fca4-4c4a-bcd6-eddbc69e6fb0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:05:55 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:55.444 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:55 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:55.444 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c64018e0-3abc-4bde-bb05-14c12f5079a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:55 np0005538515.localdomain ceph-mon[301134]: pgmap v299: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.2 KiB/s wr, 55 op/s
Nov 28 10:05:55 np0005538515.localdomain ceph-mon[301134]: osdmap e155: 6 total, 6 up, 6 in
Nov 28 10:05:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:55Z|00135|binding|INFO|Setting lport 241102ed-e98f-481c-af36-a58d0b82a130 ovn-installed in OVS
Nov 28 10:05:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:55Z|00136|binding|INFO|Setting lport 241102ed-e98f-481c-af36-a58d0b82a130 up in Southbound
Nov 28 10:05:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:55.479 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:55.515 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:55.542 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v301: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 1.7 KiB/s wr, 54 op/s
Nov 28 10:05:56 np0005538515.localdomain podman[316512]: 
Nov 28 10:05:56 np0005538515.localdomain podman[316512]: 2025-11-28 10:05:56.486436552 +0000 UTC m=+0.089008349 container create 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:56 np0005538515.localdomain systemd[1]: Started libpod-conmon-8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf.scope.
Nov 28 10:05:56 np0005538515.localdomain podman[316512]: 2025-11-28 10:05:56.442514766 +0000 UTC m=+0.045086623 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:05:56 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:05:56 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c626be514af0ce8e815152f86e5ad34ffc6952c7e54fd8d57a5b98c9ea4b60f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:05:56 np0005538515.localdomain podman[316512]: 2025-11-28 10:05:56.569980642 +0000 UTC m=+0.172552439 container init 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:05:56 np0005538515.localdomain podman[316512]: 2025-11-28 10:05:56.578196714 +0000 UTC m=+0.180768511 container start 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:05:56 np0005538515.localdomain dnsmasq[316529]: started, version 2.85 cachesize 150
Nov 28 10:05:56 np0005538515.localdomain dnsmasq[316529]: DNS service limited to local subnets
Nov 28 10:05:56 np0005538515.localdomain dnsmasq[316529]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:05:56 np0005538515.localdomain dnsmasq[316529]: warning: no upstream servers configured
Nov 28 10:05:56 np0005538515.localdomain dnsmasq-dhcp[316529]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:05:56 np0005538515.localdomain dnsmasq[316529]: read /var/lib/neutron/dhcp/3a59b0d3-6d6c-43cf-8506-00d3024e1dd5/addn_hosts - 0 addresses
Nov 28 10:05:56 np0005538515.localdomain dnsmasq-dhcp[316529]: read /var/lib/neutron/dhcp/3a59b0d3-6d6c-43cf-8506-00d3024e1dd5/host
Nov 28 10:05:56 np0005538515.localdomain dnsmasq-dhcp[316529]: read /var/lib/neutron/dhcp/3a59b0d3-6d6c-43cf-8506-00d3024e1dd5/opts
Nov 28 10:05:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:05:56 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:56.795 261346 INFO neutron.agent.dhcp.agent [None req-1def1e25-c684-4f63-8aa1-117fef23afc0 - - - - - -] DHCP configuration for ports {'4abc60f3-e168-4dec-9d99-99bf38eae7a0'} is completed
Nov 28 10:05:57 np0005538515.localdomain ceph-mon[301134]: pgmap v301: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 1.7 KiB/s wr, 54 op/s
Nov 28 10:05:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:05:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:05:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:05:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:05:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:05:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:57.618 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v302: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 KiB/s wr, 52 op/s
Nov 28 10:05:58 np0005538515.localdomain podman[316547]: 2025-11-28 10:05:58.246849006 +0000 UTC m=+0.054100809 container kill 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:05:58 np0005538515.localdomain dnsmasq[316529]: exiting on receipt of SIGTERM
Nov 28 10:05:58 np0005538515.localdomain systemd[1]: libpod-8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf.scope: Deactivated successfully.
Nov 28 10:05:58 np0005538515.localdomain podman[316559]: 2025-11-28 10:05:58.303018698 +0000 UTC m=+0.043785964 container died 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:58 np0005538515.localdomain podman[316559]: 2025-11-28 10:05:58.332872742 +0000 UTC m=+0.073639938 container cleanup 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:05:58 np0005538515.localdomain systemd[1]: libpod-conmon-8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf.scope: Deactivated successfully.
Nov 28 10:05:58 np0005538515.localdomain podman[316561]: 2025-11-28 10:05:58.369807575 +0000 UTC m=+0.106314689 container remove 8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:05:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:58.382 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:58 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:58Z|00137|binding|INFO|Releasing lport 241102ed-e98f-481c-af36-a58d0b82a130 from this chassis (sb_readonly=0)
Nov 28 10:05:58 np0005538515.localdomain kernel: device tap241102ed-e9 left promiscuous mode
Nov 28 10:05:58 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:05:58Z|00138|binding|INFO|Setting lport 241102ed-e98f-481c-af36-a58d0b82a130 down in Southbound
Nov 28 10:05:58 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:58.391 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a59b0d3-6d6c-43cf-8506-00d3024e1dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a67d7f32f5e49c3aed3e09278dd6c95', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3e102bc-7fe3-462e-8d76-cae98ee17de5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=241102ed-e98f-481c-af36-a58d0b82a130) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:05:58 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:58.393 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 241102ed-e98f-481c-af36-a58d0b82a130 in datapath 3a59b0d3-6d6c-43cf-8506-00d3024e1dd5 unbound from our chassis
Nov 28 10:05:58 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:58.396 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a59b0d3-6d6c-43cf-8506-00d3024e1dd5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:05:58 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:05:58.397 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[9eee3137-4672-4663-b0f0-d3f50d532006]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:05:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:58.414 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:58.416 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:58 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1783542772' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:58 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1783542772' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:58 np0005538515.localdomain systemd[1]: tmp-crun.0VqOhB.mount: Deactivated successfully.
Nov 28 10:05:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c626be514af0ce8e815152f86e5ad34ffc6952c7e54fd8d57a5b98c9ea4b60f6-merged.mount: Deactivated successfully.
Nov 28 10:05:58 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ce55d06dc884b3adea11f4b138a43364413ea7f9c45256bc2b458e88b37eecf-userdata-shm.mount: Deactivated successfully.
Nov 28 10:05:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:05:58.692 261346 INFO neutron.agent.dhcp.agent [None req-4c3bf9c5-dfb1-4524-b422-cffea4019160 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:05:58 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d3a59b0d3\x2d6d6c\x2d43cf\x2d8506\x2d00d3024e1dd5.mount: Deactivated successfully.
Nov 28 10:05:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:05:58.870 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:05:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:05:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:05:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:05:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:05:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:05:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19212 "" "Go-http-client/1.1"
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: pgmap v302: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 KiB/s wr, 52 op/s
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1783542772' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1783542772' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:05:59 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:05:59.715 2 INFO neutron.agent.securitygroups_rpc [None req-eb7cbd19-2bf0-4a29-9cd5-8cfb0b13af7b 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:05:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v303: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 1.8 KiB/s wr, 44 op/s
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2237297014' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:05:59 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2237297014' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:00.261 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:00.261 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:06:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e156 e156: 6 total, 6 up, 6 in
Nov 28 10:06:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/753870557' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2237297014' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2237297014' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:06:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:06:01 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:01.206 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e157 e157: 6 total, 6 up, 6 in
Nov 28 10:06:01 np0005538515.localdomain ceph-mon[301134]: pgmap v303: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 1.8 KiB/s wr, 44 op/s
Nov 28 10:06:01 np0005538515.localdomain ceph-mon[301134]: osdmap e156: 6 total, 6 up, 6 in
Nov 28 10:06:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v306: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 3.2 MiB/s wr, 125 op/s
Nov 28 10:06:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:01.881 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:02 np0005538515.localdomain ceph-mon[301134]: osdmap e157: 6 total, 6 up, 6 in
Nov 28 10:06:02 np0005538515.localdomain ceph-mon[301134]: pgmap v306: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 3.2 MiB/s wr, 125 op/s
Nov 28 10:06:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e158 e158: 6 total, 6 up, 6 in
Nov 28 10:06:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:02.622 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:02 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:02.695 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:02.862 2 INFO neutron.agent.securitygroups_rpc [None req-e67102ba-fe9d-44cc-8c94-469bc136f71c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:03.020 2 INFO neutron.agent.securitygroups_rpc [None req-e67102ba-fe9d-44cc-8c94-469bc136f71c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.171 261346 INFO neutron.agent.dhcp.agent [None req-694fd91a-974f-4128-82f1-92e341d4f45d - - - - - -] Synchronizing state
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.305 261346 INFO neutron.agent.dhcp.agent [None req-b04f7734-15e3-4a1f-ac37-0c3cc54f42ff - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.307 261346 INFO neutron.agent.dhcp.agent [-] Starting network 42de65c3-1b93-417a-9a72-a70c0a174963 dhcp configuration
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.307 261346 INFO neutron.agent.dhcp.agent [-] Finished network 42de65c3-1b93-417a-9a72-a70c0a174963 dhcp configuration
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.308 261346 INFO neutron.agent.dhcp.agent [-] Starting network 744b5a82-3c5c-4b41-ba44-527244a209c4 dhcp configuration
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.308 261346 INFO neutron.agent.dhcp.agent [-] Finished network 744b5a82-3c5c-4b41-ba44-527244a209c4 dhcp configuration
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.308 261346 INFO neutron.agent.dhcp.agent [-] Starting network 91fe7ebb-e32d-453f-9285-23228e6cf776 dhcp configuration
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.309 261346 INFO neutron.agent.dhcp.agent [-] Finished network 91fe7ebb-e32d-453f-9285-23228e6cf776 dhcp configuration
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.309 261346 INFO neutron.agent.dhcp.agent [None req-b04f7734-15e3-4a1f-ac37-0c3cc54f42ff - - - - - -] Synchronizing state complete
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.310 261346 INFO neutron.agent.dhcp.agent [None req-c62c5ab5-d92f-4c9b-95cd-e42bc27bc915 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e159 e159: 6 total, 6 up, 6 in
Nov 28 10:06:03 np0005538515.localdomain ceph-mon[301134]: osdmap e158: 6 total, 6 up, 6 in
Nov 28 10:06:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:03.627 2 INFO neutron.agent.securitygroups_rpc [None req-06fb4107-5237-41cb-aedf-7ec837b7d0c6 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.653 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v309: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 5.3 MiB/s wr, 206 op/s
Nov 28 10:06:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:03.854 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:03.911 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:04.294 2 INFO neutron.agent.securitygroups_rpc [None req-f2f37d61-bf24-49e1-9d4d-20b98269f559 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:04.346 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:04 np0005538515.localdomain ceph-mon[301134]: osdmap e159: 6 total, 6 up, 6 in
Nov 28 10:06:04 np0005538515.localdomain ceph-mon[301134]: pgmap v309: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 5.3 MiB/s wr, 206 op/s
Nov 28 10:06:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:04.844 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:06:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:06:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:06:04 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:06:04 np0005538515.localdomain systemd[1]: tmp-crun.UfgAs0.mount: Deactivated successfully.
Nov 28 10:06:04 np0005538515.localdomain podman[316587]: 2025-11-28 10:06:04.983471922 +0000 UTC m=+0.086491781 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:06:04 np0005538515.localdomain podman[316587]: 2025-11-28 10:06:04.996859873 +0000 UTC m=+0.099879732 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 10:06:05 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1537040644' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1537040644' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:05 np0005538515.localdomain podman[316589]: 2025-11-28 10:06:05.08485759 +0000 UTC m=+0.179103930 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:05 np0005538515.localdomain podman[316589]: 2025-11-28 10:06:05.093419363 +0000 UTC m=+0.187665713 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:06:05 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:06:05 np0005538515.localdomain podman[316595]: 2025-11-28 10:06:05.057963226 +0000 UTC m=+0.146998976 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e160 e160: 6 total, 6 up, 6 in
Nov 28 10:06:05 np0005538515.localdomain podman[316588]: 2025-11-28 10:06:05.143134556 +0000 UTC m=+0.240781690 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:06:05 np0005538515.localdomain podman[316595]: 2025-11-28 10:06:05.192573451 +0000 UTC m=+0.281609261 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:06:05 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:06:05 np0005538515.localdomain podman[316588]: 2025-11-28 10:06:05.225521181 +0000 UTC m=+0.323168305 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:05 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:06:05
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_metadata', 'volumes', 'manila_data', 'backups', 'images', '.mgr', 'vms']
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v311: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 5.0 MiB/s wr, 194 op/s
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1537040644' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1537040644' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:05 np0005538515.localdomain ceph-mon[301134]: osdmap e160: 6 total, 6 up, 6 in
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.635783082077052e-06 of space, bias 1.0, pg target 0.0003266113553880514 quantized to 32 (current 32)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 8.17891541038526e-07 of space, bias 1.0, pg target 0.00016276041666666666 quantized to 32 (current 32)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.001953125 quantized to 16 (current 16)
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:06:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:06:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:07 np0005538515.localdomain ceph-mon[301134]: pgmap v311: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 5.0 MiB/s wr, 194 op/s
Nov 28 10:06:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:07.655 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v312: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.5 KiB/s wr, 50 op/s
Nov 28 10:06:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:06:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3328946498' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:06:08 np0005538515.localdomain podman[316673]: 2025-11-28 10:06:08.877940582 +0000 UTC m=+0.092029892 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:06:08 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:08.879 261346 INFO neutron.agent.linux.ip_lib [None req-b352c162-d9d7-4ff7-bb47-6368ce0a595e - - - - - -] Device tap0eb8bf5c-38 cannot be used as it has no MAC address
Nov 28 10:06:08 np0005538515.localdomain podman[316673]: 2025-11-28 10:06:08.890731243 +0000 UTC m=+0.104820503 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:06:08 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:06:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:08.908 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:08 np0005538515.localdomain kernel: device tap0eb8bf5c-38 entered promiscuous mode
Nov 28 10:06:08 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324368.9184] manager: (tap0eb8bf5c-38): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Nov 28 10:06:08 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:08Z|00139|binding|INFO|Claiming lport 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 for this chassis.
Nov 28 10:06:08 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:08Z|00140|binding|INFO|0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6: Claiming unknown
Nov 28 10:06:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:08.922 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:08 np0005538515.localdomain systemd-udevd[316704]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:08 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:08.933 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-b2c4ac07-8851-40d3-9495-d0489b67c4c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c4ac07-8851-40d3-9495-d0489b67c4c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3c0d1ce8d854a7b9ffc953e88cd2c44', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=940d6739-e1d9-4dcd-a724-785ba886c2af, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:08 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:08.935 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 in datapath b2c4ac07-8851-40d3-9495-d0489b67c4c3 bound to our chassis
Nov 28 10:06:08 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:08.938 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port 58641ed8-9be5-4200-85c3-ac1f139da10b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:06:08 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:08.939 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c4ac07-8851-40d3-9495-d0489b67c4c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:08 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:08.940 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[b1add40e-5b98-49c3-ab79-6ee062113ceb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:08Z|00141|binding|INFO|Setting lport 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 ovn-installed in OVS
Nov 28 10:06:08 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:08Z|00142|binding|INFO|Setting lport 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 up in Southbound
Nov 28 10:06:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:08.958 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0eb8bf5c-38: No such device
Nov 28 10:06:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:08.997 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:09.033 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:09 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:09.171 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:08Z, description=, device_id=0c7c1c6e-1c46-40f5-85d2-30567725a06d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4f69d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4f6af0>], id=eda15d4b-6e20-425d-85eb-2095daa7da9b, ip_allocation=immediate, mac_address=fa:16:3e:6c:31:2b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2240, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:06:08Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:06:09 np0005538515.localdomain ceph-mon[301134]: pgmap v312: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.5 KiB/s wr, 50 op/s
Nov 28 10:06:09 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3328946498' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e161 e161: 6 total, 6 up, 6 in
Nov 28 10:06:09 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:09.395 2 INFO neutron.agent.securitygroups_rpc [None req-88638c5c-0267-4c55-b76f-e8bedf4b6799 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:09 np0005538515.localdomain systemd[1]: tmp-crun.HZTDMN.mount: Deactivated successfully.
Nov 28 10:06:09 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:06:09 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:06:09 np0005538515.localdomain podman[316757]: 2025-11-28 10:06:09.407935175 +0000 UTC m=+0.069175431 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:06:09 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:06:09 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:09.739 261346 INFO neutron.agent.dhcp.agent [None req-0ef17ad3-b4f0-45a4-9f72-bd0ee6e421f7 - - - - - -] DHCP configuration for ports {'eda15d4b-6e20-425d-85eb-2095daa7da9b'} is completed
Nov 28 10:06:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v314: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 2.4 KiB/s wr, 49 op/s
Nov 28 10:06:09 np0005538515.localdomain podman[316811]: 
Nov 28 10:06:09 np0005538515.localdomain podman[316811]: 2025-11-28 10:06:09.971139476 +0000 UTC m=+0.081942282 container create a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:06:10 np0005538515.localdomain systemd[1]: Started libpod-conmon-a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1.scope.
Nov 28 10:06:10 np0005538515.localdomain systemd[1]: tmp-crun.yBOGfI.mount: Deactivated successfully.
Nov 28 10:06:10 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:10 np0005538515.localdomain podman[316811]: 2025-11-28 10:06:09.934536974 +0000 UTC m=+0.045339810 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:10 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a112523a391658be219e8ca2b94928afac8124141e68c4a75a8a0c64ca4d98f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:10 np0005538515.localdomain podman[316811]: 2025-11-28 10:06:10.04857967 +0000 UTC m=+0.159382476 container init a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:06:10 np0005538515.localdomain podman[316811]: 2025-11-28 10:06:10.059819054 +0000 UTC m=+0.170621860 container start a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:10 np0005538515.localdomain dnsmasq[316830]: started, version 2.85 cachesize 150
Nov 28 10:06:10 np0005538515.localdomain dnsmasq[316830]: DNS service limited to local subnets
Nov 28 10:06:10 np0005538515.localdomain dnsmasq[316830]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:10 np0005538515.localdomain dnsmasq[316830]: warning: no upstream servers configured
Nov 28 10:06:10 np0005538515.localdomain dnsmasq-dhcp[316830]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:10 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 0 addresses
Nov 28 10:06:10 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:06:10 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:06:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:10.079 2 INFO neutron.agent.securitygroups_rpc [None req-58c4368b-992c-4171-8380-96e11b260575 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:10 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:10.189 261346 INFO neutron.agent.dhcp.agent [None req-34377480-17b1-454e-befc-c8d8416758ad - - - - - -] DHCP configuration for ports {'f9c3bbdf-3272-4132-8f82-3ca63ccc1570'} is completed
Nov 28 10:06:10 np0005538515.localdomain ceph-mon[301134]: osdmap e161: 6 total, 6 up, 6 in
Nov 28 10:06:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:10 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/144526477' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:10 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/144526477' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:10.914 2 INFO neutron.agent.securitygroups_rpc [None req-dbbfa92a-6f48-4219-96e5-450182918d3d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:11.003 2 INFO neutron.agent.securitygroups_rpc [None req-dbbfa92a-6f48-4219-96e5-450182918d3d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e162 e162: 6 total, 6 up, 6 in
Nov 28 10:06:11 np0005538515.localdomain ceph-mon[301134]: pgmap v314: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 2.4 KiB/s wr, 49 op/s
Nov 28 10:06:11 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/144526477' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:11 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/144526477' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:11 np0005538515.localdomain ceph-mon[301134]: osdmap e162: 6 total, 6 up, 6 in
Nov 28 10:06:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:11.325 2 INFO neutron.agent.securitygroups_rpc [None req-a2df05ee-0a35-442d-aa18-eb9de4dda01c e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:11.343 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:11.785 2 INFO neutron.agent.securitygroups_rpc [None req-cbfd0cad-e26e-4d58-9458-36c5ead2083c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v316: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 5.9 KiB/s wr, 157 op/s
Nov 28 10:06:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:11.845 2 INFO neutron.agent.securitygroups_rpc [None req-841a5df4-1f75-4611-8939-235283ca6a97 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:11.864 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:06:11 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:11.882 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:11 np0005538515.localdomain podman[316831]: 2025-11-28 10:06:11.979577351 +0000 UTC m=+0.087893245 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:12 np0005538515.localdomain podman[316831]: 2025-11-28 10:06:12.021727253 +0000 UTC m=+0.130043107 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:06:12 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:06:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e163 e163: 6 total, 6 up, 6 in
Nov 28 10:06:12 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/170238483' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:12 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/170238483' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:12 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:12.378 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:12.694 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:13 np0005538515.localdomain ceph-mon[301134]: pgmap v316: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 5.9 KiB/s wr, 157 op/s
Nov 28 10:06:13 np0005538515.localdomain ceph-mon[301134]: osdmap e163: 6 total, 6 up, 6 in
Nov 28 10:06:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v318: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 4.0 KiB/s wr, 124 op/s
Nov 28 10:06:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:13.960 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:14 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:14.064 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:13Z, description=, device_id=0c7c1c6e-1c46-40f5-85d2-30567725a06d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce446370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce446820>], id=e7862df1-355e-4f97-9168-e87770babee6, ip_allocation=immediate, mac_address=fa:16:3e:40:b9:f1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:04Z, description=, dns_domain=, id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1354719988-network, port_security_enabled=True, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64958, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2202, status=ACTIVE, subnets=['f4b4dc5d-f654-46e4-8ff2-bd52eff10306'], tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:06:06Z, vlan_transparent=None, network_id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, port_security_enabled=False, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2267, status=DOWN, tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:06:13Z on network b2c4ac07-8851-40d3-9495-d0489b67c4c3
Nov 28 10:06:14 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 1 addresses
Nov 28 10:06:14 np0005538515.localdomain podman[316867]: 2025-11-28 10:06:14.268658667 +0000 UTC m=+0.057684639 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:06:14 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:06:14 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e164 e164: 6 total, 6 up, 6 in
Nov 28 10:06:14 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:14.647 261346 INFO neutron.agent.dhcp.agent [None req-3eaedd66-05ec-4a99-8df9-df7c0f22a0c5 - - - - - -] DHCP configuration for ports {'e7862df1-355e-4f97-9168-e87770babee6'} is completed
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3736797209' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3736797209' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:15 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:15.017 2 INFO neutron.agent.securitygroups_rpc [None req-e90d3b0d-d250-425f-9007-eccb585011b0 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e165 e165: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538515.localdomain ceph-mon[301134]: pgmap v318: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 4.0 KiB/s wr, 124 op/s
Nov 28 10:06:15 np0005538515.localdomain ceph-mon[301134]: osdmap e164: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3736797209' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:15 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3736797209' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:15 np0005538515.localdomain ceph-mon[301134]: osdmap e165: 6 total, 6 up, 6 in
Nov 28 10:06:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v321: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 5.2 KiB/s wr, 160 op/s
Nov 28 10:06:16 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:15.999 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:15Z, description=, device_id=5b45f823-0eca-4648-936e-96781a85013b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce432940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce432b20>], id=951fee01-fb0c-4d4d-b345-0c23456eb150, ip_allocation=immediate, mac_address=fa:16:3e:95:f7:27, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2278, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:06:15Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:06:16 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:16.028 2 INFO neutron.agent.securitygroups_rpc [None req-3771b2dc-83c7-4322-8cb3-68ef5f3840bb e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:16 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:06:16 np0005538515.localdomain podman[316904]: 2025-11-28 10:06:16.251249531 +0000 UTC m=+0.061069873 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:06:16 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:06:16 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:06:16 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:16.511 261346 INFO neutron.agent.dhcp.agent [None req-b559ac96-b038-4740-a953-bf001a3a52b2 - - - - - -] DHCP configuration for ports {'951fee01-fb0c-4d4d-b345-0c23456eb150'} is completed
Nov 28 10:06:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:17.155 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:13Z, description=, device_id=0c7c1c6e-1c46-40f5-85d2-30567725a06d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5e0400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ceeac3a0>], id=e7862df1-355e-4f97-9168-e87770babee6, ip_allocation=immediate, mac_address=fa:16:3e:40:b9:f1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:04Z, description=, dns_domain=, id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1354719988-network, port_security_enabled=True, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64958, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2202, status=ACTIVE, subnets=['f4b4dc5d-f654-46e4-8ff2-bd52eff10306'], tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:06:06Z, vlan_transparent=None, network_id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, port_security_enabled=False, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2267, status=DOWN, tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:06:13Z on network b2c4ac07-8851-40d3-9495-d0489b67c4c3
Nov 28 10:06:17 np0005538515.localdomain ceph-mon[301134]: pgmap v321: 177 pgs: 177 active+clean; 145 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 5.2 KiB/s wr, 160 op/s
Nov 28 10:06:17 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 1 addresses
Nov 28 10:06:17 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:06:17 np0005538515.localdomain systemd[1]: tmp-crun.RgNVpV.mount: Deactivated successfully.
Nov 28 10:06:17 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:06:17 np0005538515.localdomain podman[316939]: 2025-11-28 10:06:17.396004135 +0000 UTC m=+0.062960740 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:17.672 261346 INFO neutron.agent.dhcp.agent [None req-82077205-b7ef-4c5a-bfd5-7f3b6fc5d19b - - - - - -] DHCP configuration for ports {'e7862df1-355e-4f97-9168-e87770babee6'} is completed
Nov 28 10:06:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:17.696 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v322: 177 pgs: 177 active+clean; 145 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.5 KiB/s wr, 65 op/s
Nov 28 10:06:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e166 e166: 6 total, 6 up, 6 in
Nov 28 10:06:18 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3947693664' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:18 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:18.785 261346 INFO neutron.agent.linux.ip_lib [None req-76fba609-54ed-4c5d-8243-74b30a1b7e43 - - - - - -] Device tapa14f394f-d8 cannot be used as it has no MAC address
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.842 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain kernel: device tapa14f394f-d8 entered promiscuous mode
Nov 28 10:06:18 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324378.8513] manager: (tapa14f394f-d8): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.850 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:18Z|00143|binding|INFO|Claiming lport a14f394f-d877-4b96-b490-240cee4c17a9 for this chassis.
Nov 28 10:06:18 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:18Z|00144|binding|INFO|a14f394f-d877-4b96-b490-240cee4c17a9: Claiming unknown
Nov 28 10:06:18 np0005538515.localdomain systemd-udevd[316969]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:18 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:18.863 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe07eabe-b5fa-45d0-9144-395b300628e2, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=a14f394f-d877-4b96-b490-240cee4c17a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:18 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:18.865 158530 INFO neutron.agent.ovn.metadata.agent [-] Port a14f394f-d877-4b96-b490-240cee4c17a9 in datapath 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4 bound to our chassis
Nov 28 10:06:18 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:18.867 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:18 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:18.868 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[3a29e55a-d24a-42e2-ae3b-3f81fe38bed0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.885 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:18Z|00145|binding|INFO|Setting lport a14f394f-d877-4b96-b490-240cee4c17a9 ovn-installed in OVS
Nov 28 10:06:18 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:18Z|00146|binding|INFO|Setting lport a14f394f-d877-4b96-b490-240cee4c17a9 up in Southbound
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.891 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.893 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tapa14f394f-d8: No such device
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.921 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.945 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:18.962 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:19 np0005538515.localdomain ceph-mon[301134]: pgmap v322: 177 pgs: 177 active+clean; 145 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 2.5 KiB/s wr, 65 op/s
Nov 28 10:06:19 np0005538515.localdomain ceph-mon[301134]: osdmap e166: 6 total, 6 up, 6 in
Nov 28 10:06:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e167 e167: 6 total, 6 up, 6 in
Nov 28 10:06:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v325: 177 pgs: 177 active+clean; 145 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 2.7 KiB/s wr, 72 op/s
Nov 28 10:06:19 np0005538515.localdomain podman[317041]: 2025-11-28 10:06:19.981184676 +0000 UTC m=+0.093917259 container create 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:20 np0005538515.localdomain systemd[1]: Started libpod-conmon-57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8.scope.
Nov 28 10:06:20 np0005538515.localdomain podman[317041]: 2025-11-28 10:06:19.935795255 +0000 UTC m=+0.048527868 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:20 np0005538515.localdomain systemd[1]: tmp-crun.GlTCmu.mount: Deactivated successfully.
Nov 28 10:06:20 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da9ef93f4d18acab10afc53cca5f72b138d7269b0830746f92d68d65b0e6d6ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:20 np0005538515.localdomain podman[317041]: 2025-11-28 10:06:20.085955708 +0000 UTC m=+0.198688251 container init 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:20 np0005538515.localdomain podman[317041]: 2025-11-28 10:06:20.09681483 +0000 UTC m=+0.209547403 container start 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:06:20 np0005538515.localdomain dnsmasq[317059]: started, version 2.85 cachesize 150
Nov 28 10:06:20 np0005538515.localdomain dnsmasq[317059]: DNS service limited to local subnets
Nov 28 10:06:20 np0005538515.localdomain dnsmasq[317059]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:20 np0005538515.localdomain dnsmasq[317059]: warning: no upstream servers configured
Nov 28 10:06:20 np0005538515.localdomain dnsmasq-dhcp[317059]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:20 np0005538515.localdomain dnsmasq[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/addn_hosts - 0 addresses
Nov 28 10:06:20 np0005538515.localdomain dnsmasq-dhcp[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/host
Nov 28 10:06:20 np0005538515.localdomain dnsmasq-dhcp[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/opts
Nov 28 10:06:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e168 e168: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:20.309 261346 INFO neutron.agent.dhcp.agent [None req-1c87ab88-3398-451a-9add-22742c15a1f1 - - - - - -] DHCP configuration for ports {'d6e76183-1869-4503-9c79-f3f08e79db56'} is completed
Nov 28 10:06:20 np0005538515.localdomain ceph-mon[301134]: osdmap e167: 6 total, 6 up, 6 in
Nov 28 10:06:20 np0005538515.localdomain ceph-mon[301134]: osdmap e168: 6 total, 6 up, 6 in
Nov 28 10:06:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e169 e169: 6 total, 6 up, 6 in
Nov 28 10:06:21 np0005538515.localdomain ceph-mon[301134]: pgmap v325: 177 pgs: 177 active+clean; 145 MiB data, 810 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 2.7 KiB/s wr, 72 op/s
Nov 28 10:06:21 np0005538515.localdomain ceph-mon[301134]: osdmap e169: 6 total, 6 up, 6 in
Nov 28 10:06:21 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:21.643 261346 INFO neutron.agent.linux.ip_lib [None req-a88592ad-e761-4ea4-80f7-40a9f3361f28 - - - - - -] Device tapd03b0310-2c cannot be used as it has no MAC address
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.663 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain kernel: device tapd03b0310-2c entered promiscuous mode
Nov 28 10:06:21 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324381.6722] manager: (tapd03b0310-2c): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.673 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:21Z|00147|binding|INFO|Claiming lport d03b0310-2c7a-49f5-a26a-a0b7e61df97d for this chassis.
Nov 28 10:06:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:21Z|00148|binding|INFO|d03b0310-2c7a-49f5-a26a-a0b7e61df97d: Claiming unknown
Nov 28 10:06:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:21.687 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-844c2297-3cba-435a-8224-a5874c8fc772', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-844c2297-3cba-435a-8224-a5874c8fc772', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fcf2c61-5d0b-45c2-9bee-03c636787a2e, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=d03b0310-2c7a-49f5-a26a-a0b7e61df97d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:21.690 158530 INFO neutron.agent.ovn.metadata.agent [-] Port d03b0310-2c7a-49f5-a26a-a0b7e61df97d in datapath 844c2297-3cba-435a-8224-a5874c8fc772 bound to our chassis
Nov 28 10:06:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:21.692 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 844c2297-3cba-435a-8224-a5874c8fc772 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:21.693 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[48e45ea5-a971-4f9d-b208-d208bbf3f997]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.711 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:21Z|00149|binding|INFO|Setting lport d03b0310-2c7a-49f5-a26a-a0b7e61df97d ovn-installed in OVS
Nov 28 10:06:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:21Z|00150|binding|INFO|Setting lport d03b0310-2c7a-49f5-a26a-a0b7e61df97d up in Southbound
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.721 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.723 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.754 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:21.768 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:21Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce407ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce407e80>], id=2cd09136-ad69-4fac-87fd-7f802c66ea93, ip_allocation=immediate, mac_address=fa:16:3e:35:91:e0, name=tempest-PortsTestJSON-1134682245, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:16Z, description=, dns_domain=, id=8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-241150779, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28617, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2281, status=ACTIVE, subnets=['7e1cf227-5201-4b25-a503-c669fcd1fd8a'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:17Z, vlan_transparent=None, network_id=8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2309, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:21Z on network 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4
Nov 28 10:06:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:21.784 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v328: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 4.7 KiB/s wr, 100 op/s
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/addn_hosts - 1 addresses
Nov 28 10:06:22 np0005538515.localdomain podman[317101]: 2025-11-28 10:06:22.045172324 +0000 UTC m=+0.063664662 container kill 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/host
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/opts
Nov 28 10:06:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:22.363 261346 INFO neutron.agent.dhcp.agent [None req-06129683-9dfb-429c-a9ff-ea718def59e4 - - - - - -] DHCP configuration for ports {'2cd09136-ad69-4fac-87fd-7f802c66ea93'} is completed
Nov 28 10:06:22 np0005538515.localdomain sudo[317150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:06:22 np0005538515.localdomain sudo[317150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:06:22 np0005538515.localdomain sudo[317150]: pam_unix(sudo:session): session closed for user root
Nov 28 10:06:22 np0005538515.localdomain podman[317173]: 2025-11-28 10:06:22.699304482 +0000 UTC m=+0.092549088 container create e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:22 np0005538515.localdomain sudo[317188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:06:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:22.743 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:22 np0005538515.localdomain sudo[317188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:06:22 np0005538515.localdomain podman[317173]: 2025-11-28 10:06:22.653458217 +0000 UTC m=+0.046702863 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:22 np0005538515.localdomain systemd[1]: Started libpod-conmon-e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc.scope.
Nov 28 10:06:22 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:22 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7af7e398dae285d45d069130170b8c66994cef9726e98fc0beb2e1c05467b9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:22 np0005538515.localdomain podman[317173]: 2025-11-28 10:06:22.801125763 +0000 UTC m=+0.194370379 container init e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:22 np0005538515.localdomain podman[317173]: 2025-11-28 10:06:22.812216763 +0000 UTC m=+0.205461419 container start e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317213]: started, version 2.85 cachesize 150
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317213]: DNS service limited to local subnets
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317213]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317213]: warning: no upstream servers configured
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317213]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317213]: read /var/lib/neutron/dhcp/844c2297-3cba-435a-8224-a5874c8fc772/addn_hosts - 0 addresses
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317213]: read /var/lib/neutron/dhcp/844c2297-3cba-435a-8224-a5874c8fc772/host
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317213]: read /var/lib/neutron/dhcp/844c2297-3cba-435a-8224-a5874c8fc772/opts
Nov 28 10:06:22 np0005538515.localdomain podman[317230]: 2025-11-28 10:06:22.986149254 +0000 UTC m=+0.064712494 container kill 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:22 np0005538515.localdomain dnsmasq[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/addn_hosts - 0 addresses
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/host
Nov 28 10:06:22 np0005538515.localdomain dnsmasq-dhcp[317059]: read /var/lib/neutron/dhcp/8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4/opts
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.036 261346 INFO neutron.agent.dhcp.agent [None req-7a188e46-e70e-464f-b2c6-b19fdc568a1f - - - - - -] DHCP configuration for ports {'ec1f1e87-c34a-4cb6-8e4b-7a7c76325fc7'} is completed
Nov 28 10:06:23 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:23Z|00151|binding|INFO|Releasing lport d03b0310-2c7a-49f5-a26a-a0b7e61df97d from this chassis (sb_readonly=0)
Nov 28 10:06:23 np0005538515.localdomain kernel: device tapd03b0310-2c left promiscuous mode
Nov 28 10:06:23 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:23Z|00152|binding|INFO|Setting lport d03b0310-2c7a-49f5-a26a-a0b7e61df97d down in Southbound
Nov 28 10:06:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:23.179 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:23.198 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:23.205 158530 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 754f3c1f-ac2e-4666-b728-9ee5f97abccb with type ""
Nov 28 10:06:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:23.206 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-844c2297-3cba-435a-8224-a5874c8fc772', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-844c2297-3cba-435a-8224-a5874c8fc772', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fcf2c61-5d0b-45c2-9bee-03c636787a2e, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=d03b0310-2c7a-49f5-a26a-a0b7e61df97d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:23.207 158530 INFO neutron.agent.ovn.metadata.agent [-] Port d03b0310-2c7a-49f5-a26a-a0b7e61df97d in datapath 844c2297-3cba-435a-8224-a5874c8fc772 unbound from our chassis
Nov 28 10:06:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:23.209 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 844c2297-3cba-435a-8224-a5874c8fc772, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:23 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:23.209 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[acb9f861-6669-4add-a543-8e47811af8e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:23 np0005538515.localdomain sudo[317188]: pam_unix(sudo:session): session closed for user root
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e170 e170: 6 total, 6 up, 6 in
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: pgmap v328: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 4.7 KiB/s wr, 100 op/s
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:06:23 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 4b129d6e-9a4e-4fce-a15d-8ce31e0d01a4 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:06:23 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 4b129d6e-9a4e-4fce-a15d-8ce31e0d01a4 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:06:23 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 4b129d6e-9a4e-4fce-a15d-8ce31e0d01a4 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:06:23 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:06:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:23.566 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:23 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:23.639 2 INFO neutron.agent.securitygroups_rpc [None req-5b881bab-382f-42c7-b1b6-bde09ef38c32 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:23 np0005538515.localdomain dnsmasq[317213]: read /var/lib/neutron/dhcp/844c2297-3cba-435a-8224-a5874c8fc772/addn_hosts - 0 addresses
Nov 28 10:06:23 np0005538515.localdomain dnsmasq-dhcp[317213]: read /var/lib/neutron/dhcp/844c2297-3cba-435a-8224-a5874c8fc772/host
Nov 28 10:06:23 np0005538515.localdomain dnsmasq-dhcp[317213]: read /var/lib/neutron/dhcp/844c2297-3cba-435a-8224-a5874c8fc772/opts
Nov 28 10:06:23 np0005538515.localdomain podman[317302]: 2025-11-28 10:06:23.759593008 +0000 UTC m=+0.062120714 container kill e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent [None req-83103f8b-35de-43cd-9e96-7816cc1f876c - - - - - -] Unable to reload_allocations dhcp for 844c2297-3cba-435a-8224-a5874c8fc772.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd03b0310-2c not found in namespace qdhcp-844c2297-3cba-435a-8224-a5874c8fc772.
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd03b0310-2c not found in namespace qdhcp-844c2297-3cba-435a-8224-a5874c8fc772.
Nov 28 10:06:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:23.785 261346 ERROR neutron.agent.dhcp.agent 
Nov 28 10:06:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v330: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 4.3 KiB/s wr, 91 op/s
Nov 28 10:06:23 np0005538515.localdomain sudo[317313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:06:23 np0005538515.localdomain sudo[317313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:06:23 np0005538515.localdomain sudo[317313]: pam_unix(sudo:session): session closed for user root
Nov 28 10:06:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:23.928 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:23.965 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538515.localdomain podman[317351]: 2025-11-28 10:06:24.131110705 +0000 UTC m=+0.067733748 container kill 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:06:24 np0005538515.localdomain dnsmasq[317059]: exiting on receipt of SIGTERM
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: libpod-57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8.scope: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain podman[317364]: 2025-11-28 10:06:24.21840977 +0000 UTC m=+0.067115428 container died 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: tmp-crun.5PvZRw.mount: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain podman[317364]: 2025-11-28 10:06:24.259423457 +0000 UTC m=+0.108129055 container cleanup 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: libpod-conmon-57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8.scope: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain podman[317365]: 2025-11-28 10:06:24.294163152 +0000 UTC m=+0.137564927 container remove 57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:06:24 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:24Z|00153|binding|INFO|Releasing lport a14f394f-d877-4b96-b490-240cee4c17a9 from this chassis (sb_readonly=0)
Nov 28 10:06:24 np0005538515.localdomain kernel: device tapa14f394f-d8 left promiscuous mode
Nov 28 10:06:24 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:24Z|00154|binding|INFO|Setting lport a14f394f-d877-4b96-b490-240cee4c17a9 down in Southbound
Nov 28 10:06:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:24.311 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:24.325 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe07eabe-b5fa-45d0-9144-395b300628e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=a14f394f-d877-4b96-b490-240cee4c17a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:24.329 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:24.331 158530 INFO neutron.agent.ovn.metadata.agent [-] Port a14f394f-d877-4b96-b490-240cee4c17a9 in datapath 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4 unbound from our chassis
Nov 28 10:06:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:24.334 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:24 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:24.336 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[d53f6ad2-251d-4f25-9d30-c44646e4b251]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: osdmap e170: 6 total, 6 up, 6 in
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:24 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:24.517 2 INFO neutron.agent.securitygroups_rpc [None req-043214bd-8f49-4013-b76d-a6a4f382dbad e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:24 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:24.521 261346 INFO neutron.agent.dhcp.agent [None req-b04f7734-15e3-4a1f-ac37-0c3cc54f42ff - - - - - -] Synchronizing state
Nov 28 10:06:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e171 e171: 6 total, 6 up, 6 in
Nov 28 10:06:24 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:24.732 261346 INFO neutron.agent.dhcp.agent [None req-dffd8019-a902-4d9f-b9a5-2d63c26e7b70 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-da9ef93f4d18acab10afc53cca5f72b138d7269b0830746f92d68d65b0e6d6ba-merged.mount: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57bdded9be6509522a89034674ed2eb843af9f2d8f84f0925466760b8a7d72a8-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d8b55fbca\x2d6e7b\x2d41f3\x2dbf9f\x2dcfd8b31211d4.mount: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain dnsmasq[317213]: exiting on receipt of SIGTERM
Nov 28 10:06:24 np0005538515.localdomain podman[317407]: 2025-11-28 10:06:24.931378421 +0000 UTC m=+0.070182062 container kill e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:24 np0005538515.localdomain systemd[1]: libpod-e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc.scope: Deactivated successfully.
Nov 28 10:06:24 np0005538515.localdomain podman[317419]: 2025-11-28 10:06:24.997124197 +0000 UTC m=+0.053796131 container died e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:25 np0005538515.localdomain podman[317419]: 2025-11-28 10:06:25.028674883 +0000 UTC m=+0.085346817 container cleanup e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:06:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:06:25 np0005538515.localdomain systemd[1]: libpod-conmon-e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc.scope: Deactivated successfully.
Nov 28 10:06:25 np0005538515.localdomain podman[317426]: 2025-11-28 10:06:25.084912497 +0000 UTC m=+0.126228120 container remove e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-844c2297-3cba-435a-8224-a5874c8fc772, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.116 261346 INFO neutron.agent.dhcp.agent [-] Starting network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb dhcp configuration
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.117 261346 INFO neutron.agent.dhcp.agent [-] Finished network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb dhcp configuration
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.117 261346 INFO neutron.agent.dhcp.agent [-] Starting network 744b5a82-3c5c-4b41-ba44-527244a209c4 dhcp configuration
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.118 261346 INFO neutron.agent.dhcp.agent [-] Finished network 744b5a82-3c5c-4b41-ba44-527244a209c4 dhcp configuration
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.118 261346 INFO neutron.agent.dhcp.agent [-] Starting network 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4 dhcp configuration
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.118 261346 INFO neutron.agent.dhcp.agent [-] Finished network 8b55fbca-6e7b-41f3-bf9f-cfd8b31211d4 dhcp configuration
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.119 261346 INFO neutron.agent.dhcp.agent [None req-ba59b665-bbba-47f6-9efa-63a243c41137 - - - - - -] Synchronizing state complete
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.120 261346 INFO neutron.agent.dhcp.agent [None req-1f11cdf6-59fe-4b85-8008-6ece80439f46 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.121 261346 INFO neutron.agent.dhcp.agent [None req-1f11cdf6-59fe-4b85-8008-6ece80439f46 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.121 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:25 np0005538515.localdomain podman[317445]: 2025-11-28 10:06:25.134470896 +0000 UTC m=+0.090814365 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:06:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:25.135 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e172 e172: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538515.localdomain podman[317445]: 2025-11-28 10:06:25.16854139 +0000 UTC m=+0.124884899 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 28 10:06:25 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:06:25 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:25.422 2 INFO neutron.agent.securitygroups_rpc [None req-dd385026-e816-420f-a351-7652f9735a8b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:25 np0005538515.localdomain ceph-mon[301134]: pgmap v330: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 4.3 KiB/s wr, 91 op/s
Nov 28 10:06:25 np0005538515.localdomain ceph-mon[301134]: osdmap e171: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538515.localdomain ceph-mon[301134]: osdmap e172: 6 total, 6 up, 6 in
Nov 28 10:06:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d7af7e398dae285d45d069130170b8c66994cef9726e98fc0beb2e1c05467b9a-merged.mount: Deactivated successfully.
Nov 28 10:06:25 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e030229e49106417441f6cec18f25534071f1d52c1203835ba91718b7398cacc-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:25 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d844c2297\x2d3cba\x2d435a\x2d8224\x2da5874c8fc772.mount: Deactivated successfully.
Nov 28 10:06:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v333: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 4.1 KiB/s wr, 86 op/s
Nov 28 10:06:25 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:06:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:06:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:26.032 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e173 e173: 6 total, 6 up, 6 in
Nov 28 10:06:26 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:26.504 2 INFO neutron.agent.securitygroups_rpc [None req-1c752cf9-f4dc-4e37-b8ac-f785450dc01f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:26 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:26.656 2 INFO neutron.agent.securitygroups_rpc [None req-0d9bcbd8-d3ad-4d76-88b4-dcdc400ccf8b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 2277 writes, 22K keys, 2277 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                                           Cumulative WAL: 2277 writes, 2277 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2277 writes, 22K keys, 2277 commit groups, 1.0 writes per commit group, ingest: 36.97 MB, 0.06 MB/s
                                                           Interval WAL: 2277 writes, 2277 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    117.8      0.21              0.06         7    0.030       0      0       0.0       0.0
                                                             L6      1/0   15.86 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.1    149.4    137.0      0.74              0.28         6    0.124     75K   2889       0.0       0.0
                                                            Sum      1/0   15.86 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.1    116.2    132.8      0.96              0.34        13    0.074     75K   2889       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.1    116.5    133.1      0.96              0.34        12    0.080     75K   2889       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    149.4    137.0      0.74              0.28         6    0.124     75K   2889       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    119.0      0.21              0.06         6    0.035       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.024, interval 0.024
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.12 GB write, 0.21 MB/s write, 0.11 GB read, 0.19 MB/s read, 1.0 seconds
                                                           Interval compaction: 0.12 GB write, 0.21 MB/s write, 0.11 GB read, 0.19 MB/s read, 1.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x561ae0707350#2 capacity: 308.00 MB usage: 10.60 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000114 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(535,10.08 MB,3.27364%) FilterBlock(13,233.98 KB,0.0741884%) IndexBlock(13,298.30 KB,0.0945797%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 10:06:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:26.705 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:26 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:26.740 2 INFO neutron.agent.securitygroups_rpc [None req-2fa3b339-652f-411d-b715-041869496ad1 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:26.752 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: pgmap v333: 177 pgs: 177 active+clean; 145 MiB data, 811 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 4.1 KiB/s wr, 86 op/s
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:06:26 np0005538515.localdomain ceph-mon[301134]: osdmap e173: 6 total, 6 up, 6 in
Nov 28 10:06:27 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:27.061 2 INFO neutron.agent.securitygroups_rpc [None req-c1a7ab1c-728b-412f-8d01-654c720e39af 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:27 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:27.257 2 INFO neutron.agent.securitygroups_rpc [None req-35d7c929-00df-4969-aae8-f80246e7ea1b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:27 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:27.573 2 INFO neutron.agent.securitygroups_rpc [None req-c6801d78-6b56-496a-b37f-67c7bc7e4554 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:06:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:06:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:27.625 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:27.749 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v335: 177 pgs: 177 active+clean; 175 MiB data, 819 MiB used, 41 GiB / 42 GiB avail; 4.9 MiB/s rd, 2.0 MiB/s wr, 203 op/s
Nov 28 10:06:27 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:27.833 2 INFO neutron.agent.securitygroups_rpc [None req-70e4eaf3-c859-433d-b609-e53f73e65383 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:27.856 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e174 e174: 6 total, 6 up, 6 in
Nov 28 10:06:28 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:28.324 2 INFO neutron.agent.securitygroups_rpc [None req-8150f43d-22f5-4d4c-88f9-cca24e00dc84 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:06:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:06:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:06:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158154 "" "Go-http-client/1.1"
Nov 28 10:06:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:28.982 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:28 np0005538515.localdomain ceph-mon[301134]: pgmap v335: 177 pgs: 177 active+clean; 175 MiB data, 819 MiB used, 41 GiB / 42 GiB avail; 4.9 MiB/s rd, 2.0 MiB/s wr, 203 op/s
Nov 28 10:06:28 np0005538515.localdomain ceph-mon[301134]: osdmap e174: 6 total, 6 up, 6 in
Nov 28 10:06:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:06:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19671 "" "Go-http-client/1.1"
Nov 28 10:06:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e175 e175: 6 total, 6 up, 6 in
Nov 28 10:06:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:06:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2154361656' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v338: 177 pgs: 177 active+clean; 175 MiB data, 819 MiB used, 41 GiB / 42 GiB avail; 4.5 MiB/s rd, 1.9 MiB/s wr, 189 op/s
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: osdmap e175: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2154361656' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e176 e176: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/885376607' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/885376607' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e177 e177: 6 total, 6 up, 6 in
Nov 28 10:06:30 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:30.319 2 INFO neutron.agent.securitygroups_rpc [None req-61635fc4-7cf2-4d88-91e1-3ec9d744288e 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:30 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:30.345 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:31 np0005538515.localdomain ceph-mon[301134]: pgmap v338: 177 pgs: 177 active+clean; 175 MiB data, 819 MiB used, 41 GiB / 42 GiB avail; 4.5 MiB/s rd, 1.9 MiB/s wr, 189 op/s
Nov 28 10:06:31 np0005538515.localdomain ceph-mon[301134]: osdmap e176: 6 total, 6 up, 6 in
Nov 28 10:06:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/885376607' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/885376607' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:31 np0005538515.localdomain ceph-mon[301134]: osdmap e177: 6 total, 6 up, 6 in
Nov 28 10:06:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v341: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 32 MiB/s wr, 256 op/s
Nov 28 10:06:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e178 e178: 6 total, 6 up, 6 in
Nov 28 10:06:32 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:32.269 2 INFO neutron.agent.securitygroups_rpc [None req-b832b7b5-4ef2-4bcf-bb1b-eacd2f3a21fc e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:32.753 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:33 np0005538515.localdomain ceph-mon[301134]: pgmap v341: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 32 MiB/s wr, 256 op/s
Nov 28 10:06:33 np0005538515.localdomain ceph-mon[301134]: osdmap e178: 6 total, 6 up, 6 in
Nov 28 10:06:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e179 e179: 6 total, 6 up, 6 in
Nov 28 10:06:33 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:33.317 2 INFO neutron.agent.securitygroups_rpc [None req-995edc3e-e8fd-43bb-892b-18b0775677c3 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v344: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 32 MiB/s wr, 256 op/s
Nov 28 10:06:33 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:33.830 2 INFO neutron.agent.securitygroups_rpc [None req-ca5aaf48-8c6f-4efd-a0a9-4566338fc9f9 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:33.986 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:34 np0005538515.localdomain ceph-mon[301134]: osdmap e179: 6 total, 6 up, 6 in
Nov 28 10:06:34 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:34.584 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:09 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b879ef3c-9a06-48a8-9e87-0eac0ec86fcf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6db6a620-dcc3-4cb5-ab27-f70881c20730) old=Port_Binding(mac=['fa:16:3e:a3:db:09 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:34 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:34.586 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6db6a620-dcc3-4cb5-ab27-f70881c20730 in datapath d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3 updated
Nov 28 10:06:34 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:34.588 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:34 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:34.589 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f50453-d186-4f22-970b-dae020d0567a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:34 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:34.616 2 INFO neutron.agent.securitygroups_rpc [None req-c85e903a-7c43-4db3-84d5-12d5f3b5c956 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: pgmap v344: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 32 MiB/s wr, 256 op/s
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3040858317' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e180 e180: 6 total, 6 up, 6 in
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.410804) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395410865, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1972, "num_deletes": 266, "total_data_size": 2677974, "memory_usage": 2724232, "flush_reason": "Manual Compaction"}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395425376, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1738394, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21054, "largest_seqno": 23021, "table_properties": {"data_size": 1730746, "index_size": 4541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16990, "raw_average_key_size": 20, "raw_value_size": 1714900, "raw_average_value_size": 2099, "num_data_blocks": 197, "num_entries": 817, "num_filter_entries": 817, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324288, "oldest_key_time": 1764324288, "file_creation_time": 1764324395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 14628 microseconds, and 5396 cpu microseconds.
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.425431) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1738394 bytes OK
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.425457) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.427721) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.427744) EVENT_LOG_v1 {"time_micros": 1764324395427738, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.427766) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2668824, prev total WAL file size 2668824, number of live WAL files 2.
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.428572) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323735' seq:0, type:0; will stop at (end)
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1697KB)], [30(15MB)]
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395428628, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 18373680, "oldest_snapshot_seqno": -1}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12779 keys, 17853968 bytes, temperature: kUnknown
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395557928, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17853968, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17779239, "index_size": 41668, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32005, "raw_key_size": 341760, "raw_average_key_size": 26, "raw_value_size": 17560025, "raw_average_value_size": 1374, "num_data_blocks": 1585, "num_entries": 12779, "num_filter_entries": 12779, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.558387) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17853968 bytes
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.560916) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.9 rd, 137.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 15.9 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(20.8) write-amplify(10.3) OK, records in: 13326, records dropped: 547 output_compression: NoCompression
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.560956) EVENT_LOG_v1 {"time_micros": 1764324395560937, "job": 16, "event": "compaction_finished", "compaction_time_micros": 129519, "compaction_time_cpu_micros": 51438, "output_level": 6, "num_output_files": 1, "total_output_size": 17853968, "num_input_records": 13326, "num_output_records": 12779, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395561528, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395564622, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.428497) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.564733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.564741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.564744) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.564747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:06:35.564750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:06:35 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:35.797 2 INFO neutron.agent.securitygroups_rpc [None req-8c0c18c9-b8b0-46ef-89c5-259d43510268 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v346: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 23 MiB/s wr, 181 op/s
Nov 28 10:06:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:06:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:06:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:06:35 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:06:36 np0005538515.localdomain podman[317469]: 2025-11-28 10:06:36.010769741 +0000 UTC m=+0.104988381 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 28 10:06:36 np0005538515.localdomain podman[317471]: 2025-11-28 10:06:36.048554188 +0000 UTC m=+0.136428752 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:36 np0005538515.localdomain podman[317471]: 2025-11-28 10:06:36.0521768 +0000 UTC m=+0.140051374 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:36 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:06:36 np0005538515.localdomain podman[317469]: 2025-11-28 10:06:36.072371133 +0000 UTC m=+0.166589773 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 10:06:36 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:06:36 np0005538515.localdomain podman[317470]: 2025-11-28 10:06:36.148593206 +0000 UTC m=+0.240150504 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:06:36 np0005538515.localdomain podman[317472]: 2025-11-28 10:06:36.21446482 +0000 UTC m=+0.299886749 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:06:36 np0005538515.localdomain podman[317472]: 2025-11-28 10:06:36.223733045 +0000 UTC m=+0.309154964 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:06:36 np0005538515.localdomain podman[317470]: 2025-11-28 10:06:36.234669113 +0000 UTC m=+0.326226471 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 28 10:06:36 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:06:36 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:06:36 np0005538515.localdomain ceph-mon[301134]: osdmap e180: 6 total, 6 up, 6 in
Nov 28 10:06:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/662129540' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/662129540' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:36 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:36.608 2 INFO neutron.agent.securitygroups_rpc [None req-9cc8449a-c364-4646-9e4e-de66c7fab687 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 10:06:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:37.098 261346 INFO neutron.agent.linux.ip_lib [None req-17ddab61-d1bc-422c-b5c6-fddc90895032 - - - - - -] Device tap8223ff8a-d8 cannot be used as it has no MAC address
Nov 28 10:06:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:37.123 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538515.localdomain kernel: device tap8223ff8a-d8 entered promiscuous mode
Nov 28 10:06:37 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324397.1326] manager: (tap8223ff8a-d8): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Nov 28 10:06:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:37Z|00155|binding|INFO|Claiming lport 8223ff8a-d864-4e3d-8c9a-3a343402a080 for this chassis.
Nov 28 10:06:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:37Z|00156|binding|INFO|8223ff8a-d864-4e3d-8c9a-3a343402a080: Claiming unknown
Nov 28 10:06:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:37.134 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538515.localdomain systemd-udevd[317561]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:37.146 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-af6f0c29-9935-4fbd-b4e8-6bde23565f73', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af6f0c29-9935-4fbd-b4e8-6bde23565f73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=200f951a-6408-4a85-8a6d-abf60f0f240f, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8223ff8a-d864-4e3d-8c9a-3a343402a080) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:37.148 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8223ff8a-d864-4e3d-8c9a-3a343402a080 in datapath af6f0c29-9935-4fbd-b4e8-6bde23565f73 bound to our chassis
Nov 28 10:06:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:37.150 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network af6f0c29-9935-4fbd-b4e8-6bde23565f73 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:37.151 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[63f90fc4-e98b-4e55-b097-f2d0abb85e1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:37Z|00157|binding|INFO|Setting lport 8223ff8a-d864-4e3d-8c9a-3a343402a080 ovn-installed in OVS
Nov 28 10:06:37 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:37Z|00158|binding|INFO|Setting lport 8223ff8a-d864-4e3d-8c9a-3a343402a080 up in Southbound
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:37.175 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8223ff8a-d8: No such device
Nov 28 10:06:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:37.213 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:37.250 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e181 e181: 6 total, 6 up, 6 in
Nov 28 10:06:37 np0005538515.localdomain ceph-mon[301134]: pgmap v346: 177 pgs: 177 active+clean; 334 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 23 MiB/s wr, 181 op/s
Nov 28 10:06:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:37.761 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v348: 177 pgs: 177 active+clean; 508 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 3.8 MiB/s rd, 26 MiB/s wr, 195 op/s
Nov 28 10:06:38 np0005538515.localdomain podman[317632]: 
Nov 28 10:06:38 np0005538515.localdomain podman[317632]: 2025-11-28 10:06:38.16653744 +0000 UTC m=+0.096019486 container create 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:38 np0005538515.localdomain systemd[1]: Started libpod-conmon-9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51.scope.
Nov 28 10:06:38 np0005538515.localdomain podman[317632]: 2025-11-28 10:06:38.120583601 +0000 UTC m=+0.050065677 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:38 np0005538515.localdomain systemd[1]: tmp-crun.01q0lp.mount: Deactivated successfully.
Nov 28 10:06:38 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:38 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2663c8f4edf79b99518629935207b6fb933c056adfabba625d01c99d738355ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:38 np0005538515.localdomain podman[317632]: 2025-11-28 10:06:38.252939857 +0000 UTC m=+0.182421893 container init 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:06:38 np0005538515.localdomain podman[317632]: 2025-11-28 10:06:38.264994719 +0000 UTC m=+0.194476795 container start 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:06:38 np0005538515.localdomain dnsmasq[317649]: started, version 2.85 cachesize 150
Nov 28 10:06:38 np0005538515.localdomain dnsmasq[317649]: DNS service limited to local subnets
Nov 28 10:06:38 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:38.268 2 INFO neutron.agent.securitygroups_rpc [None req-f5ad8b43-3120-4741-9a0b-dea18e860a97 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:38 np0005538515.localdomain dnsmasq[317649]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:38 np0005538515.localdomain dnsmasq[317649]: warning: no upstream servers configured
Nov 28 10:06:38 np0005538515.localdomain dnsmasq-dhcp[317649]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:38 np0005538515.localdomain dnsmasq[317649]: read /var/lib/neutron/dhcp/af6f0c29-9935-4fbd-b4e8-6bde23565f73/addn_hosts - 0 addresses
Nov 28 10:06:38 np0005538515.localdomain dnsmasq-dhcp[317649]: read /var/lib/neutron/dhcp/af6f0c29-9935-4fbd-b4e8-6bde23565f73/host
Nov 28 10:06:38 np0005538515.localdomain dnsmasq-dhcp[317649]: read /var/lib/neutron/dhcp/af6f0c29-9935-4fbd-b4e8-6bde23565f73/opts
Nov 28 10:06:38 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:38.382 261346 INFO neutron.agent.dhcp.agent [None req-364bf6f8-f136-4f9a-bdbb-5b6503ee835a - - - - - -] DHCP configuration for ports {'a2264f56-a714-4119-a161-0f64b5e5f509'} is completed
Nov 28 10:06:38 np0005538515.localdomain ceph-mon[301134]: osdmap e181: 6 total, 6 up, 6 in
Nov 28 10:06:38 np0005538515.localdomain dnsmasq[317649]: exiting on receipt of SIGTERM
Nov 28 10:06:38 np0005538515.localdomain podman[317667]: 2025-11-28 10:06:38.672449487 +0000 UTC m=+0.059036774 container kill 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 10:06:38 np0005538515.localdomain systemd[1]: libpod-9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51.scope: Deactivated successfully.
Nov 28 10:06:38 np0005538515.localdomain podman[317680]: 2025-11-28 10:06:38.74835554 +0000 UTC m=+0.059440236 container died 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:06:38 np0005538515.localdomain podman[317680]: 2025-11-28 10:06:38.779318176 +0000 UTC m=+0.090402832 container cleanup 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:06:38 np0005538515.localdomain systemd[1]: libpod-conmon-9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51.scope: Deactivated successfully.
Nov 28 10:06:38 np0005538515.localdomain podman[317682]: 2025-11-28 10:06:38.828714401 +0000 UTC m=+0.134917446 container remove 9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-af6f0c29-9935-4fbd-b4e8-6bde23565f73, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:06:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:38.895 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:38 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:38Z|00159|binding|INFO|Releasing lport 8223ff8a-d864-4e3d-8c9a-3a343402a080 from this chassis (sb_readonly=0)
Nov 28 10:06:38 np0005538515.localdomain kernel: device tap8223ff8a-d8 left promiscuous mode
Nov 28 10:06:38 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:38Z|00160|binding|INFO|Setting lport 8223ff8a-d864-4e3d-8c9a-3a343402a080 down in Southbound
Nov 28 10:06:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:38.908 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-af6f0c29-9935-4fbd-b4e8-6bde23565f73', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af6f0c29-9935-4fbd-b4e8-6bde23565f73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=200f951a-6408-4a85-8a6d-abf60f0f240f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8223ff8a-d864-4e3d-8c9a-3a343402a080) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:38.910 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8223ff8a-d864-4e3d-8c9a-3a343402a080 in datapath af6f0c29-9935-4fbd-b4e8-6bde23565f73 unbound from our chassis
Nov 28 10:06:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:38.912 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network af6f0c29-9935-4fbd-b4e8-6bde23565f73 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:06:38 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:38.913 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[71cad528-6272-4d57-9fe0-6e0c8f01dd83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:38.918 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:38.988 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:06:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2663c8f4edf79b99518629935207b6fb933c056adfabba625d01c99d738355ca-merged.mount: Deactivated successfully.
Nov 28 10:06:39 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9843dcb5f0688513d98e565130cb502bba1e44def357e175bb83adf33a558a51-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:39 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:39.202 2 INFO neutron.agent.securitygroups_rpc [None req-cef696eb-a0d2-4ba5-86dc-1a9cdfd33b8c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:39 np0005538515.localdomain podman[317712]: 2025-11-28 10:06:39.245000782 +0000 UTC m=+0.088468523 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:06:39 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2daf6f0c29\x2d9935\x2d4fbd\x2db4e8\x2d6bde23565f73.mount: Deactivated successfully.
Nov 28 10:06:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:39.275 261346 INFO neutron.agent.dhcp.agent [None req-c9a4ee6c-f8d8-4616-bb9c-f10e6ecd9db3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:39.276 261346 INFO neutron.agent.dhcp.agent [None req-c9a4ee6c-f8d8-4616-bb9c-f10e6ecd9db3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:39 np0005538515.localdomain podman[317712]: 2025-11-28 10:06:39.281421826 +0000 UTC m=+0.124889557 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:06:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:39.281 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:39 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:06:39 np0005538515.localdomain ceph-mon[301134]: pgmap v348: 177 pgs: 177 active+clean; 508 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 3.8 MiB/s rd, 26 MiB/s wr, 195 op/s
Nov 28 10:06:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2383859164' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2383859164' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e182 e182: 6 total, 6 up, 6 in
Nov 28 10:06:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:39.608 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v350: 177 pgs: 177 active+clean; 508 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 3.7 MiB/s rd, 25 MiB/s wr, 187 op/s
Nov 28 10:06:40 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:40.091 2 INFO neutron.agent.securitygroups_rpc [None req-11baa0de-2c9c-4582-9ccd-cefaf494809d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/859948979' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/859948979' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:40 np0005538515.localdomain ceph-mon[301134]: osdmap e182: 6 total, 6 up, 6 in
Nov 28 10:06:40 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:40.932 2 INFO neutron.agent.securitygroups_rpc [None req-13572f85-aaed-465a-b457-59f9816ff0f0 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:41 np0005538515.localdomain ceph-mon[301134]: pgmap v350: 177 pgs: 177 active+clean; 508 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 3.7 MiB/s rd, 25 MiB/s wr, 187 op/s
Nov 28 10:06:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/859948979' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/859948979' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e183 e183: 6 total, 6 up, 6 in
Nov 28 10:06:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v352: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 3.7 MiB/s rd, 48 MiB/s wr, 315 op/s
Nov 28 10:06:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:42.252 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:42 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:42.353 2 INFO neutron.agent.securitygroups_rpc [None req-38d8b4a3-75dd-41d2-a0db-a9c73ae0e2bb e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:42 np0005538515.localdomain ceph-mon[301134]: osdmap e183: 6 total, 6 up, 6 in
Nov 28 10:06:42 np0005538515.localdomain ceph-mon[301134]: pgmap v352: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 3.7 MiB/s rd, 48 MiB/s wr, 315 op/s
Nov 28 10:06:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e184 e184: 6 total, 6 up, 6 in
Nov 28 10:06:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:42.764 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:42.768 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:42 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:42.819 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:06:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:06:42 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:42.974 2 INFO neutron.agent.securitygroups_rpc [None req-c8fe2171-fc33-407b-b80c-443549ec2e39 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:42 np0005538515.localdomain podman[317735]: 2025-11-28 10:06:42.984461508 +0000 UTC m=+0.090464324 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:06:43 np0005538515.localdomain podman[317735]: 2025-11-28 10:06:43.031564712 +0000 UTC m=+0.137567588 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:06:43 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:06:43 np0005538515.localdomain ceph-mon[301134]: osdmap e184: 6 total, 6 up, 6 in
Nov 28 10:06:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v354: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 89 KiB/s rd, 23 MiB/s wr, 128 op/s
Nov 28 10:06:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:43.990 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:44.226 261346 INFO neutron.agent.linux.ip_lib [None req-8f1fb084-d397-4674-90ab-1db345b3e56d - - - - - -] Device tapb856d1b5-24 cannot be used as it has no MAC address
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.253 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.258 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:44 np0005538515.localdomain kernel: device tapb856d1b5-24 entered promiscuous mode
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.261 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:44Z|00161|binding|INFO|Claiming lport b856d1b5-24d8-464e-9e6f-00bfee193aae for this chassis.
Nov 28 10:06:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:44Z|00162|binding|INFO|b856d1b5-24d8-464e-9e6f-00bfee193aae: Claiming unknown
Nov 28 10:06:44 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324404.2658] manager: (tapb856d1b5-24): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Nov 28 10:06:44 np0005538515.localdomain systemd-udevd[317763]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:44.274 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-c97d9106-1e14-4509-8744-407acebde871', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c97d9106-1e14-4509-8744-407acebde871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2baca3cd-7ce9-43a6-b4e9-ca22706fa29a, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=b856d1b5-24d8-464e-9e6f-00bfee193aae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:44.276 158530 INFO neutron.agent.ovn.metadata.agent [-] Port b856d1b5-24d8-464e-9e6f-00bfee193aae in datapath c97d9106-1e14-4509-8744-407acebde871 bound to our chassis
Nov 28 10:06:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:44.279 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port cb21ac71-ba5c-494b-b748-80129c041c9a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:06:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:44.279 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c97d9106-1e14-4509-8744-407acebde871, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:44 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:44.280 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c51f299e-cf28-410d-97ff-f982281995bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.315 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:44Z|00163|binding|INFO|Setting lport b856d1b5-24d8-464e-9e6f-00bfee193aae ovn-installed in OVS
Nov 28 10:06:44 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:44Z|00164|binding|INFO|Setting lport b856d1b5-24d8-464e-9e6f-00bfee193aae up in Southbound
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.323 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.356 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:44.386 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:44.715 2 INFO neutron.agent.securitygroups_rpc [None req-f4e28522-faee-4418-9069-18a470f0a6f6 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e185 e185: 6 total, 6 up, 6 in
Nov 28 10:06:44 np0005538515.localdomain ceph-mon[301134]: pgmap v354: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 89 KiB/s rd, 23 MiB/s wr, 128 op/s
Nov 28 10:06:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:45.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:45.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:45.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:06:45 np0005538515.localdomain podman[317819]: 
Nov 28 10:06:45 np0005538515.localdomain podman[317819]: 2025-11-28 10:06:45.36582624 +0000 UTC m=+0.099173513 container create 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e186 e186: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538515.localdomain systemd[1]: Started libpod-conmon-90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108.scope.
Nov 28 10:06:45 np0005538515.localdomain podman[317819]: 2025-11-28 10:06:45.308517891 +0000 UTC m=+0.041865154 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:45 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:45Z|00165|binding|INFO|Removing iface tapb856d1b5-24 ovn-installed in OVS
Nov 28 10:06:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:45.422 158530 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port cb21ac71-ba5c-494b-b748-80129c041c9a with type ""
Nov 28 10:06:45 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:45Z|00166|binding|INFO|Removing lport b856d1b5-24d8-464e-9e6f-00bfee193aae ovn-installed in OVS
Nov 28 10:06:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:45.425 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-c97d9106-1e14-4509-8744-407acebde871', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c97d9106-1e14-4509-8744-407acebde871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2baca3cd-7ce9-43a6-b4e9-ca22706fa29a, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=b856d1b5-24d8-464e-9e6f-00bfee193aae) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:45.425 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:45.427 158530 INFO neutron.agent.ovn.metadata.agent [-] Port b856d1b5-24d8-464e-9e6f-00bfee193aae in datapath c97d9106-1e14-4509-8744-407acebde871 unbound from our chassis
Nov 28 10:06:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:45.430 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c97d9106-1e14-4509-8744-407acebde871, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:45 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:45.431 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[f394d95f-5d41-4c26-b2a5-18e83d68aac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:45 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c5be43be4f18623af7894625e337eeb187d588c0e78f5f686988f8c5118c154/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:45.435 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:45 np0005538515.localdomain podman[317819]: 2025-11-28 10:06:45.443791226 +0000 UTC m=+0.177138479 container init 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:06:45 np0005538515.localdomain podman[317819]: 2025-11-28 10:06:45.451893857 +0000 UTC m=+0.185241110 container start 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:45 np0005538515.localdomain dnsmasq[317837]: started, version 2.85 cachesize 150
Nov 28 10:06:45 np0005538515.localdomain dnsmasq[317837]: DNS service limited to local subnets
Nov 28 10:06:45 np0005538515.localdomain dnsmasq[317837]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:45 np0005538515.localdomain dnsmasq[317837]: warning: no upstream servers configured
Nov 28 10:06:45 np0005538515.localdomain dnsmasq-dhcp[317837]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:45 np0005538515.localdomain dnsmasq[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/addn_hosts - 0 addresses
Nov 28 10:06:45 np0005538515.localdomain dnsmasq-dhcp[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/host
Nov 28 10:06:45 np0005538515.localdomain dnsmasq-dhcp[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/opts
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2385808969' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2385808969' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:45.748 261346 INFO neutron.agent.dhcp.agent [None req-103cbf91-8e88-4f3c-a833-f7875801133c - - - - - -] DHCP configuration for ports {'e05b8b57-3cba-4858-b841-64f615521ed6'} is completed
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: osdmap e185: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: osdmap e186: 6 total, 6 up, 6 in
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2385808969' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2385808969' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v357: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 128 KiB/s rd, 33 MiB/s wr, 185 op/s
Nov 28 10:06:45 np0005538515.localdomain dnsmasq[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/addn_hosts - 0 addresses
Nov 28 10:06:45 np0005538515.localdomain dnsmasq-dhcp[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/host
Nov 28 10:06:45 np0005538515.localdomain dnsmasq-dhcp[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/opts
Nov 28 10:06:45 np0005538515.localdomain podman[317855]: 2025-11-28 10:06:45.965345027 +0000 UTC m=+0.045868698 container kill 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:46.126 2 INFO neutron.agent.securitygroups_rpc [None req-33288b8b-e565-4871-a7ed-f7e6a715772e 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:46.147 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538515.localdomain kernel: device tapb856d1b5-24 left promiscuous mode
Nov 28 10:06:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:46.163 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.170 261346 INFO neutron.agent.dhcp.agent [None req-2cb02d81-5c29-434b-8adc-47b9885c74db - - - - - -] DHCP configuration for ports {'e05b8b57-3cba-4858-b841-64f615521ed6'} is completed
Nov 28 10:06:46 np0005538515.localdomain dnsmasq[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/addn_hosts - 0 addresses
Nov 28 10:06:46 np0005538515.localdomain dnsmasq-dhcp[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/host
Nov 28 10:06:46 np0005538515.localdomain dnsmasq-dhcp[317837]: read /var/lib/neutron/dhcp/c97d9106-1e14-4509-8744-407acebde871/opts
Nov 28 10:06:46 np0005538515.localdomain podman[317894]: 2025-11-28 10:06:46.391513493 +0000 UTC m=+0.064050668 container kill 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent [None req-9120482d-36a1-4d6e-8558-9d87ecbe8bf9 - - - - - -] Unable to reload_allocations dhcp for c97d9106-1e14-4509-8744-407acebde871.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapb856d1b5-24 not found in namespace qdhcp-c97d9106-1e14-4509-8744-407acebde871.
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapb856d1b5-24 not found in namespace qdhcp-c97d9106-1e14-4509-8744-407acebde871.
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.422 261346 ERROR neutron.agent.dhcp.agent 
Nov 28 10:06:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:46.426 261346 INFO neutron.agent.dhcp.agent [None req-ba59b665-bbba-47f6-9efa-63a243c41137 - - - - - -] Synchronizing state
Nov 28 10:06:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:46.586 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:46.770 2 INFO neutron.agent.securitygroups_rpc [None req-92a83df5-6e17-4f34-81bc-c1c1907c0872 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:46 np0005538515.localdomain ceph-mon[301134]: pgmap v357: 177 pgs: 1 active+clean+snaptrim_wait, 2 active+clean+snaptrim, 174 active+clean; 598 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 128 KiB/s rd, 33 MiB/s wr, 185 op/s
Nov 28 10:06:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3198145338' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:06:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3198145338' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:06:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:46.994 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:46.994 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:46 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:46.996 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:06:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:47.090 261346 INFO neutron.agent.dhcp.agent [None req-836082ad-d917-465d-b846-8d8d23285b85 - - - - - -] All active networks have been fetched through RPC.
Nov 28 10:06:47 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:47.168 2 INFO neutron.agent.securitygroups_rpc [None req-09174bbe-f95e-4ff7-9cd4-901486ff4f01 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.260 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:06:47 np0005538515.localdomain podman[317925]: 2025-11-28 10:06:47.263568133 +0000 UTC m=+0.053674678 container kill 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:06:47 np0005538515.localdomain dnsmasq[317837]: exiting on receipt of SIGTERM
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:47 np0005538515.localdomain systemd[1]: libpod-90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108.scope: Deactivated successfully.
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.266 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.290 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.291 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.291 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.292 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.292 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:06:47 np0005538515.localdomain podman[317937]: 2025-11-28 10:06:47.329648513 +0000 UTC m=+0.052939815 container died 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:47 np0005538515.localdomain systemd[1]: tmp-crun.ar2nUr.mount: Deactivated successfully.
Nov 28 10:06:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108-userdata-shm.mount: Deactivated successfully.
Nov 28 10:06:47 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-0c5be43be4f18623af7894625e337eeb187d588c0e78f5f686988f8c5118c154-merged.mount: Deactivated successfully.
Nov 28 10:06:47 np0005538515.localdomain podman[317937]: 2025-11-28 10:06:47.382346609 +0000 UTC m=+0.105637871 container cleanup 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:47 np0005538515.localdomain systemd[1]: libpod-conmon-90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108.scope: Deactivated successfully.
Nov 28 10:06:47 np0005538515.localdomain podman[317939]: 2025-11-28 10:06:47.465402873 +0000 UTC m=+0.180505952 container remove 90d3c2900ca27dbb7e1b18db7709b8400d197bf203912f3ec75ad9fd66320108 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c97d9106-1e14-4509-8744-407acebde871, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:06:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:47.499 261346 INFO neutron.agent.dhcp.agent [-] Starting network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb dhcp configuration
Nov 28 10:06:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:47.499 261346 INFO neutron.agent.dhcp.agent [-] Finished network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb dhcp configuration
Nov 28 10:06:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:47.499 261346 INFO neutron.agent.dhcp.agent [-] Starting network 744b5a82-3c5c-4b41-ba44-527244a209c4 dhcp configuration
Nov 28 10:06:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:47.499 261346 INFO neutron.agent.dhcp.agent [-] Finished network 744b5a82-3c5c-4b41-ba44-527244a209c4 dhcp configuration
Nov 28 10:06:47 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:47.499 261346 INFO neutron.agent.dhcp.agent [None req-ccd65a32-33df-4120-8f5f-c333fd0d9b15 - - - - - -] Synchronizing state complete
Nov 28 10:06:47 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2dc97d9106\x2d1e14\x2d4509\x2d8744\x2d407acebde871.mount: Deactivated successfully.
Nov 28 10:06:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:06:47 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3613923806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.777 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:06:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v358: 177 pgs: 177 active+clean; 633 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 119 KiB/s rd, 21 MiB/s wr, 168 op/s
Nov 28 10:06:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:47.809 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2354762036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:47 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3613923806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.025 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.027 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11511MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.028 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.028 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:48 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:48.347 2 INFO neutron.agent.securitygroups_rpc [None req-5e549f88-7906-4d9c-9524-bb7ac558174f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.410 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.412 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.433 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.463 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.464 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.511 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.576 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:06:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:48.618 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:06:48.624+0000 7fcc87448640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:06:48.624+0000 7fcc87448640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:06:48.624+0000 7fcc87448640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:06:48.624+0000 7fcc87448640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:06:48.624+0000 7fcc87448640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:06:49 np0005538515.localdomain ceph-mon[301134]: pgmap v358: 177 pgs: 177 active+clean; 633 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 119 KiB/s rd, 21 MiB/s wr, 168 op/s
Nov 28 10:06:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1057030524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:49.040 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "format": "json"}]: dispatch
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:06:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:06:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/419078793' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:49.202 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.584s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:06:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:49.209 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:06:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:49.232 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:06:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:49.235 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:06:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:49.235 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v359: 177 pgs: 177 active+clean; 633 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 100 KiB/s rd, 18 MiB/s wr, 142 op/s
Nov 28 10:06:49 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:49.867 2 INFO neutron.agent.securitygroups_rpc [None req-be3aec7b-abf4-4211-97ca-2df83bf1c365 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group rule updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:06:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:49.997 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:06:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:06:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "format": "json"}]: dispatch
Nov 28 10:06:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/419078793' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1129667614' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:50.232 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:50 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:50.319 2 INFO neutron.agent.securitygroups_rpc [None req-500702cc-90b6-4aaa-a265-7d7caae35722 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group rule updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:06:50 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e187 e187: 6 total, 6 up, 6 in
Nov 28 10:06:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:50.850 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:06:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:50.850 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:06:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:50.850 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:06:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:51.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:06:51 np0005538515.localdomain ceph-mon[301134]: pgmap v359: 177 pgs: 177 active+clean; 633 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 100 KiB/s rd, 18 MiB/s wr, 142 op/s
Nov 28 10:06:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/691215413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:51 np0005538515.localdomain ceph-mon[301134]: osdmap e187: 6 total, 6 up, 6 in
Nov 28 10:06:51 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:51.734 2 INFO neutron.agent.securitygroups_rpc [None req-7220764d-588c-486e-8a18-42779bef93d4 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:06:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v361: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 115 KiB/s rd, 34 MiB/s wr, 169 op/s
Nov 28 10:06:52 np0005538515.localdomain ceph-mon[301134]: mgrmap e46: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:06:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:52.811 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:53 np0005538515.localdomain ceph-mon[301134]: pgmap v361: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 115 KiB/s rd, 34 MiB/s wr, 169 op/s
Nov 28 10:06:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v362: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 96 KiB/s rd, 29 MiB/s wr, 142 op/s
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:06:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:54.096 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1555194e-595d-4b14-be99-858d500899d3/.meta.tmp'
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1555194e-595d-4b14-be99-858d500899d3/.meta.tmp' to config b'/volumes/_nogroup/1555194e-595d-4b14-be99-858d500899d3/.meta'
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "format": "json"}]: dispatch
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:06:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:06:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:06:55 np0005538515.localdomain ceph-mon[301134]: pgmap v362: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 96 KiB/s rd, 29 MiB/s wr, 142 op/s
Nov 28 10:06:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:06:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "format": "json"}]: dispatch
Nov 28 10:06:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:06:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v363: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 81 KiB/s rd, 24 MiB/s wr, 119 op/s
Nov 28 10:06:55 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:06:55.879 2 INFO neutron.agent.securitygroups_rpc [None req-3852d4d4-6237-4177-8a94-f6db1e58a4b7 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:06:55 np0005538515.localdomain systemd[1]: tmp-crun.6NLN3X.mount: Deactivated successfully.
Nov 28 10:06:55 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:55.887 261346 INFO neutron.agent.linux.ip_lib [None req-27e4944a-e3ba-46ce-9d0d-ad2f93e66f41 - - - - - -] Device tape1272867-53 cannot be used as it has no MAC address
Nov 28 10:06:55 np0005538515.localdomain podman[318023]: 2025-11-28 10:06:55.896906573 +0000 UTC m=+0.114254109 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Nov 28 10:06:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:55.929 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:55 np0005538515.localdomain kernel: device tape1272867-53 entered promiscuous mode
Nov 28 10:06:55 np0005538515.localdomain podman[318023]: 2025-11-28 10:06:55.936159734 +0000 UTC m=+0.153507210 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 28 10:06:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:55Z|00167|binding|INFO|Claiming lport e1272867-532b-4f64-b1d3-8e10c12195a2 for this chassis.
Nov 28 10:06:55 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324415.9398] manager: (tape1272867-53): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Nov 28 10:06:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:55Z|00168|binding|INFO|e1272867-532b-4f64-b1d3-8e10c12195a2: Claiming unknown
Nov 28 10:06:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:55.938 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:55 np0005538515.localdomain systemd-udevd[318050]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:06:55 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:55Z|00169|binding|INFO|Setting lport e1272867-532b-4f64-b1d3-8e10c12195a2 ovn-installed in OVS
Nov 28 10:06:55 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:55.970 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:55 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:55 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:55 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:06:55 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:55 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:55 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:56 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:56 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tape1272867-53: No such device
Nov 28 10:06:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:56.016 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:56.042 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:56 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:06:56Z|00170|binding|INFO|Setting lport e1272867-532b-4f64-b1d3-8e10c12195a2 up in Southbound
Nov 28 10:06:56 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:56.156 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-5d55e4ac-ff66-4512-9e4c-487c005fe37c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d55e4ac-ff66-4512-9e4c-487c005fe37c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=211a2e3d-b3b5-43ed-95be-d9132daa612f, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=e1272867-532b-4f64-b1d3-8e10c12195a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:06:56 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:56.158 158530 INFO neutron.agent.ovn.metadata.agent [-] Port e1272867-532b-4f64-b1d3-8e10c12195a2 in datapath 5d55e4ac-ff66-4512-9e4c-487c005fe37c bound to our chassis
Nov 28 10:06:56 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:56.161 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port db635ae2-689f-4d6e-a9b9-1757ca00c3a5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:06:56 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:56.161 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d55e4ac-ff66-4512-9e4c-487c005fe37c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:06:56 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:06:56.162 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[50fa527b-a47e-4b9d-96a9-636a749b6623]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:06:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:06:57 np0005538515.localdomain ceph-mon[301134]: pgmap v363: 177 pgs: 177 active+clean; 745 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 81 KiB/s rd, 24 MiB/s wr, 119 op/s
Nov 28 10:06:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1390478790' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:06:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:06:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:06:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v364: 177 pgs: 177 active+clean; 865 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 36 op/s
Nov 28 10:06:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:57.828 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:06:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:06:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:06:58 np0005538515.localdomain podman[318121]: 
Nov 28 10:06:58 np0005538515.localdomain podman[318121]: 2025-11-28 10:06:58.63932373 +0000 UTC m=+0.129729156 container create 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 10:06:58 np0005538515.localdomain podman[318121]: 2025-11-28 10:06:58.55087604 +0000 UTC m=+0.041281496 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:06:58 np0005538515.localdomain systemd[1]: Started libpod-conmon-6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792.scope.
Nov 28 10:06:58 np0005538515.localdomain systemd[1]: tmp-crun.KYZd7U.mount: Deactivated successfully.
Nov 28 10:06:58 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:06:58 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ce0eb26f941760bc165071691d15f2ff2f716526f20d4e0a559e47f82947841/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:06:58 np0005538515.localdomain podman[318121]: 2025-11-28 10:06:58.720259029 +0000 UTC m=+0.210664445 container init 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:06:58 np0005538515.localdomain podman[318121]: 2025-11-28 10:06:58.732409034 +0000 UTC m=+0.222814450 container start 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:06:58 np0005538515.localdomain dnsmasq[318139]: started, version 2.85 cachesize 150
Nov 28 10:06:58 np0005538515.localdomain dnsmasq[318139]: DNS service limited to local subnets
Nov 28 10:06:58 np0005538515.localdomain dnsmasq[318139]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:06:58 np0005538515.localdomain dnsmasq[318139]: warning: no upstream servers configured
Nov 28 10:06:58 np0005538515.localdomain dnsmasq-dhcp[318139]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:06:58 np0005538515.localdomain dnsmasq[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/addn_hosts - 0 addresses
Nov 28 10:06:58 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/host
Nov 28 10:06:58 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/opts
Nov 28 10:06:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:58.793 261346 INFO neutron.agent.dhcp.agent [None req-063925d6-8b30-43b7-98cc-07bf1eb3d373 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:55Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce40f1c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce40f130>], id=c60962a6-b068-4e59-9a3d-1385700e4916, ip_allocation=immediate, mac_address=fa:16:3e:77:98:c0, name=tempest-PortsTestJSON-183814121, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:51Z, description=, dns_domain=, id=5d55e4ac-ff66-4512-9e4c-487c005fe37c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1418877084, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19323, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2432, status=ACTIVE, subnets=['f0db51dd-f2de-43a4-beee-2ae93e9b7dfe'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:53Z, vlan_transparent=None, network_id=5d55e4ac-ff66-4512-9e4c-487c005fe37c, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2439, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:55Z on network 5d55e4ac-ff66-4512-9e4c-487c005fe37c
Nov 28 10:06:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:06:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:06:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:06:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159978 "" "Go-http-client/1.1"
Nov 28 10:06:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:06:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20149 "" "Go-http-client/1.1"
Nov 28 10:06:59 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:06:59.092 261346 INFO neutron.agent.dhcp.agent [None req-03e9dfeb-0339-4c4f-8eea-327a3f72d469 - - - - - -] DHCP configuration for ports {'21912150-8f04-412e-8741-004f8151ee9a'} is completed
Nov 28 10:06:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:06:59.133 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:06:59 np0005538515.localdomain dnsmasq[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/addn_hosts - 1 addresses
Nov 28 10:06:59 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/host
Nov 28 10:06:59 np0005538515.localdomain podman[318157]: 2025-11-28 10:06:59.268814432 +0000 UTC m=+0.065415040 container kill 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:06:59 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/opts
Nov 28 10:06:59 np0005538515.localdomain ceph-mon[301134]: pgmap v364: 177 pgs: 177 active+clean; 865 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 36 op/s
Nov 28 10:06:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:06:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v365: 177 pgs: 177 active+clean; 865 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 36 op/s
Nov 28 10:07:01 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:01.435 261346 INFO neutron.agent.dhcp.agent [None req-e920b98c-c2f6-4088-abca-741f8898d45f - - - - - -] DHCP configuration for ports {'c60962a6-b068-4e59-9a3d-1385700e4916'} is completed
Nov 28 10:07:01 np0005538515.localdomain ceph-mon[301134]: pgmap v365: 177 pgs: 177 active+clean; 865 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 36 op/s
Nov 28 10:07:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2257434953' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2257434953' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v366: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 52 KiB/s rd, 33 MiB/s wr, 87 op/s
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1555194e-595d-4b14-be99-858d500899d3", "format": "json"}]: dispatch
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1555194e-595d-4b14-be99-858d500899d3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1555194e-595d-4b14-be99-858d500899d3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.066+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1555194e-595d-4b14-be99-858d500899d3' of type subvolume
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1555194e-595d-4b14-be99-858d500899d3' of type subvolume
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1555194e-595d-4b14-be99-858d500899d3'' moved to trashcan
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1555194e-595d-4b14-be99-858d500899d3, vol_name:cephfs) < ""
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.090+0000 7fcc8a44e640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.090+0000 7fcc8a44e640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.090+0000 7fcc8a44e640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.090+0000 7fcc8a44e640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.090+0000 7fcc8a44e640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.123+0000 7fcc8944c640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.123+0000 7fcc8944c640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.123+0000 7fcc8944c640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.123+0000 7fcc8944c640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:02.123+0000 7fcc8944c640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:07:02 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:02.159 261346 INFO neutron.agent.linux.ip_lib [None req-d3b8a2e5-4327-44cd-b9f2-6ad11454dc94 - - - - - -] Device tap8bc6a73d-61 cannot be used as it has no MAC address
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.183 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:02 np0005538515.localdomain kernel: device tap8bc6a73d-61 entered promiscuous mode
Nov 28 10:07:02 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324422.1903] manager: (tap8bc6a73d-61): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.192 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:02 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:02Z|00171|binding|INFO|Claiming lport 8bc6a73d-610f-4f06-b515-26f3efcf46a4 for this chassis.
Nov 28 10:07:02 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:02Z|00172|binding|INFO|8bc6a73d-610f-4f06-b515-26f3efcf46a4: Claiming unknown
Nov 28 10:07:02 np0005538515.localdomain systemd-udevd[318211]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:02.219 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8462a4a9a313405e8fd212f9ec4a0c92', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe1e7b19-836c-4f4d-9811-92d20be8712f, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8bc6a73d-610f-4f06-b515-26f3efcf46a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:02 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:02.221 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8bc6a73d-610f-4f06-b515-26f3efcf46a4 in datapath 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb bound to our chassis
Nov 28 10:07:02 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:02.223 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:02 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:02.224 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[cafa7bf9-e1ae-4b79-8348-643354c8bcea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.235 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:02Z|00173|binding|INFO|Setting lport 8bc6a73d-610f-4f06-b515-26f3efcf46a4 ovn-installed in OVS
Nov 28 10:07:02 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:02Z|00174|binding|INFO|Setting lport 8bc6a73d-610f-4f06-b515-26f3efcf46a4 up in Southbound
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.240 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap8bc6a73d-61: No such device
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.272 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.296 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:02.876 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:03.086 2 INFO neutron.agent.securitygroups_rpc [req-e18f2dde-6bdd-4008-97a4-be84187f4807 req-3c6747af-afe3-40df-86cf-89416982a794 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group member updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:07:03 np0005538515.localdomain podman[318282]: 
Nov 28 10:07:03 np0005538515.localdomain podman[318282]: 2025-11-28 10:07:03.182258328 +0000 UTC m=+0.096728536 container create 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:03.185 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:02Z, description=, device_id=640e688c-f2ca-49b5-a84f-ca1ea976a9cd, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3ebb80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3eb250>], id=79663a4e-2979-44db-bdea-40e4855cb323, ip_allocation=immediate, mac_address=fa:16:3e:be:70:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:04Z, description=, dns_domain=, id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-1354719988-network, port_security_enabled=True, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=64958, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2202, status=ACTIVE, subnets=['f4b4dc5d-f654-46e4-8ff2-bd52eff10306'], tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:06:06Z, vlan_transparent=None, network_id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, port_security_enabled=True, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c52603b5-5f47-4123-b8fe-cc9f0a56d914'], standard_attr_id=2457, status=DOWN, tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:07:02Z on network b2c4ac07-8851-40d3-9495-d0489b67c4c3
Nov 28 10:07:03 np0005538515.localdomain systemd[1]: Started libpod-conmon-0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184.scope.
Nov 28 10:07:03 np0005538515.localdomain podman[318282]: 2025-11-28 10:07:03.137675102 +0000 UTC m=+0.052145330 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:03 np0005538515.localdomain systemd[1]: tmp-crun.EpHQew.mount: Deactivated successfully.
Nov 28 10:07:03 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:03 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b30c06b143ece65045ee680ac009c0929cadf7b730be6fdcddc88c7afdc918bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:03 np0005538515.localdomain podman[318282]: 2025-11-28 10:07:03.264942951 +0000 UTC m=+0.179413159 container init 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:03 np0005538515.localdomain podman[318282]: 2025-11-28 10:07:03.274794935 +0000 UTC m=+0.189265143 container start 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:07:03 np0005538515.localdomain dnsmasq[318300]: started, version 2.85 cachesize 150
Nov 28 10:07:03 np0005538515.localdomain dnsmasq[318300]: DNS service limited to local subnets
Nov 28 10:07:03 np0005538515.localdomain dnsmasq[318300]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:03 np0005538515.localdomain dnsmasq[318300]: warning: no upstream servers configured
Nov 28 10:07:03 np0005538515.localdomain dnsmasq-dhcp[318300]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:07:03 np0005538515.localdomain dnsmasq[318300]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 0 addresses
Nov 28 10:07:03 np0005538515.localdomain dnsmasq-dhcp[318300]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:03 np0005538515.localdomain dnsmasq-dhcp[318300]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:03 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 2 addresses
Nov 28 10:07:03 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:07:03 np0005538515.localdomain podman[318316]: 2025-11-28 10:07:03.452940445 +0000 UTC m=+0.058234409 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:03 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:07:03 np0005538515.localdomain ceph-mon[301134]: pgmap v366: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 52 KiB/s rd, 33 MiB/s wr, 87 op/s
Nov 28 10:07:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1555194e-595d-4b14-be99-858d500899d3", "format": "json"}]: dispatch
Nov 28 10:07:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1555194e-595d-4b14-be99-858d500899d3", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:03 np0005538515.localdomain ceph-mon[301134]: mgrmap e47: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:07:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4053547237' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4053547237' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:03.730 261346 INFO neutron.agent.dhcp.agent [None req-f3624154-5d96-47e2-aec1-9eb16f53c2cb - - - - - -] DHCP configuration for ports {'e18fca2e-eaeb-40cf-9eb1-203ecf5b0aa2', '8646aa95-6463-44cd-8c34-1bec1705e23b'} is completed
Nov 28 10:07:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v367: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 22 MiB/s wr, 67 op/s
Nov 28 10:07:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:03.897 261346 INFO neutron.agent.dhcp.agent [None req-00389324-b119-48e0-8aff-a52530bd8d2a - - - - - -] DHCP configuration for ports {'79663a4e-2979-44db-bdea-40e4855cb323'} is completed
Nov 28 10:07:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:04.180 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:04.544 2 INFO neutron.agent.securitygroups_rpc [None req-ae7d6dfb-c119-45e0-bfec-235340ad22c9 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['19d31bf3-ea7b-49ec-820d-ba3fe5752e88']
Nov 28 10:07:04 np0005538515.localdomain ceph-mon[301134]: pgmap v367: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 22 MiB/s wr, 67 op/s
Nov 28 10:07:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:04.877 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:04Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3c2370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3c2a90>], id=a9243b0d-5608-4dd1-bf07-987690272773, ip_allocation=immediate, mac_address=fa:16:3e:3a:d2:78, name=tempest-PortsIpV6TestJSON-388426784, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:07Z, description=, dns_domain=, id=719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-1946008216, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49828, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2226, status=ACTIVE, subnets=['5d4879f5-1341-4004-892c-8f9038b89398'], tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:56Z, vlan_transparent=None, network_id=719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['19d31bf3-ea7b-49ec-820d-ba3fe5752e88'], standard_attr_id=2462, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:07:04Z on network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb
Nov 28 10:07:05 np0005538515.localdomain podman[318354]: 2025-11-28 10:07:05.058011523 +0000 UTC m=+0.047909550 container kill 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:07:05 np0005538515.localdomain dnsmasq[318300]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 1 addresses
Nov 28 10:07:05 np0005538515.localdomain dnsmasq-dhcp[318300]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:05 np0005538515.localdomain dnsmasq-dhcp[318300]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:05 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:05.545 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005538514.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:02Z, description=, device_id=640e688c-f2ca-49b5-a84f-ca1ea976a9cd, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3c4af0>], dns_domain=, dns_name=tempest-volumesbackupstest-instance-1114210035, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3c4430>], id=79663a4e-2979-44db-bdea-40e4855cb323, ip_allocation=immediate, mac_address=fa:16:3e:be:70:a6, name=, network_id=b2c4ac07-8851-40d3-9495-d0489b67c4c3, port_security_enabled=True, project_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c52603b5-5f47-4123-b8fe-cc9f0a56d914'], standard_attr_id=2457, status=DOWN, tags=[], tenant_id=d3c0d1ce8d854a7b9ffc953e88cd2c44, updated_at=2025-11-28T10:07:04Z on network b2c4ac07-8851-40d3-9495-d0489b67c4c3
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:07:05
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_data', 'vms', 'images', 'backups', 'manila_metadata', 'volumes', '.mgr']
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:07:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:07:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4069209835' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:07:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4069209835' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:05 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:05.670 261346 INFO neutron.agent.dhcp.agent [None req-3a4efc61-bf3f-4e53-bf51-d66fd999eedf - - - - - -] DHCP configuration for ports {'a9243b0d-5608-4dd1-bf07-987690272773'} is completed
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:07:05 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 2 addresses
Nov 28 10:07:05 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:07:05 np0005538515.localdomain podman[318391]: 2025-11-28 10:07:05.79067727 +0000 UTC m=+0.061367585 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:05 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v368: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 22 MiB/s wr, 67 op/s
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.004811110674902289 of space, bias 1.0, pg target 0.9622221349804577 quantized to 32 (current 32)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 3.271566164154104e-06 of space, bias 1.0, pg target 0.0006521321887213847 quantized to 32 (current 32)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.06404308870742602 of space, bias 1.0, pg target 12.765922349013586 quantized to 32 (current 32)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.089102922017495e-05 quantized to 32 (current 32)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 5.452610273590173e-07 of space, bias 1.0, pg target 0.0001017820584403499 quantized to 32 (current 32)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 6.543132328308208e-06 of space, bias 4.0, pg target 0.004885538805136795 quantized to 16 (current 16)
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:07:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:07:06 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4069209835' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:06 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4069209835' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:06 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:06.613 261346 INFO neutron.agent.dhcp.agent [None req-e8bfdb4d-0326-4c55-ab5f-ac0c01276607 - - - - - -] DHCP configuration for ports {'79663a4e-2979-44db-bdea-40e4855cb323'} is completed
Nov 28 10:07:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:07:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:07:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:07:06 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:07:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:06 np0005538515.localdomain systemd[1]: tmp-crun.ZCIpWg.mount: Deactivated successfully.
Nov 28 10:07:07 np0005538515.localdomain podman[318414]: 2025-11-28 10:07:07.004237242 +0000 UTC m=+0.095093646 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:07 np0005538515.localdomain podman[318414]: 2025-11-28 10:07:07.034410003 +0000 UTC m=+0.125266397 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:07 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:07:07 np0005538515.localdomain podman[318412]: 2025-11-28 10:07:07.054132793 +0000 UTC m=+0.152794698 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:07 np0005538515.localdomain podman[318413]: 2025-11-28 10:07:07.100547585 +0000 UTC m=+0.197097975 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:07:07 np0005538515.localdomain podman[318412]: 2025-11-28 10:07:07.170946508 +0000 UTC m=+0.269608483 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f30195cf-a779-4b65-9774-df0ab49a62cf/.meta.tmp'
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f30195cf-a779-4b65-9774-df0ab49a62cf/.meta.tmp' to config b'/volumes/_nogroup/f30195cf-a779-4b65-9774-df0ab49a62cf/.meta'
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "format": "json"}]: dispatch
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:07 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:07 np0005538515.localdomain podman[318413]: 2025-11-28 10:07:07.186995214 +0000 UTC m=+0.283545554 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 10:07:07 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:07:07 np0005538515.localdomain podman[318415]: 2025-11-28 10:07:07.261267756 +0000 UTC m=+0.349296603 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:07:07 np0005538515.localdomain podman[318415]: 2025-11-28 10:07:07.296240536 +0000 UTC m=+0.384269373 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:07:07 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:07:07 np0005538515.localdomain ceph-mon[301134]: pgmap v368: 177 pgs: 177 active+clean; 1.0 GiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 41 KiB/s rd, 22 MiB/s wr, 67 op/s
Nov 28 10:07:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:07 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:07.526 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:55Z, description=, device_id=113746df-806c-4ec0-9f15-ab9153798c56, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce421040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce421400>], id=c60962a6-b068-4e59-9a3d-1385700e4916, ip_allocation=immediate, mac_address=fa:16:3e:77:98:c0, name=tempest-PortsTestJSON-183814121, network_id=5d55e4ac-ff66-4512-9e4c-487c005fe37c, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2439, status=ACTIVE, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:03Z on network 5d55e4ac-ff66-4512-9e4c-487c005fe37c
Nov 28 10:07:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v369: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 67 KiB/s rd, 32 MiB/s wr, 106 op/s
Nov 28 10:07:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:07.926 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:08 np0005538515.localdomain dnsmasq[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/addn_hosts - 1 addresses
Nov 28 10:07:08 np0005538515.localdomain podman[318510]: 2025-11-28 10:07:08.013319342 +0000 UTC m=+0.061012424 container kill 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:07:08 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/host
Nov 28 10:07:08 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/opts
Nov 28 10:07:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "format": "json"}]: dispatch
Nov 28 10:07:08 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:08.545 261346 INFO neutron.agent.dhcp.agent [None req-d7f2dc29-1cb1-460b-a486-d3f02a5c966a - - - - - -] DHCP configuration for ports {'c60962a6-b068-4e59-9a3d-1385700e4916'} is completed
Nov 28 10:07:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:09.223 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:09 np0005538515.localdomain podman[318550]: 2025-11-28 10:07:09.246087137 +0000 UTC m=+0.063829021 container kill 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:07:09 np0005538515.localdomain dnsmasq[318300]: exiting on receipt of SIGTERM
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: tmp-crun.dPMdfv.mount: Deactivated successfully.
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: libpod-0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184.scope: Deactivated successfully.
Nov 28 10:07:09 np0005538515.localdomain podman[318563]: 2025-11-28 10:07:09.302517849 +0000 UTC m=+0.042280936 container died 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-b30c06b143ece65045ee680ac009c0929cadf7b730be6fdcddc88c7afdc918bb-merged.mount: Deactivated successfully.
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:09 np0005538515.localdomain podman[318563]: 2025-11-28 10:07:09.345292079 +0000 UTC m=+0.085055156 container remove 0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: libpod-conmon-0e83d0c2461a254042e294a6421feea4104273067078a5ad2e14050a33c19184.scope: Deactivated successfully.
Nov 28 10:07:09 np0005538515.localdomain podman[318591]: 2025-11-28 10:07:09.414236948 +0000 UTC m=+0.066735901 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:07:09 np0005538515.localdomain podman[318591]: 2025-11-28 10:07:09.421556014 +0000 UTC m=+0.074054957 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:07:09 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:07:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e188 e188: 6 total, 6 up, 6 in
Nov 28 10:07:09 np0005538515.localdomain ceph-mon[301134]: pgmap v369: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 67 KiB/s rd, 32 MiB/s wr, 106 op/s
Nov 28 10:07:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v371: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 72 KiB/s rd, 26 MiB/s wr, 109 op/s
Nov 28 10:07:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:07:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:10 np0005538515.localdomain ceph-mon[301134]: osdmap e188: 6 total, 6 up, 6 in
Nov 28 10:07:10 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3389479421' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:10 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1134720533' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:10.641 2 INFO neutron.agent.securitygroups_rpc [None req-570a0175-8080-4b40-9e80-d9942b63779e e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['c81397c2-33ee-481d-8257-b39c2b0c331e', '19d31bf3-ea7b-49ec-820d-ba3fe5752e88']
Nov 28 10:07:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:10.645 2 INFO neutron.agent.securitygroups_rpc [None req-22769ed5-ed71-4ef8-ab49-99801270d0d3 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:07:11 np0005538515.localdomain ceph-mon[301134]: pgmap v371: 177 pgs: 177 active+clean; 1.1 GiB data, 3.7 GiB used, 38 GiB / 42 GiB avail; 72 KiB/s rd, 26 MiB/s wr, 109 op/s
Nov 28 10:07:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:07:11 np0005538515.localdomain dnsmasq[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/addn_hosts - 0 addresses
Nov 28 10:07:11 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/host
Nov 28 10:07:11 np0005538515.localdomain dnsmasq-dhcp[318139]: read /var/lib/neutron/dhcp/5d55e4ac-ff66-4512-9e4c-487c005fe37c/opts
Nov 28 10:07:11 np0005538515.localdomain podman[318657]: 2025-11-28 10:07:11.572365189 +0000 UTC m=+0.059435496 container kill 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e189 e189: 6 total, 6 up, 6 in
Nov 28 10:07:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v373: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 65 KiB/s rd, 22 MiB/s wr, 96 op/s
Nov 28 10:07:11 np0005538515.localdomain podman[318704]: 2025-11-28 10:07:11.993032475 +0000 UTC m=+0.091135385 container create 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:07:12 np0005538515.localdomain systemd[1]: Started libpod-conmon-9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b.scope.
Nov 28 10:07:12 np0005538515.localdomain podman[318704]: 2025-11-28 10:07:11.944204537 +0000 UTC m=+0.042307477 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:12 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:12 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd62fdf166d581c6d2b3ef9c0fe541584f5effaa7812f4badcc3cc24df395cfc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:12 np0005538515.localdomain podman[318704]: 2025-11-28 10:07:12.058892408 +0000 UTC m=+0.156995318 container init 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:12 np0005538515.localdomain podman[318704]: 2025-11-28 10:07:12.073618182 +0000 UTC m=+0.171721092 container start 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:12 np0005538515.localdomain dnsmasq[318722]: started, version 2.85 cachesize 150
Nov 28 10:07:12 np0005538515.localdomain dnsmasq[318722]: DNS service limited to local subnets
Nov 28 10:07:12 np0005538515.localdomain dnsmasq[318722]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:12 np0005538515.localdomain dnsmasq[318722]: warning: no upstream servers configured
Nov 28 10:07:12 np0005538515.localdomain dnsmasq-dhcp[318722]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:07:12 np0005538515.localdomain dnsmasq-dhcp[318722]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:07:12 np0005538515.localdomain dnsmasq[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 1 addresses
Nov 28 10:07:12 np0005538515.localdomain dnsmasq-dhcp[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:12 np0005538515.localdomain dnsmasq-dhcp[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:12 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:12.142 261346 INFO neutron.agent.dhcp.agent [None req-588228b2-f577-44aa-80f1-e24f139f4962 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:04Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3d0850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3d0b20>], id=a9243b0d-5608-4dd1-bf07-987690272773, ip_allocation=immediate, mac_address=fa:16:3e:3a:d2:78, name=tempest-PortsIpV6TestJSON-635648094, network_id=719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c81397c2-33ee-481d-8257-b39c2b0c331e'], standard_attr_id=2462, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:07:10Z on network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb
Nov 28 10:07:12 np0005538515.localdomain dnsmasq[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 1 addresses
Nov 28 10:07:12 np0005538515.localdomain dnsmasq-dhcp[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:12 np0005538515.localdomain dnsmasq-dhcp[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:12 np0005538515.localdomain podman[318740]: 2025-11-28 10:07:12.346728394 +0000 UTC m=+0.075879564 container kill 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:12 np0005538515.localdomain systemd[1]: tmp-crun.lENMz9.mount: Deactivated successfully.
Nov 28 10:07:12 np0005538515.localdomain ceph-mon[301134]: osdmap e189: 6 total, 6 up, 6 in
Nov 28 10:07:12 np0005538515.localdomain ceph-mon[301134]: pgmap v373: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 65 KiB/s rd, 22 MiB/s wr, 96 op/s
Nov 28 10:07:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:12.629 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:12 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:12Z|00175|binding|INFO|Releasing lport e1272867-532b-4f64-b1d3-8e10c12195a2 from this chassis (sb_readonly=0)
Nov 28 10:07:12 np0005538515.localdomain kernel: device tape1272867-53 left promiscuous mode
Nov 28 10:07:12 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:12Z|00176|binding|INFO|Setting lport e1272867-532b-4f64-b1d3-8e10c12195a2 down in Southbound
Nov 28 10:07:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:12.653 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:12 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:12.738 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-5d55e4ac-ff66-4512-9e4c-487c005fe37c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d55e4ac-ff66-4512-9e4c-487c005fe37c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=211a2e3d-b3b5-43ed-95be-d9132daa612f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=e1272867-532b-4f64-b1d3-8e10c12195a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:12 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:12.741 158530 INFO neutron.agent.ovn.metadata.agent [-] Port e1272867-532b-4f64-b1d3-8e10c12195a2 in datapath 5d55e4ac-ff66-4512-9e4c-487c005fe37c unbound from our chassis
Nov 28 10:07:12 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:12.745 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d55e4ac-ff66-4512-9e4c-487c005fe37c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:12 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:12.746 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[ceef5364-0cfe-4a2a-9c34-05112c33e1dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:12.971 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:13 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:13.158 261346 INFO neutron.agent.dhcp.agent [None req-a84629f1-b95d-4f9e-9f6d-da6d6f22d09c - - - - - -] DHCP configuration for ports {'e18fca2e-eaeb-40cf-9eb1-203ecf5b0aa2', 'a9243b0d-5608-4dd1-bf07-987690272773', '8646aa95-6463-44cd-8c34-1bec1705e23b', '8bc6a73d-610f-4f06-b515-26f3efcf46a4'} is completed
Nov 28 10:07:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:07:13 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:13.586 261346 INFO neutron.agent.dhcp.agent [None req-b051f84e-683e-4599-af18-b02dd6300d11 - - - - - -] DHCP configuration for ports {'a9243b0d-5608-4dd1-bf07-987690272773'} is completed
Nov 28 10:07:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1541328209' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1541328209' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:13 np0005538515.localdomain podman[318764]: 2025-11-28 10:07:13.636220429 +0000 UTC m=+0.082933090 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "format": "json"}]: dispatch
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f30195cf-a779-4b65-9774-df0ab49a62cf, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f30195cf-a779-4b65-9774-df0ab49a62cf, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f30195cf-a779-4b65-9774-df0ab49a62cf' of type subvolume
Nov 28 10:07:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:13.656+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f30195cf-a779-4b65-9774-df0ab49a62cf' of type subvolume
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f30195cf-a779-4b65-9774-df0ab49a62cf'' moved to trashcan
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f30195cf-a779-4b65-9774-df0ab49a62cf, vol_name:cephfs) < ""
Nov 28 10:07:13 np0005538515.localdomain podman[318764]: 2025-11-28 10:07:13.679552238 +0000 UTC m=+0.126264909 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true)
Nov 28 10:07:13 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:07:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v374: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 65 KiB/s rd, 22 MiB/s wr, 96 op/s
Nov 28 10:07:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:07:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2355630358' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:07:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2355630358' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:14.260 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:14 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:14.565 2 INFO neutron.agent.securitygroups_rpc [None req-94433583-aa1a-4467-b851-fbd6872bea34 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['c81397c2-33ee-481d-8257-b39c2b0c331e']
Nov 28 10:07:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "format": "json"}]: dispatch
Nov 28 10:07:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f30195cf-a779-4b65-9774-df0ab49a62cf", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:14 np0005538515.localdomain ceph-mon[301134]: pgmap v374: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 65 KiB/s rd, 22 MiB/s wr, 96 op/s
Nov 28 10:07:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2355630358' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2355630358' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:14 np0005538515.localdomain dnsmasq[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 0 addresses
Nov 28 10:07:14 np0005538515.localdomain podman[318800]: 2025-11-28 10:07:14.928865724 +0000 UTC m=+0.054183645 container kill 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:07:14 np0005538515.localdomain dnsmasq-dhcp[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:14 np0005538515.localdomain dnsmasq-dhcp[318722]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v375: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 26 KiB/s rd, 8.0 MiB/s wr, 38 op/s
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.445449) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436445496, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 945, "num_deletes": 255, "total_data_size": 1826301, "memory_usage": 1847528, "flush_reason": "Manual Compaction"}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436455041, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 1203374, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23026, "largest_seqno": 23966, "table_properties": {"data_size": 1199158, "index_size": 1879, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10649, "raw_average_key_size": 21, "raw_value_size": 1190251, "raw_average_value_size": 2352, "num_data_blocks": 82, "num_entries": 506, "num_filter_entries": 506, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324396, "oldest_key_time": 1764324396, "file_creation_time": 1764324436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 9675 microseconds, and 3868 cpu microseconds.
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.455122) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 1203374 bytes OK
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.455144) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.457172) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.457192) EVENT_LOG_v1 {"time_micros": 1764324436457186, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.457210) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 1821363, prev total WAL file size 1821363, number of live WAL files 2.
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.457880) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1175KB)], [33(17MB)]
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436457926, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 19057342, "oldest_snapshot_seqno": -1}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12756 keys, 17845285 bytes, temperature: kUnknown
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436603568, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 17845285, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17771154, "index_size": 41134, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31941, "raw_key_size": 342052, "raw_average_key_size": 26, "raw_value_size": 17552659, "raw_average_value_size": 1376, "num_data_blocks": 1556, "num_entries": 12756, "num_filter_entries": 12756, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.604183) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 17845285 bytes
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.605751) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.5 rd, 122.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 17.0 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(30.7) write-amplify(14.8) OK, records in: 13285, records dropped: 529 output_compression: NoCompression
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.605771) EVENT_LOG_v1 {"time_micros": 1764324436605762, "job": 18, "event": "compaction_finished", "compaction_time_micros": 145992, "compaction_time_cpu_micros": 49683, "output_level": 6, "num_output_files": 1, "total_output_size": 17845285, "num_input_records": 13285, "num_output_records": 12756, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436605985, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436608026, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.457800) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.608138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.608148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.608152) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.608156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:07:16.608160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:07:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:16 np0005538515.localdomain dnsmasq[318139]: exiting on receipt of SIGTERM
Nov 28 10:07:16 np0005538515.localdomain podman[318838]: 2025-11-28 10:07:16.842517757 +0000 UTC m=+0.096922703 container kill 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:07:16 np0005538515.localdomain systemd[1]: libpod-6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792.scope: Deactivated successfully.
Nov 28 10:07:16 np0005538515.localdomain podman[318851]: 2025-11-28 10:07:16.906845043 +0000 UTC m=+0.051190922 container died 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:16 np0005538515.localdomain podman[318851]: 2025-11-28 10:07:16.934559119 +0000 UTC m=+0.078904948 container cleanup 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:07:16 np0005538515.localdomain systemd[1]: libpod-conmon-6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792.scope: Deactivated successfully.
Nov 28 10:07:16 np0005538515.localdomain podman[318853]: 2025-11-28 10:07:16.99227727 +0000 UTC m=+0.130769128 container remove 6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d55e4ac-ff66-4512-9e4c-487c005fe37c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:07:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e190 e190: 6 total, 6 up, 6 in
Nov 28 10:07:17 np0005538515.localdomain ceph-mon[301134]: pgmap v375: 177 pgs: 12 active+clean+snaptrim_wait, 8 active+clean+snaptrim, 157 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 26 KiB/s rd, 8.0 MiB/s wr, 38 op/s
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, vol_name:cephfs) < ""
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/74c605fe-3105-407e-80b8-d4fb6f7d4329/.meta.tmp'
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/74c605fe-3105-407e-80b8-d4fb6f7d4329/.meta.tmp' to config b'/volumes/_nogroup/74c605fe-3105-407e-80b8-d4fb6f7d4329/.meta'
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, vol_name:cephfs) < ""
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "format": "json"}]: dispatch
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, vol_name:cephfs) < ""
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, vol_name:cephfs) < ""
Nov 28 10:07:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8ce0eb26f941760bc165071691d15f2ff2f716526f20d4e0a559e47f82947841-merged.mount: Deactivated successfully.
Nov 28 10:07:17 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d17717cf4c78ea98c557933e4bb52f68e6a295d91765178c9115356c8664792-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v377: 177 pgs: 177 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 8.0 MiB/s wr, 202 op/s
Nov 28 10:07:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:17.974 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:18 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d5d55e4ac\x2dff66\x2d4512\x2d9e4c\x2d487c005fe37c.mount: Deactivated successfully.
Nov 28 10:07:18 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:18.010 261346 INFO neutron.agent.dhcp.agent [None req-effcfdf6-50c3-4c69-9253-c6a818f91240 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:18 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:18.011 261346 INFO neutron.agent.dhcp.agent [None req-effcfdf6-50c3-4c69-9253-c6a818f91240 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:18 np0005538515.localdomain ceph-mon[301134]: osdmap e190: 6 total, 6 up, 6 in
Nov 28 10:07:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e191 e191: 6 total, 6 up, 6 in
Nov 28 10:07:18 np0005538515.localdomain dnsmasq[318722]: exiting on receipt of SIGTERM
Nov 28 10:07:18 np0005538515.localdomain podman[318896]: 2025-11-28 10:07:18.986296755 +0000 UTC m=+0.067549196 container kill 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:07:18 np0005538515.localdomain systemd[1]: libpod-9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b.scope: Deactivated successfully.
Nov 28 10:07:19 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:18.999 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:19 np0005538515.localdomain podman[318909]: 2025-11-28 10:07:19.057506803 +0000 UTC m=+0.056691691 container died 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:19 np0005538515.localdomain systemd[1]: tmp-crun.CGxK7m.mount: Deactivated successfully.
Nov 28 10:07:19 np0005538515.localdomain podman[318909]: 2025-11-28 10:07:19.090338457 +0000 UTC m=+0.089523285 container cleanup 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:19 np0005538515.localdomain systemd[1]: libpod-conmon-9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b.scope: Deactivated successfully.
Nov 28 10:07:19 np0005538515.localdomain podman[318911]: 2025-11-28 10:07:19.141521596 +0000 UTC m=+0.133372847 container remove 9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:19 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:19.301 2 INFO neutron.agent.securitygroups_rpc [None req-5151d855-124b-4682-b39f-fc47e0550bce 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:19.305 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:19.496 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "format": "json"}]: dispatch
Nov 28 10:07:19 np0005538515.localdomain ceph-mon[301134]: pgmap v377: 177 pgs: 177 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 8.0 MiB/s wr, 202 op/s
Nov 28 10:07:19 np0005538515.localdomain ceph-mon[301134]: osdmap e191: 6 total, 6 up, 6 in
Nov 28 10:07:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e192 e192: 6 total, 6 up, 6 in
Nov 28 10:07:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v380: 177 pgs: 177 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 36 KiB/s wr, 218 op/s
Nov 28 10:07:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-fd62fdf166d581c6d2b3ef9c0fe541584f5effaa7812f4badcc3cc24df395cfc-merged.mount: Deactivated successfully.
Nov 28 10:07:19 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9232a324cbb803ff5e7248639f834f9284fa50bf671d08fdfddeac4a08ecb36b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:20 np0005538515.localdomain podman[318987]: 
Nov 28 10:07:20 np0005538515.localdomain podman[318987]: 2025-11-28 10:07:20.076362195 +0000 UTC m=+0.089661719 container create 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:07:20 np0005538515.localdomain systemd[1]: Started libpod-conmon-749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d.scope.
Nov 28 10:07:20 np0005538515.localdomain podman[318987]: 2025-11-28 10:07:20.034506223 +0000 UTC m=+0.047805767 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:20 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:20 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33587a6373740309204ac909f2c16bba4ef78bf310718d28d2ec5f8f7d5dcf20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:20 np0005538515.localdomain podman[318987]: 2025-11-28 10:07:20.156816239 +0000 UTC m=+0.170115763 container init 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:20 np0005538515.localdomain podman[318987]: 2025-11-28 10:07:20.166970422 +0000 UTC m=+0.180269946 container start 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:07:20 np0005538515.localdomain dnsmasq[319005]: started, version 2.85 cachesize 150
Nov 28 10:07:20 np0005538515.localdomain dnsmasq[319005]: DNS service limited to local subnets
Nov 28 10:07:20 np0005538515.localdomain dnsmasq[319005]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:20 np0005538515.localdomain dnsmasq[319005]: warning: no upstream servers configured
Nov 28 10:07:20 np0005538515.localdomain dnsmasq-dhcp[319005]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:07:20 np0005538515.localdomain dnsmasq[319005]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 0 addresses
Nov 28 10:07:20 np0005538515.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:20 np0005538515.localdomain dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e193 e193: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:20.477 261346 INFO neutron.agent.dhcp.agent [None req-496b6db1-97f8-4079-abb7-78122e8313d3 - - - - - -] DHCP configuration for ports {'e18fca2e-eaeb-40cf-9eb1-203ecf5b0aa2', '8646aa95-6463-44cd-8c34-1bec1705e23b', '8bc6a73d-610f-4f06-b515-26f3efcf46a4'} is completed
Nov 28 10:07:20 np0005538515.localdomain ceph-mon[301134]: osdmap e192: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538515.localdomain ceph-mon[301134]: osdmap e193: 6 total, 6 up, 6 in
Nov 28 10:07:20 np0005538515.localdomain dnsmasq[319005]: exiting on receipt of SIGTERM
Nov 28 10:07:20 np0005538515.localdomain podman[319023]: 2025-11-28 10:07:20.567913569 +0000 UTC m=+0.062643575 container kill 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:20 np0005538515.localdomain systemd[1]: libpod-749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d.scope: Deactivated successfully.
Nov 28 10:07:20 np0005538515.localdomain podman[319037]: 2025-11-28 10:07:20.640558091 +0000 UTC m=+0.057727842 container died 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:20 np0005538515.localdomain podman[319037]: 2025-11-28 10:07:20.674034945 +0000 UTC m=+0.091204646 container cleanup 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:07:20 np0005538515.localdomain systemd[1]: libpod-conmon-749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d.scope: Deactivated successfully.
Nov 28 10:07:20 np0005538515.localdomain podman[319039]: 2025-11-28 10:07:20.729153996 +0000 UTC m=+0.136598087 container remove 749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "format": "json"}]: dispatch
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '74c605fe-3105-407e-80b8-d4fb6f7d4329' of type subvolume
Nov 28 10:07:20 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:20.820+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '74c605fe-3105-407e-80b8-d4fb6f7d4329' of type subvolume
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, vol_name:cephfs) < ""
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/74c605fe-3105-407e-80b8-d4fb6f7d4329'' moved to trashcan
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:07:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:74c605fe-3105-407e-80b8-d4fb6f7d4329, vol_name:cephfs) < ""
Nov 28 10:07:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-33587a6373740309204ac909f2c16bba4ef78bf310718d28d2ec5f8f7d5dcf20-merged.mount: Deactivated successfully.
Nov 28 10:07:20 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-749149ed15da240a27644b8b3bb546a3353809c26e676ff0d730aad5bc686b6d-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:21 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:07:21 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:07:21 np0005538515.localdomain podman[319082]: 2025-11-28 10:07:21.001692929 +0000 UTC m=+0.060040854 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:07:21 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:07:21 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:21.384 2 INFO neutron.agent.securitygroups_rpc [None req-fe8ed995-ee4b-4312-80c7-fb60647feb81 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['235f4ca9-4e7e-483e-ba22-a609f7751fe8']
Nov 28 10:07:21 np0005538515.localdomain ceph-mon[301134]: pgmap v380: 177 pgs: 177 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 36 KiB/s wr, 218 op/s
Nov 28 10:07:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "format": "json"}]: dispatch
Nov 28 10:07:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "74c605fe-3105-407e-80b8-d4fb6f7d4329", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e194 e194: 6 total, 6 up, 6 in
Nov 28 10:07:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:21.717 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:21 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:21.775 261346 INFO neutron.agent.linux.ip_lib [None req-4c1bfc47-5e8f-408e-865b-81dbc7756058 - - - - - -] Device tapd42f5d61-af cannot be used as it has no MAC address
Nov 28 10:07:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v383: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 18 KiB/s wr, 108 op/s
Nov 28 10:07:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:21.826 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:21 np0005538515.localdomain kernel: device tapd42f5d61-af entered promiscuous mode
Nov 28 10:07:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:21Z|00177|binding|INFO|Claiming lport d42f5d61-afe1-455f-b448-86993094b244 for this chassis.
Nov 28 10:07:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:21Z|00178|binding|INFO|d42f5d61-afe1-455f-b448-86993094b244: Claiming unknown
Nov 28 10:07:21 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324441.8361] manager: (tapd42f5d61-af): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Nov 28 10:07:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:21.841 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:21.848 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=d42f5d61-afe1-455f-b448-86993094b244) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:21 np0005538515.localdomain systemd-udevd[319116]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:21.851 158530 INFO neutron.agent.ovn.metadata.agent [-] Port d42f5d61-afe1-455f-b448-86993094b244 in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 bound to our chassis
Nov 28 10:07:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:21.853 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 744b5a82-3c5c-4b41-ba44-527244a209c4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:21 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:21.855 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[9d04ecd8-d706-49d0-8058-0c6c0b61ac67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:21.892 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:21Z|00179|binding|INFO|Setting lport d42f5d61-afe1-455f-b448-86993094b244 ovn-installed in OVS
Nov 28 10:07:21 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:21Z|00180|binding|INFO|Setting lport d42f5d61-afe1-455f-b448-86993094b244 up in Southbound
Nov 28 10:07:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:21.896 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:21.953 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:22.004 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:22 np0005538515.localdomain ceph-mon[301134]: osdmap e194: 6 total, 6 up, 6 in
Nov 28 10:07:22 np0005538515.localdomain ceph-mon[301134]: pgmap v383: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 18 KiB/s wr, 108 op/s
Nov 28 10:07:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e195 e195: 6 total, 6 up, 6 in
Nov 28 10:07:22 np0005538515.localdomain podman[319197]: 
Nov 28 10:07:22 np0005538515.localdomain podman[319197]: 2025-11-28 10:07:22.839289106 +0000 UTC m=+0.093007212 container create 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:22 np0005538515.localdomain systemd[1]: Started libpod-conmon-7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8.scope.
Nov 28 10:07:22 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:22 np0005538515.localdomain podman[319197]: 2025-11-28 10:07:22.791985925 +0000 UTC m=+0.045704021 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:22 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47429582b4db31382ce2dca59c66017594f0da5442226bc44ebbcd242cb78e55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:22 np0005538515.localdomain podman[319197]: 2025-11-28 10:07:22.902065564 +0000 UTC m=+0.155783640 container init 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:22 np0005538515.localdomain podman[319197]: 2025-11-28 10:07:22.91004166 +0000 UTC m=+0.163759736 container start 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:07:22 np0005538515.localdomain dnsmasq[319232]: started, version 2.85 cachesize 150
Nov 28 10:07:22 np0005538515.localdomain dnsmasq[319232]: DNS service limited to local subnets
Nov 28 10:07:22 np0005538515.localdomain dnsmasq[319232]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:22 np0005538515.localdomain dnsmasq[319232]: warning: no upstream servers configured
Nov 28 10:07:22 np0005538515.localdomain dnsmasq-dhcp[319232]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:07:22 np0005538515.localdomain dnsmasq-dhcp[319232]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:07:22 np0005538515.localdomain dnsmasq[319232]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 0 addresses
Nov 28 10:07:22 np0005538515.localdomain dnsmasq-dhcp[319232]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:22 np0005538515.localdomain dnsmasq-dhcp[319232]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:23.022 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:23 np0005538515.localdomain dnsmasq[319232]: exiting on receipt of SIGTERM
Nov 28 10:07:23 np0005538515.localdomain systemd[1]: libpod-7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8.scope: Deactivated successfully.
Nov 28 10:07:23 np0005538515.localdomain podman[319239]: 2025-11-28 10:07:23.06260614 +0000 UTC m=+0.126185297 container died 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:23 np0005538515.localdomain podman[319239]: 2025-11-28 10:07:23.093579016 +0000 UTC m=+0.157158173 container cleanup 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:23 np0005538515.localdomain podman[319252]: 2025-11-28 10:07:23.130542897 +0000 UTC m=+0.063507931 container cleanup 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:07:23 np0005538515.localdomain systemd[1]: libpod-conmon-7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8.scope: Deactivated successfully.
Nov 28 10:07:23 np0005538515.localdomain podman[319264]: 2025-11-28 10:07:23.191519709 +0000 UTC m=+0.081823476 container remove 7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 10:07:23 np0005538515.localdomain podman[319283]: 
Nov 28 10:07:23 np0005538515.localdomain podman[319283]: 2025-11-28 10:07:23.289763202 +0000 UTC m=+0.080659401 container create e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:23 np0005538515.localdomain systemd[1]: Started libpod-conmon-e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df.scope.
Nov 28 10:07:23 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:23 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ff3cb155a93a6ece0fb95602d69cd611411b4d92a56342074acdd1cb9fb6b26/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:23 np0005538515.localdomain podman[319283]: 2025-11-28 10:07:23.346183013 +0000 UTC m=+0.137079222 container init e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:07:23 np0005538515.localdomain podman[319283]: 2025-11-28 10:07:23.249143688 +0000 UTC m=+0.040039937 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:23 np0005538515.localdomain podman[319283]: 2025-11-28 10:07:23.354254803 +0000 UTC m=+0.145151012 container start e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:07:23 np0005538515.localdomain dnsmasq[319301]: started, version 2.85 cachesize 150
Nov 28 10:07:23 np0005538515.localdomain dnsmasq[319301]: DNS service limited to local subnets
Nov 28 10:07:23 np0005538515.localdomain dnsmasq[319301]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:23 np0005538515.localdomain dnsmasq[319301]: warning: no upstream servers configured
Nov 28 10:07:23 np0005538515.localdomain dnsmasq-dhcp[319301]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:07:23 np0005538515.localdomain dnsmasq[319301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 0 addresses
Nov 28 10:07:23 np0005538515.localdomain dnsmasq-dhcp[319301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:23 np0005538515.localdomain dnsmasq-dhcp[319301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:23 np0005538515.localdomain ceph-mon[301134]: osdmap e195: 6 total, 6 up, 6 in
Nov 28 10:07:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v385: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 17 KiB/s wr, 100 op/s
Nov 28 10:07:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-47429582b4db31382ce2dca59c66017594f0da5442226bc44ebbcd242cb78e55-merged.mount: Deactivated successfully.
Nov 28 10:07:23 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d00eb47706247a4399d8c78950a1e49d730ce95b6c195feb49b94727c9561a8-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:23 np0005538515.localdomain sudo[319302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:07:23 np0005538515.localdomain sudo[319302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:07:23 np0005538515.localdomain sudo[319302]: pam_unix(sudo:session): session closed for user root
Nov 28 10:07:23 np0005538515.localdomain sudo[319320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:07:23 np0005538515.localdomain sudo[319320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:07:24 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:24.009 261346 INFO neutron.agent.dhcp.agent [None req-641e10e9-3817-4d29-b950-2fe355e91ffa - - - - - -] DHCP configuration for ports {'e18fca2e-eaeb-40cf-9eb1-203ecf5b0aa2', '8646aa95-6463-44cd-8c34-1bec1705e23b', '8bc6a73d-610f-4f06-b515-26f3efcf46a4'} is completed
Nov 28 10:07:24 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:24.092 2 INFO neutron.agent.securitygroups_rpc [None req-b1e53ab7-c922-4511-9eea-36891b374394 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['ef6c27ab-7008-4940-88ab-f495a3348997', '235f4ca9-4e7e-483e-ba22-a609f7751fe8', 'ad28b9ca-0164-4a23-9923-7d61ac565e84']
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, vol_name:cephfs) < ""
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/84a40f08-8b6b-4686-8d05-33a3e9292f4f/.meta.tmp'
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/84a40f08-8b6b-4686-8d05-33a3e9292f4f/.meta.tmp' to config b'/volumes/_nogroup/84a40f08-8b6b-4686-8d05-33a3e9292f4f/.meta'
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, vol_name:cephfs) < ""
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "format": "json"}]: dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, vol_name:cephfs) < ""
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, vol_name:cephfs) < ""
Nov 28 10:07:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:24.345 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:24 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:24.422 261346 INFO neutron.agent.dhcp.agent [None req-58ebe808-7899-4368-bd27-1fa7f16b1e10 - - - - - -] DHCP configuration for ports {'530dc798-1c6a-4c38-a11f-57f3818e5561', '636c21fa-d6bd-405e-95ed-d59498827d6f'} is completed
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: pgmap v385: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 17 KiB/s wr, 100 op/s
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "format": "json"}]: dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e196 e196: 6 total, 6 up, 6 in
Nov 28 10:07:24 np0005538515.localdomain sudo[319320]: pam_unix(sudo:session): session closed for user root
Nov 28 10:07:24 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:24.754 2 INFO neutron.agent.securitygroups_rpc [None req-ded8a162-fe49-472e-9107-11e07cb8573a e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['ef6c27ab-7008-4940-88ab-f495a3348997', 'ad28b9ca-0164-4a23-9923-7d61ac565e84']
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev e28ff6ca-b8a9-45c3-a327-baf42b0e54aa (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev e28ff6ca-b8a9-45c3-a327-baf42b0e54aa (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:07:24 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event e28ff6ca-b8a9-45c3-a327-baf42b0e54aa (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:07:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:07:25 np0005538515.localdomain sudo[319371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:07:25 np0005538515.localdomain sudo[319371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:07:25 np0005538515.localdomain sudo[319371]: pam_unix(sudo:session): session closed for user root
Nov 28 10:07:25 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:07:25 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:25.422 2 INFO neutron.agent.securitygroups_rpc [None req-84883400-a0bd-45dd-a8ae-3bc8b417b162 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['acf02bd6-8fdb-4bdf-b655-c11d3c48057a']
Nov 28 10:07:25 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:25.470 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:25Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3d40a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3d4700>], id=3f096e93-c3cf-440a-8cda-fd3f17a679fb, ip_allocation=immediate, mac_address=fa:16:3e:bd:2b:32, name=tempest-PortsTestJSON-1518413586, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:05:56Z, description=, dns_domain=, id=744b5a82-3c5c-4b41-ba44-527244a209c4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-935184943, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32831, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2138, status=ACTIVE, subnets=['4bf409b4-2136-4411-a9e7-978df7f2f500'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:20Z, vlan_transparent=None, network_id=744b5a82-3c5c-4b41-ba44-527244a209c4, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['acf02bd6-8fdb-4bdf-b655-c11d3c48057a'], standard_attr_id=2522, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:25Z on network 744b5a82-3c5c-4b41-ba44-527244a209c4
Nov 28 10:07:25 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:25.473 2 INFO neutron.agent.securitygroups_rpc [None req-0fb934e7-9ad4-4c2e-8ef8-4b9c21b34e7a 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:25 np0005538515.localdomain ceph-mon[301134]: osdmap e196: 6 total, 6 up, 6 in
Nov 28 10:07:25 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:07:25 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:07:25 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:07:25 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:07:25 np0005538515.localdomain dnsmasq[319301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 1 addresses
Nov 28 10:07:25 np0005538515.localdomain dnsmasq-dhcp[319301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:25 np0005538515.localdomain podman[319423]: 2025-11-28 10:07:25.726802683 +0000 UTC m=+0.059630212 container kill e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:25 np0005538515.localdomain dnsmasq-dhcp[319301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v387: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 14 KiB/s wr, 80 op/s
Nov 28 10:07:25 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:07:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:07:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:07:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:26.102 261346 INFO neutron.agent.dhcp.agent [None req-7342aeab-f0ec-4c04-ae76-2a91133c8b33 - - - - - -] DHCP configuration for ports {'3f096e93-c3cf-440a-8cda-fd3f17a679fb'} is completed
Nov 28 10:07:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:26.158 261346 INFO neutron.agent.linux.ip_lib [None req-59202482-0395-4fc3-b670-eb0f5cd989b9 - - - - - -] Device tap0f9c27e4-dc cannot be used as it has no MAC address
Nov 28 10:07:26 np0005538515.localdomain systemd[1]: tmp-crun.ssnBFK.mount: Deactivated successfully.
Nov 28 10:07:26 np0005538515.localdomain podman[319460]: 2025-11-28 10:07:26.167128685 +0000 UTC m=+0.065309777 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, version=9.6)
Nov 28 10:07:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:26.176 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538515.localdomain kernel: device tap0f9c27e4-dc entered promiscuous mode
Nov 28 10:07:26 np0005538515.localdomain NetworkManager[5965]: <info>  [1764324446.1833] manager: (tap0f9c27e4-dc): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Nov 28 10:07:26 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:26Z|00181|binding|INFO|Claiming lport 0f9c27e4-dc5b-458a-84e7-59a6845be341 for this chassis.
Nov 28 10:07:26 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:26Z|00182|binding|INFO|0f9c27e4-dc5b-458a-84e7-59a6845be341: Claiming unknown
Nov 28 10:07:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:26.186 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538515.localdomain systemd-udevd[319501]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:07:26 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:26.206 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe58:df08/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-bb75b6d0-46f7-4ff4-b977-20963925f011', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bb75b6d0-46f7-4ff4-b977-20963925f011', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19295739-89e1-4341-a9f7-bf31d43c2d95, chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=0f9c27e4-dc5b-458a-84e7-59a6845be341) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:26 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:26.207 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 0f9c27e4-dc5b-458a-84e7-59a6845be341 in datapath bb75b6d0-46f7-4ff4-b977-20963925f011 bound to our chassis
Nov 28 10:07:26 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:26.209 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port 64f2d2fb-0269-4c37-9cd4-8777ae8910b3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:07:26 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:26.209 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bb75b6d0-46f7-4ff4-b977-20963925f011, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:26 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:26.209 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[99bec901-1281-4f86-8478-2bd8bc14ec48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:26.220 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:26Z|00183|binding|INFO|Setting lport 0f9c27e4-dc5b-458a-84e7-59a6845be341 ovn-installed in OVS
Nov 28 10:07:26 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:26Z|00184|binding|INFO|Setting lport 0f9c27e4-dc5b-458a-84e7-59a6845be341 up in Southbound
Nov 28 10:07:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:26.222 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain virtnodedevd[228057]: ethtool ioctl error on tap0f9c27e4-dc: No such device
Nov 28 10:07:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:26.247 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538515.localdomain podman[319460]: 2025-11-28 10:07:26.258842377 +0000 UTC m=+0.157023449 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 10:07:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:26.270 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:26 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:07:26 np0005538515.localdomain podman[319532]: 
Nov 28 10:07:26 np0005538515.localdomain podman[319532]: 2025-11-28 10:07:26.314373191 +0000 UTC m=+0.061059606 container create b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:26 np0005538515.localdomain systemd[1]: Started libpod-conmon-b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948.scope.
Nov 28 10:07:26 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:26 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1a40302d152a1d4acd8c242f619baa5eea043f1a1ad2914ca314bd0285d34a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:26 np0005538515.localdomain podman[319532]: 2025-11-28 10:07:26.380640676 +0000 UTC m=+0.127327081 container init b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:07:26 np0005538515.localdomain podman[319532]: 2025-11-28 10:07:26.282989022 +0000 UTC m=+0.029675447 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:26 np0005538515.localdomain podman[319532]: 2025-11-28 10:07:26.386609321 +0000 UTC m=+0.133295716 container start b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:07:26 np0005538515.localdomain dnsmasq[319559]: started, version 2.85 cachesize 150
Nov 28 10:07:26 np0005538515.localdomain dnsmasq[319559]: DNS service limited to local subnets
Nov 28 10:07:26 np0005538515.localdomain dnsmasq[319559]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:26 np0005538515.localdomain dnsmasq[319559]: warning: no upstream servers configured
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Nov 28 10:07:26 np0005538515.localdomain dnsmasq[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 1 addresses
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:26.434 261346 INFO neutron.agent.dhcp.agent [None req-223da476-0ca2-44ac-b2f9-b135e3874901 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:21Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3d0c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3d02b0>], id=3d039e4d-f111-4c85-a4bc-bc275c485ad6, ip_allocation=immediate, mac_address=fa:16:3e:a9:78:64, name=tempest-PortsIpV6TestJSON-342618759, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:07Z, description=, dns_domain=, id=719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-1946008216, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49828, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=2226, status=ACTIVE, subnets=['d9bf5441-d2dd-4350-bfbf-90c18e4a1028', 'eb032588-3453-4d13-aca7-c97dd8bc87f7'], tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:07:19Z, vlan_transparent=None, network_id=719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['235f4ca9-4e7e-483e-ba22-a609f7751fe8'], standard_attr_id=2508, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:07:21Z on network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb
Nov 28 10:07:26 np0005538515.localdomain dnsmasq[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 1 addresses
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:26 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:26 np0005538515.localdomain podman[319587]: 2025-11-28 10:07:26.618895831 +0000 UTC m=+0.064944026 container kill b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:26.767 261346 INFO neutron.agent.dhcp.agent [None req-a4f455cb-1187-4a22-ace0-40ae01a8e64a - - - - - -] DHCP configuration for ports {'e18fca2e-eaeb-40cf-9eb1-203ecf5b0aa2', '3d039e4d-f111-4c85-a4bc-bc275c485ad6', '8646aa95-6463-44cd-8c34-1bec1705e23b', '8bc6a73d-610f-4f06-b515-26f3efcf46a4'} is completed
Nov 28 10:07:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:26.810 261346 INFO neutron.agent.dhcp.agent [None req-223da476-0ca2-44ac-b2f9-b135e3874901 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:21Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce37c370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce37c2e0>], id=3d039e4d-f111-4c85-a4bc-bc275c485ad6, ip_allocation=immediate, mac_address=fa:16:3e:a9:78:64, name=tempest-PortsIpV6TestJSON-1465794594, network_id=719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['ad28b9ca-0164-4a23-9923-7d61ac565e84', 'ef6c27ab-7008-4940-88ab-f495a3348997'], standard_attr_id=2508, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:07:22Z on network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb
Nov 28 10:07:26 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:26.887 261346 INFO neutron.agent.dhcp.agent [None req-68d4b832-3a93-4d0a-a8a2-32ca37082701 - - - - - -] DHCP configuration for ports {'3d039e4d-f111-4c85-a4bc-bc275c485ad6'} is completed
Nov 28 10:07:26 np0005538515.localdomain ceph-mon[301134]: pgmap v387: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 192 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 14 KiB/s wr, 80 op/s
Nov 28 10:07:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:07:26 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:26 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 1 addresses
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:27 np0005538515.localdomain podman[319645]: 2025-11-28 10:07:27.13584934 +0000 UTC m=+0.132883323 container kill b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:27 np0005538515.localdomain podman[319663]: 
Nov 28 10:07:27 np0005538515.localdomain podman[319663]: 2025-11-28 10:07:27.165130413 +0000 UTC m=+0.076270245 container create 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: Started libpod-conmon-5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c.scope.
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:27 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1910b9af47a7dc6a481f7f88c43c164511ef0e76206f1154477c85aff6d7d500/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:27 np0005538515.localdomain podman[319663]: 2025-11-28 10:07:27.232850224 +0000 UTC m=+0.143990056 container init 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:07:27 np0005538515.localdomain podman[319663]: 2025-11-28 10:07:27.13392119 +0000 UTC m=+0.045060992 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:27 np0005538515.localdomain podman[319663]: 2025-11-28 10:07:27.245749292 +0000 UTC m=+0.156889124 container start 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319688]: started, version 2.85 cachesize 150
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319688]: DNS service limited to local subnets
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319688]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319688]: warning: no upstream servers configured
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319688]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319688]: read /var/lib/neutron/dhcp/bb75b6d0-46f7-4ff4-b977-20963925f011/addn_hosts - 0 addresses
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319688]: read /var/lib/neutron/dhcp/bb75b6d0-46f7-4ff4-b977-20963925f011/host
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319688]: read /var/lib/neutron/dhcp/bb75b6d0-46f7-4ff4-b977-20963925f011/opts
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 0 addresses
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:27 np0005538515.localdomain podman[319710]: 2025-11-28 10:07:27.438187593 +0000 UTC m=+0.040317365 container kill b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 10:07:27 np0005538515.localdomain dnsmasq-dhcp[319559]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "format": "json"}]: dispatch
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '84a40f08-8b6b-4686-8d05-33a3e9292f4f' of type subvolume
Nov 28 10:07:27 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:27.460+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '84a40f08-8b6b-4686-8d05-33a3e9292f4f' of type subvolume
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, vol_name:cephfs) < ""
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/84a40f08-8b6b-4686-8d05-33a3e9292f4f'' moved to trashcan
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:84a40f08-8b6b-4686-8d05-33a3e9292f4f, vol_name:cephfs) < ""
Nov 28 10:07:27 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:27.474 261346 INFO neutron.agent.dhcp.agent [None req-392bae1e-7b98-46ed-a908-87eca6ab4af9 - - - - - -] DHCP configuration for ports {'3d039e4d-f111-4c85-a4bc-bc275c485ad6', '16cc2844-5a40-4594-9c31-bb5eedf99c06'} is completed
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:07:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319688]: exiting on receipt of SIGTERM
Nov 28 10:07:27 np0005538515.localdomain podman[319744]: 2025-11-28 10:07:27.632703607 +0000 UTC m=+0.062101658 container kill 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: libpod-5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c.scope: Deactivated successfully.
Nov 28 10:07:27 np0005538515.localdomain podman[319761]: 2025-11-28 10:07:27.686866629 +0000 UTC m=+0.041211483 container died 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:07:27 np0005538515.localdomain podman[319761]: 2025-11-28 10:07:27.710187749 +0000 UTC m=+0.064532533 container cleanup 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: libpod-conmon-5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c.scope: Deactivated successfully.
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-1910b9af47a7dc6a481f7f88c43c164511ef0e76206f1154477c85aff6d7d500-merged.mount: Deactivated successfully.
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:27 np0005538515.localdomain podman[319762]: 2025-11-28 10:07:27.758142369 +0000 UTC m=+0.111969007 container remove 5f528661078fe39cce27d38eff7a9a22891918d00b247e0af7a5cd9ad5dfec9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bb75b6d0-46f7-4ff4-b977-20963925f011, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:07:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:27.769 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:27 np0005538515.localdomain kernel: device tap0f9c27e4-dc left promiscuous mode
Nov 28 10:07:27 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:27Z|00185|binding|INFO|Releasing lport 0f9c27e4-dc5b-458a-84e7-59a6845be341 from this chassis (sb_readonly=0)
Nov 28 10:07:27 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:27Z|00186|binding|INFO|Setting lport 0f9c27e4-dc5b-458a-84e7-59a6845be341 down in Southbound
Nov 28 10:07:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:27.784 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:27.789 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe58:df08/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-bb75b6d0-46f7-4ff4-b977-20963925f011', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bb75b6d0-46f7-4ff4-b977-20963925f011', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19295739-89e1-4341-a9f7-bf31d43c2d95, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=0f9c27e4-dc5b-458a-84e7-59a6845be341) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:27.789 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 0f9c27e4-dc5b-458a-84e7-59a6845be341 in datapath bb75b6d0-46f7-4ff4-b977-20963925f011 unbound from our chassis
Nov 28 10:07:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:27.791 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bb75b6d0-46f7-4ff4-b977-20963925f011, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:27 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:27.791 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6655c9-58bd-49f6-804d-104c0a0c35fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v388: 177 pgs: 177 active+clean; 213 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 651 KiB/s rd, 4.0 MiB/s wr, 230 op/s
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: tmp-crun.kzKinO.mount: Deactivated successfully.
Nov 28 10:07:27 np0005538515.localdomain podman[319807]: 2025-11-28 10:07:27.865770232 +0000 UTC m=+0.062263083 container kill b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:27 np0005538515.localdomain dnsmasq[319559]: exiting on receipt of SIGTERM
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: libpod-b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948.scope: Deactivated successfully.
Nov 28 10:07:27 np0005538515.localdomain podman[319820]: 2025-11-28 10:07:27.944305227 +0000 UTC m=+0.063612135 container died b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "format": "json"}]: dispatch
Nov 28 10:07:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "84a40f08-8b6b-4686-8d05-33a3e9292f4f", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:27 np0005538515.localdomain podman[319820]: 2025-11-28 10:07:27.980261526 +0000 UTC m=+0.099568394 container cleanup b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:07:27 np0005538515.localdomain systemd[1]: libpod-conmon-b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948.scope: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain podman[319822]: 2025-11-28 10:07:28.023030637 +0000 UTC m=+0.133608706 container remove b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:28.070 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:28 np0005538515.localdomain dnsmasq[319301]: exiting on receipt of SIGTERM
Nov 28 10:07:28 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:28.219 261346 INFO neutron.agent.dhcp.agent [None req-f32e5221-32bb-49e5-8900-cc184f56a431 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: libpod-e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df.scope: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain podman[319866]: 2025-11-28 10:07:28.221270656 +0000 UTC m=+0.064790191 container kill e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:07:28 np0005538515.localdomain podman[319883]: 2025-11-28 10:07:28.291552266 +0000 UTC m=+0.053025448 container died e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:07:28 np0005538515.localdomain podman[319883]: 2025-11-28 10:07:28.317979002 +0000 UTC m=+0.079452164 container cleanup e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: libpod-conmon-e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df.scope: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain podman[319885]: 2025-11-28 10:07:28.368384967 +0000 UTC m=+0.122774591 container remove e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-d1a40302d152a1d4acd8c242f619baa5eea043f1a1ad2914ca314bd0285d34a6-merged.mount: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b64edd8db7d47901ad1bacd35e70a1e47dcd67493ee12a17dc2d4506c9cef948-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2dbb75b6d0\x2d46f7\x2d4ff4\x2db977\x2d20963925f011.mount: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8ff3cb155a93a6ece0fb95602d69cd611411b4d92a56342074acdd1cb9fb6b26-merged.mount: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e023ab45866580f82c7d33d531f7dc939662032f69b1cde9a1a731e04d9714df-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:07:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:07:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:07:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158154 "" "Go-http-client/1.1"
Nov 28 10:07:28 np0005538515.localdomain ceph-mon[301134]: pgmap v388: 177 pgs: 177 active+clean; 213 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 651 KiB/s rd, 4.0 MiB/s wr, 230 op/s
Nov 28 10:07:28 np0005538515.localdomain podman[319950]: 
Nov 28 10:07:29 np0005538515.localdomain podman[319950]: 2025-11-28 10:07:29.008220639 +0000 UTC m=+0.143855972 container create 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 10:07:29 np0005538515.localdomain systemd[1]: Started libpod-conmon-8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54.scope.
Nov 28 10:07:29 np0005538515.localdomain podman[319950]: 2025-11-28 10:07:28.966518251 +0000 UTC m=+0.102153584 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:29 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:07:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19695 "" "Go-http-client/1.1"
Nov 28 10:07:29 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:29 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7331610e5b3e756e96e7e41d0f681a4e4ee10db4858c583003c7c90264e22183/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:29 np0005538515.localdomain podman[319950]: 2025-11-28 10:07:29.111824427 +0000 UTC m=+0.247459740 container init 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:07:29 np0005538515.localdomain podman[319950]: 2025-11-28 10:07:29.122116275 +0000 UTC m=+0.257751618 container start 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:29 np0005538515.localdomain dnsmasq[319982]: started, version 2.85 cachesize 150
Nov 28 10:07:29 np0005538515.localdomain dnsmasq[319982]: DNS service limited to local subnets
Nov 28 10:07:29 np0005538515.localdomain dnsmasq[319982]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:29 np0005538515.localdomain dnsmasq[319982]: warning: no upstream servers configured
Nov 28 10:07:29 np0005538515.localdomain dnsmasq-dhcp[319982]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 28 10:07:29 np0005538515.localdomain dnsmasq-dhcp[319982]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Nov 28 10:07:29 np0005538515.localdomain dnsmasq[319982]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/addn_hosts - 0 addresses
Nov 28 10:07:29 np0005538515.localdomain dnsmasq-dhcp[319982]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/host
Nov 28 10:07:29 np0005538515.localdomain dnsmasq-dhcp[319982]: read /var/lib/neutron/dhcp/719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb/opts
Nov 28 10:07:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:29.381 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:29 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:29.668 261346 INFO neutron.agent.dhcp.agent [None req-52e93256-15f5-47cc-a32e-553ffe614e2c - - - - - -] DHCP configuration for ports {'e18fca2e-eaeb-40cf-9eb1-203ecf5b0aa2', '8646aa95-6463-44cd-8c34-1bec1705e23b', '8bc6a73d-610f-4f06-b515-26f3efcf46a4'} is completed
Nov 28 10:07:29 np0005538515.localdomain dnsmasq[319982]: exiting on receipt of SIGTERM
Nov 28 10:07:29 np0005538515.localdomain podman[320011]: 2025-11-28 10:07:29.712721557 +0000 UTC m=+0.060920322 container kill 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:07:29 np0005538515.localdomain systemd[1]: libpod-8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54.scope: Deactivated successfully.
Nov 28 10:07:29 np0005538515.localdomain podman[320025]: 2025-11-28 10:07:29.782748719 +0000 UTC m=+0.055018250 container died 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:07:29 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:29 np0005538515.localdomain podman[320025]: 2025-11-28 10:07:29.815686916 +0000 UTC m=+0.087956417 container cleanup 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 10:07:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v389: 177 pgs: 177 active+clean; 213 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 466 KiB/s rd, 3.1 MiB/s wr, 124 op/s
Nov 28 10:07:29 np0005538515.localdomain systemd[1]: libpod-conmon-8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54.scope: Deactivated successfully.
Nov 28 10:07:29 np0005538515.localdomain podman[320027]: 2025-11-28 10:07:29.861975584 +0000 UTC m=+0.127395853 container remove 8c38f521b1dcb7ce646704ab260d0f5ca16b6aaab35e78128688f13d91893f54 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:07:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:29.875 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:29Z|00187|binding|INFO|Releasing lport 8bc6a73d-610f-4f06-b515-26f3efcf46a4 from this chassis (sb_readonly=0)
Nov 28 10:07:29 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:29Z|00188|binding|INFO|Setting lport 8bc6a73d-610f-4f06-b515-26f3efcf46a4 down in Southbound
Nov 28 10:07:29 np0005538515.localdomain kernel: device tap8bc6a73d-61 left promiscuous mode
Nov 28 10:07:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:29.885 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8:0:2::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8462a4a9a313405e8fd212f9ec4a0c92', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe1e7b19-836c-4f4d-9811-92d20be8712f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=8bc6a73d-610f-4f06-b515-26f3efcf46a4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:29.887 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 8bc6a73d-610f-4f06-b515-26f3efcf46a4 in datapath 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb unbound from our chassis
Nov 28 10:07:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:29.889 158530 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 719e8b2f-3827-4b26-9c22-6bd0f7ed7ceb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 10:07:29 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:29.889 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[140de10c-a25f-40f2-a1eb-a45d59d9840f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:29.900 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:30 np0005538515.localdomain podman[320076]: 
Nov 28 10:07:30 np0005538515.localdomain podman[320076]: 2025-11-28 10:07:30.024650616 +0000 UTC m=+0.088494793 container create 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:07:30 np0005538515.localdomain systemd[1]: Started libpod-conmon-78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b.scope.
Nov 28 10:07:30 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:30 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8feab9915cd549e508973929d606dec6edf6c2c66c5c9a9b78239bf5e744001b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:30 np0005538515.localdomain podman[320076]: 2025-11-28 10:07:30.080299044 +0000 UTC m=+0.144143221 container init 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:07:30 np0005538515.localdomain podman[320076]: 2025-11-28 10:07:29.981157984 +0000 UTC m=+0.045002241 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:30 np0005538515.localdomain podman[320076]: 2025-11-28 10:07:30.08926154 +0000 UTC m=+0.153105717 container start 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:30 np0005538515.localdomain dnsmasq[320095]: started, version 2.85 cachesize 150
Nov 28 10:07:30 np0005538515.localdomain dnsmasq[320095]: DNS service limited to local subnets
Nov 28 10:07:30 np0005538515.localdomain dnsmasq[320095]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:30 np0005538515.localdomain dnsmasq[320095]: warning: no upstream servers configured
Nov 28 10:07:30 np0005538515.localdomain dnsmasq-dhcp[320095]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 28 10:07:30 np0005538515.localdomain dnsmasq-dhcp[320095]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:07:30 np0005538515.localdomain dnsmasq[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 1 addresses
Nov 28 10:07:30 np0005538515.localdomain dnsmasq-dhcp[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:30 np0005538515.localdomain dnsmasq-dhcp[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:30 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:30.298 2 INFO neutron.agent.securitygroups_rpc [None req-bb1d0f4f-4080-47c2-b71c-a5aaec3a62e2 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']
Nov 28 10:07:30 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:30.398 261346 INFO neutron.agent.dhcp.agent [None req-0061627a-68e5-4249-8609-95796237c4aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:30 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:30.399 261346 INFO neutron.agent.dhcp.agent [None req-0061627a-68e5-4249-8609-95796237c4aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e197 e197: 6 total, 6 up, 6 in
Nov 28 10:07:30 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:30.581 261346 INFO neutron.agent.dhcp.agent [None req-61a6a753-713e-48a8-bc33-2d6f6814306a - - - - - -] DHCP configuration for ports {'530dc798-1c6a-4c38-a11f-57f3818e5561', 'd42f5d61-afe1-455f-b448-86993094b244', '3f096e93-c3cf-440a-8cda-fd3f17a679fb', '636c21fa-d6bd-405e-95ed-d59498827d6f'} is completed
Nov 28 10:07:30 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-7331610e5b3e756e96e7e41d0f681a4e4ee10db4858c583003c7c90264e22183-merged.mount: Deactivated successfully.
Nov 28 10:07:30 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d719e8b2f\x2d3827\x2d4b26\x2d9c22\x2d6bd0f7ed7ceb.mount: Deactivated successfully.
Nov 28 10:07:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98", "format": "json"}]: dispatch
Nov 28 10:07:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:3715b589-30b5-48c3-ae35-bbb103548e98, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:3715b589-30b5-48c3-ae35-bbb103548e98, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:31.395 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:31.397 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated
Nov 28 10:07:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:31.401 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port a03547c5-d094-4489-b8d5-6b024d72dbea IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:07:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:31.401 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:31 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:31.402 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[c03a809a-5b82-46c4-94e9-bc0af302abbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:31 np0005538515.localdomain ceph-mon[301134]: pgmap v389: 177 pgs: 177 active+clean; 213 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 466 KiB/s rd, 3.1 MiB/s wr, 124 op/s
Nov 28 10:07:31 np0005538515.localdomain ceph-mon[301134]: osdmap e197: 6 total, 6 up, 6 in
Nov 28 10:07:31 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:31.542 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v391: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 546 KiB/s rd, 3.2 MiB/s wr, 172 op/s
Nov 28 10:07:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:31.977 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98", "format": "json"}]: dispatch
Nov 28 10:07:32 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:32.631 2 INFO neutron.agent.securitygroups_rpc [None req-b1e499b5-5b30-44cb-89ef-2d90dabf973f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['b5d46958-1542-44c0-a82a-37e69acb7089', 'acf02bd6-8fdb-4bdf-b655-c11d3c48057a']
Nov 28 10:07:32 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:32.656 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:25Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3de820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3deb50>], id=3f096e93-c3cf-440a-8cda-fd3f17a679fb, ip_allocation=immediate, mac_address=fa:16:3e:bd:2b:32, name=tempest-PortsTestJSON-1565841995, network_id=744b5a82-3c5c-4b41-ba44-527244a209c4, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['b5d46958-1542-44c0-a82a-37e69acb7089'], standard_attr_id=2522, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:32Z on network 744b5a82-3c5c-4b41-ba44-527244a209c4
Nov 28 10:07:32 np0005538515.localdomain dnsmasq-dhcp[320095]: DHCPRELEASE(tapd42f5d61-af) 10.100.0.5 fa:16:3e:bd:2b:32
Nov 28 10:07:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:33.109 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:33 np0005538515.localdomain dnsmasq[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 1 addresses
Nov 28 10:07:33 np0005538515.localdomain dnsmasq-dhcp[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:33 np0005538515.localdomain podman[320114]: 2025-11-28 10:07:33.248275019 +0000 UTC m=+0.064948026 container kill 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:33 np0005538515.localdomain dnsmasq-dhcp[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:33 np0005538515.localdomain systemd[1]: tmp-crun.IxeUIu.mount: Deactivated successfully.
Nov 28 10:07:33 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:33.598 2 INFO neutron.agent.securitygroups_rpc [None req-7e9cce24-3864-452e-838f-0b8e85be3343 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['b5d46958-1542-44c0-a82a-37e69acb7089']
Nov 28 10:07:33 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:33.616 261346 INFO neutron.agent.dhcp.agent [None req-a199181a-4586-4beb-b947-620611dc04f7 - - - - - -] DHCP configuration for ports {'3f096e93-c3cf-440a-8cda-fd3f17a679fb'} is completed
Nov 28 10:07:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v392: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 478 KiB/s rd, 2.8 MiB/s wr, 151 op/s
Nov 28 10:07:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:34.427 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:34 np0005538515.localdomain dnsmasq[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 0 addresses
Nov 28 10:07:34 np0005538515.localdomain dnsmasq-dhcp[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:34 np0005538515.localdomain dnsmasq-dhcp[320095]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:34 np0005538515.localdomain podman[320151]: 2025-11-28 10:07:34.637218424 +0000 UTC m=+0.060297453 container kill 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:07:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98_18a37cfd-0766-4bf4-885c-08ab764ad956", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3715b589-30b5-48c3-ae35-bbb103548e98_18a37cfd-0766-4bf4-885c-08ab764ad956, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:07:35 np0005538515.localdomain dnsmasq[320095]: exiting on receipt of SIGTERM
Nov 28 10:07:35 np0005538515.localdomain podman[320189]: 2025-11-28 10:07:35.708057231 +0000 UTC m=+0.065096920 container kill 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:07:35 np0005538515.localdomain systemd[1]: libpod-78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b.scope: Deactivated successfully.
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:07:35 np0005538515.localdomain podman[320202]: 2025-11-28 10:07:35.781631162 +0000 UTC m=+0.062120788 container died 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:35 np0005538515.localdomain podman[320202]: 2025-11-28 10:07:35.814014572 +0000 UTC m=+0.094504158 container cleanup 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:35 np0005538515.localdomain systemd[1]: libpod-conmon-78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b.scope: Deactivated successfully.
Nov 28 10:07:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v393: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 437 KiB/s rd, 2.6 MiB/s wr, 138 op/s
Nov 28 10:07:35 np0005538515.localdomain podman[320204]: 2025-11-28 10:07:35.861512149 +0000 UTC m=+0.133214954 container remove 78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3715b589-30b5-48c3-ae35-bbb103548e98_18a37cfd-0766-4bf4-885c-08ab764ad956, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain ceph-mon[301134]: pgmap v391: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 546 KiB/s rd, 3.2 MiB/s wr, 172 op/s
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3715b589-30b5-48c3-ae35-bbb103548e98, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3715b589-30b5-48c3-ae35-bbb103548e98, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta.tmp'
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta.tmp' to config b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta'
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "format": "json"}]: dispatch
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-8feab9915cd549e508973929d606dec6edf6c2c66c5c9a9b78239bf5e744001b-merged.mount: Deactivated successfully.
Nov 28 10:07:36 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78986f9e71a77301eadcca6678ea6748e2fa00cb1553a0dcc251102da5ea6e4b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:36 np0005538515.localdomain podman[320282]: 
Nov 28 10:07:36 np0005538515.localdomain podman[320282]: 2025-11-28 10:07:36.788962729 +0000 UTC m=+0.071700125 container create 19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:07:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:36 np0005538515.localdomain systemd[1]: Started libpod-conmon-19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d.scope.
Nov 28 10:07:36 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:36 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37e1f2d784bb480b9928814a810411a02129e006e80f1b36cac11a8afe4824eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:36 np0005538515.localdomain podman[320282]: 2025-11-28 10:07:36.749505781 +0000 UTC m=+0.032243187 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:36 np0005538515.localdomain podman[320282]: 2025-11-28 10:07:36.850199209 +0000 UTC m=+0.132936575 container init 19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:36 np0005538515.localdomain podman[320282]: 2025-11-28 10:07:36.857739021 +0000 UTC m=+0.140476387 container start 19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:36 np0005538515.localdomain dnsmasq[320301]: started, version 2.85 cachesize 150
Nov 28 10:07:36 np0005538515.localdomain dnsmasq[320301]: DNS service limited to local subnets
Nov 28 10:07:36 np0005538515.localdomain dnsmasq[320301]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:36 np0005538515.localdomain dnsmasq[320301]: warning: no upstream servers configured
Nov 28 10:07:36 np0005538515.localdomain dnsmasq-dhcp[320301]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 28 10:07:36 np0005538515.localdomain dnsmasq[320301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 0 addresses
Nov 28 10:07:36 np0005538515.localdomain dnsmasq-dhcp[320301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:36 np0005538515.localdomain dnsmasq-dhcp[320301]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:37.098 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:37.101 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated
Nov 28 10:07:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:37.104 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port a03547c5-d094-4489-b8d5-6b024d72dbea IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:07:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:37.105 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:37.106 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[646ff614-23b5-4a1f-a91c-fc245f4e3161]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:37 np0005538515.localdomain dnsmasq[320301]: exiting on receipt of SIGTERM
Nov 28 10:07:37 np0005538515.localdomain podman[320319]: 2025-11-28 10:07:37.381384406 +0000 UTC m=+0.065350568 container kill 19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: libpod-19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d.scope: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:37.401 261346 INFO neutron.agent.dhcp.agent [None req-5962ec3e-3f95-4335-8134-4b9f103284ff - - - - - -] DHCP configuration for ports {'530dc798-1c6a-4c38-a11f-57f3818e5561', 'd42f5d61-afe1-455f-b448-86993094b244', '636c21fa-d6bd-405e-95ed-d59498827d6f'} is completed
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:07:37 np0005538515.localdomain ceph-mon[301134]: pgmap v392: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 478 KiB/s rd, 2.8 MiB/s wr, 151 op/s
Nov 28 10:07:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98_18a37cfd-0766-4bf4-885c-08ab764ad956", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:37 np0005538515.localdomain ceph-mon[301134]: pgmap v393: 177 pgs: 177 active+clean; 225 MiB data, 1022 MiB used, 41 GiB / 42 GiB avail; 437 KiB/s rd, 2.6 MiB/s wr, 138 op/s
Nov 28 10:07:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3715b589-30b5-48c3-ae35-bbb103548e98", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1980575955' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:37 np0005538515.localdomain podman[320335]: 2025-11-28 10:07:37.488898596 +0000 UTC m=+0.081371933 container died 19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:37 np0005538515.localdomain podman[320335]: 2025-11-28 10:07:37.523762972 +0000 UTC m=+0.116236329 container remove 19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:07:37 np0005538515.localdomain podman[320359]: 2025-11-28 10:07:37.581121623 +0000 UTC m=+0.145961758 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:07:37 np0005538515.localdomain podman[320359]: 2025-11-28 10:07:37.597406145 +0000 UTC m=+0.162246280 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain podman[320348]: 2025-11-28 10:07:37.613317266 +0000 UTC m=+0.189848172 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:37 np0005538515.localdomain podman[320346]: 2025-11-28 10:07:37.566053007 +0000 UTC m=+0.147769373 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: libpod-conmon-19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d.scope: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain podman[320351]: 2025-11-28 10:07:37.674175235 +0000 UTC m=+0.244178518 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:07:37 np0005538515.localdomain podman[320346]: 2025-11-28 10:07:37.699115815 +0000 UTC m=+0.280832181 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-37e1f2d784bb480b9928814a810411a02129e006e80f1b36cac11a8afe4824eb-merged.mount: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19060c1d1f4ed3cdc711b6e2837e2c7d9b19bdf44425397fde8af14a5c542a9d-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain podman[320348]: 2025-11-28 10:07:37.728186322 +0000 UTC m=+0.304717298 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain podman[320351]: 2025-11-28 10:07:37.754310178 +0000 UTC m=+0.324313491 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:37 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:07:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v394: 177 pgs: 177 active+clean; 225 MiB data, 1023 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 141 KiB/s wr, 42 op/s
Nov 28 10:07:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:38.143 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:38 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:07:38 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "format": "json"}]: dispatch
Nov 28 10:07:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e198 e198: 6 total, 6 up, 6 in
Nov 28 10:07:38 np0005538515.localdomain podman[320493]: 
Nov 28 10:07:38 np0005538515.localdomain podman[320493]: 2025-11-28 10:07:38.911363696 +0000 UTC m=+0.082885939 container create cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:07:38 np0005538515.localdomain systemd[1]: Started libpod-conmon-cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b.scope.
Nov 28 10:07:38 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:38 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0012342cf8bd08ca02fc301bcea34a05720aed503942f357e7119093cd5e2b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:38 np0005538515.localdomain podman[320493]: 2025-11-28 10:07:38.968600114 +0000 UTC m=+0.140122357 container init cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:07:38 np0005538515.localdomain podman[320493]: 2025-11-28 10:07:38.873764336 +0000 UTC m=+0.045286629 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:38 np0005538515.localdomain podman[320493]: 2025-11-28 10:07:38.976782286 +0000 UTC m=+0.148304529 container start cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 10:07:38 np0005538515.localdomain dnsmasq[320512]: started, version 2.85 cachesize 150
Nov 28 10:07:38 np0005538515.localdomain dnsmasq[320512]: DNS service limited to local subnets
Nov 28 10:07:38 np0005538515.localdomain dnsmasq[320512]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:38 np0005538515.localdomain dnsmasq[320512]: warning: no upstream servers configured
Nov 28 10:07:38 np0005538515.localdomain dnsmasq-dhcp[320512]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:07:38 np0005538515.localdomain dnsmasq-dhcp[320512]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 28 10:07:38 np0005538515.localdomain dnsmasq[320512]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 0 addresses
Nov 28 10:07:38 np0005538515.localdomain dnsmasq-dhcp[320512]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:38 np0005538515.localdomain dnsmasq-dhcp[320512]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:39.359 261346 INFO neutron.agent.dhcp.agent [None req-964425bd-3569-4dc7-a241-0bf1601b405c - - - - - -] DHCP configuration for ports {'530dc798-1c6a-4c38-a11f-57f3818e5561', 'd42f5d61-afe1-455f-b448-86993094b244', '636c21fa-d6bd-405e-95ed-d59498827d6f'} is completed
Nov 28 10:07:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d", "format": "json"}]: dispatch
Nov 28 10:07:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:a7f380a2-0f60-4351-b764-bd78832f244d, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:a7f380a2-0f60-4351-b764-bd78832f244d, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:39 np0005538515.localdomain ceph-mon[301134]: pgmap v394: 177 pgs: 177 active+clean; 225 MiB data, 1023 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 141 KiB/s wr, 42 op/s
Nov 28 10:07:39 np0005538515.localdomain ceph-mon[301134]: osdmap e198: 6 total, 6 up, 6 in
Nov 28 10:07:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1666464710' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:39.468 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e199 e199: 6 total, 6 up, 6 in
Nov 28 10:07:39 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:39.733 2 INFO neutron.agent.securitygroups_rpc [None req-c0e4f748-a4dd-449c-b793-430f30c9256f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b']
Nov 28 10:07:39 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:39.772 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:39Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce378160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce378670>], id=ef709d14-bdfb-4122-9587-c257ef31d183, ip_allocation=immediate, mac_address=fa:16:3e:6b:0f:d4, name=tempest-PortsTestJSON-1427451617, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:05:56Z, description=, dns_domain=, id=744b5a82-3c5c-4b41-ba44-527244a209c4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-935184943, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32831, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=2138, status=ACTIVE, subnets=['ce735290-31b9-4af4-844a-71b2fcf68031', 'd7466e39-52ac-4561-ade5-b21553db79fb'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:35Z, vlan_transparent=None, network_id=744b5a82-3c5c-4b41-ba44-527244a209c4, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b'], standard_attr_id=2574, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:39Z on network 744b5a82-3c5c-4b41-ba44-527244a209c4
Nov 28 10:07:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v397: 177 pgs: 177 active+clean; 225 MiB data, 1023 MiB used, 41 GiB / 42 GiB avail; 1.6 KiB/s rd, 26 KiB/s wr, 4 op/s
Nov 28 10:07:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:07:39 np0005538515.localdomain podman[320513]: 2025-11-28 10:07:39.983202134 +0000 UTC m=+0.086988476 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:07:40 np0005538515.localdomain podman[320513]: 2025-11-28 10:07:40.017678108 +0000 UTC m=+0.121464390 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:07:40 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:07:40 np0005538515.localdomain dnsmasq[320512]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 1 addresses
Nov 28 10:07:40 np0005538515.localdomain dnsmasq-dhcp[320512]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:40 np0005538515.localdomain dnsmasq-dhcp[320512]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:40 np0005538515.localdomain podman[320552]: 2025-11-28 10:07:40.15021633 +0000 UTC m=+0.062198172 container kill cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:07:40 np0005538515.localdomain systemd[1]: tmp-crun.m41PEJ.mount: Deactivated successfully.
Nov 28 10:07:40 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:40.493 261346 INFO neutron.agent.dhcp.agent [None req-f0fdf4cc-1f8a-4bf6-a800-da6a955dab3d - - - - - -] DHCP configuration for ports {'ef709d14-bdfb-4122-9587-c257ef31d183'} is completed
Nov 28 10:07:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d", "format": "json"}]: dispatch
Nov 28 10:07:40 np0005538515.localdomain ceph-mon[301134]: osdmap e199: 6 total, 6 up, 6 in
Nov 28 10:07:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e200 e200: 6 total, 6 up, 6 in
Nov 28 10:07:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6", "format": "json"}]: dispatch
Nov 28 10:07:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:1ad625c7-a9e5-4671-b5b8-87ed0dc833b6, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:1ad625c7-a9e5-4671-b5b8-87ed0dc833b6, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:41 np0005538515.localdomain ceph-mon[301134]: pgmap v397: 177 pgs: 177 active+clean; 225 MiB data, 1023 MiB used, 41 GiB / 42 GiB avail; 1.6 KiB/s rd, 26 KiB/s wr, 4 op/s
Nov 28 10:07:41 np0005538515.localdomain ceph-mon[301134]: osdmap e200: 6 total, 6 up, 6 in
Nov 28 10:07:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e201 e201: 6 total, 6 up, 6 in
Nov 28 10:07:41 np0005538515.localdomain dnsmasq[320512]: exiting on receipt of SIGTERM
Nov 28 10:07:41 np0005538515.localdomain podman[320590]: 2025-11-28 10:07:41.561456024 +0000 UTC m=+0.065296126 container kill cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:07:41 np0005538515.localdomain systemd[1]: libpod-cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b.scope: Deactivated successfully.
Nov 28 10:07:41 np0005538515.localdomain podman[320605]: 2025-11-28 10:07:41.636100579 +0000 UTC m=+0.059925381 container died cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:07:41 np0005538515.localdomain systemd[1]: tmp-crun.VVRtLF.mount: Deactivated successfully.
Nov 28 10:07:41 np0005538515.localdomain podman[320605]: 2025-11-28 10:07:41.668454528 +0000 UTC m=+0.092279260 container cleanup cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:07:41 np0005538515.localdomain systemd[1]: libpod-conmon-cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b.scope: Deactivated successfully.
Nov 28 10:07:41 np0005538515.localdomain podman[320606]: 2025-11-28 10:07:41.711504976 +0000 UTC m=+0.128933771 container remove cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:07:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v400: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 35 KiB/s wr, 184 op/s
Nov 28 10:07:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:42.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6", "format": "json"}]: dispatch
Nov 28 10:07:42 np0005538515.localdomain ceph-mon[301134]: osdmap e201: 6 total, 6 up, 6 in
Nov 28 10:07:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-c0012342cf8bd08ca02fc301bcea34a05720aed503942f357e7119093cd5e2b7-merged.mount: Deactivated successfully.
Nov 28 10:07:42 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb475ecb689927ab91959b49b410b28fc5f8205ec91836dd9cfa49a3750f631b-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e202 e202: 6 total, 6 up, 6 in
Nov 28 10:07:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:43.156 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:43.490 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:43.492 158530 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated
Nov 28 10:07:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:43.495 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Port a03547c5-d094-4489-b8d5-6b024d72dbea IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 10:07:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:43.495 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:43 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:43.496 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[9d322294-e6d7-4255-8886-c5369ddec8fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:43 np0005538515.localdomain ceph-mon[301134]: pgmap v400: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 35 KiB/s wr, 184 op/s
Nov 28 10:07:43 np0005538515.localdomain ceph-mon[301134]: osdmap e202: 6 total, 6 up, 6 in
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v402: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 32 KiB/s wr, 170 op/s
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d_c6ebf898-e639-41ae-92f7-1c129b5ee100", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a7f380a2-0f60-4351-b764-bd78832f244d_c6ebf898-e639-41ae-92f7-1c129b5ee100, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta.tmp'
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta.tmp' to config b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta'
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a7f380a2-0f60-4351-b764-bd78832f244d_c6ebf898-e639-41ae-92f7-1c129b5ee100, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a7f380a2-0f60-4351-b764-bd78832f244d, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta.tmp'
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta.tmp' to config b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806/.meta'
Nov 28 10:07:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a7f380a2-0f60-4351-b764-bd78832f244d, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:43 np0005538515.localdomain podman[320648]: 2025-11-28 10:07:43.983787411 +0000 UTC m=+0.092128905 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:07:44 np0005538515.localdomain podman[320648]: 2025-11-28 10:07:44.025408746 +0000 UTC m=+0.133750250 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:07:44 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:07:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:44.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:44.334 2 INFO neutron.agent.securitygroups_rpc [None req-cecabd37-7803-4c2d-a13a-d3905bbc0cfc 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['18fcd73e-4837-425f-bf44-9ed4ac2aa187', '58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b', 'bec6547e-445f-4500-b371-6e2fc240d4db']
Nov 28 10:07:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:44.507 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:44 np0005538515.localdomain ceph-mon[301134]: pgmap v402: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 32 KiB/s wr, 170 op/s
Nov 28 10:07:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d_c6ebf898-e639-41ae-92f7-1c129b5ee100", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1359194721' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "snap_name": "a7f380a2-0f60-4351-b764-bd78832f244d", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e203 e203: 6 total, 6 up, 6 in
Nov 28 10:07:44 np0005538515.localdomain podman[320701]: 
Nov 28 10:07:44 np0005538515.localdomain podman[320701]: 2025-11-28 10:07:44.645184658 +0000 UTC m=+0.098128080 container create ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:07:44 np0005538515.localdomain systemd[1]: Started libpod-conmon-ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d.scope.
Nov 28 10:07:44 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:44 np0005538515.localdomain podman[320701]: 2025-11-28 10:07:44.59534079 +0000 UTC m=+0.048284232 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:44 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbb36c4db1eaf4ba887ed8bd7f5126f0d10ffb93de60f53a93db5d592a36f5f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:44 np0005538515.localdomain podman[320701]: 2025-11-28 10:07:44.705005136 +0000 UTC m=+0.157948548 container init ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:07:44 np0005538515.localdomain podman[320701]: 2025-11-28 10:07:44.713321002 +0000 UTC m=+0.166264414 container start ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:07:44 np0005538515.localdomain dnsmasq[320719]: started, version 2.85 cachesize 150
Nov 28 10:07:44 np0005538515.localdomain dnsmasq[320719]: DNS service limited to local subnets
Nov 28 10:07:44 np0005538515.localdomain dnsmasq[320719]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:44 np0005538515.localdomain dnsmasq[320719]: warning: no upstream servers configured
Nov 28 10:07:44 np0005538515.localdomain dnsmasq-dhcp[320719]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 10:07:44 np0005538515.localdomain dnsmasq-dhcp[320719]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 28 10:07:44 np0005538515.localdomain dnsmasq-dhcp[320719]: DHCP, static leases only on 10.100.0.32, lease time 1d
Nov 28 10:07:44 np0005538515.localdomain dnsmasq[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 1 addresses
Nov 28 10:07:44 np0005538515.localdomain dnsmasq-dhcp[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:44 np0005538515.localdomain dnsmasq-dhcp[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:44 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:44.748 2 INFO neutron.agent.securitygroups_rpc [None req-d99eee51-2169-45db-889c-fcddfc1e6db2 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:44 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:44.787 261346 INFO neutron.agent.dhcp.agent [None req-44e1ce0c-38da-4b03-a4d9-0628bced247c - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:39Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce426ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce426220>], id=ef709d14-bdfb-4122-9587-c257ef31d183, ip_allocation=immediate, mac_address=fa:16:3e:6b:0f:d4, name=tempest-PortsTestJSON-1243252809, network_id=744b5a82-3c5c-4b41-ba44-527244a209c4, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['18fcd73e-4837-425f-bf44-9ed4ac2aa187', 'bec6547e-445f-4500-b371-6e2fc240d4db'], standard_attr_id=2574, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:07:44Z on network 744b5a82-3c5c-4b41-ba44-527244a209c4
Nov 28 10:07:44 np0005538515.localdomain dnsmasq-dhcp[320719]: DHCPRELEASE(tapd42f5d61-af) 10.100.0.10 fa:16:3e:6b:0f:d4
Nov 28 10:07:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:45.595 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e204 e204: 6 total, 6 up, 6 in
Nov 28 10:07:45 np0005538515.localdomain ceph-mon[301134]: osdmap e203: 6 total, 6 up, 6 in
Nov 28 10:07:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v405: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 32 KiB/s wr, 172 op/s
Nov 28 10:07:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d", "format": "json"}]: dispatch
Nov 28 10:07:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:2c0f5770-a785-4c87-af5d-f92c8abce21d, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:2c0f5770-a785-4c87-af5d-f92c8abce21d, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:45.980 261346 INFO neutron.agent.dhcp.agent [None req-ffa45ac7-8367-42e7-83fc-786aae64ad4b - - - - - -] DHCP configuration for ports {'530dc798-1c6a-4c38-a11f-57f3818e5561', 'd42f5d61-afe1-455f-b448-86993094b244', 'ef709d14-bdfb-4122-9587-c257ef31d183', '636c21fa-d6bd-405e-95ed-d59498827d6f'} is completed
Nov 28 10:07:46 np0005538515.localdomain systemd[1]: tmp-crun.64rDsN.mount: Deactivated successfully.
Nov 28 10:07:46 np0005538515.localdomain dnsmasq[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 1 addresses
Nov 28 10:07:46 np0005538515.localdomain podman[320738]: 2025-11-28 10:07:46.073016566 +0000 UTC m=+0.072644324 container kill ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:46 np0005538515.localdomain dnsmasq-dhcp[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:46 np0005538515.localdomain dnsmasq-dhcp[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:46.126 2 INFO neutron.agent.securitygroups_rpc [None req-ff6bd390-74ec-4285-8021-b1b301c7b944 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['18fcd73e-4837-425f-bf44-9ed4ac2aa187', 'bec6547e-445f-4500-b371-6e2fc240d4db']
Nov 28 10:07:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:46.209 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:46.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:46.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:46.343 261346 INFO neutron.agent.dhcp.agent [None req-011dc99e-6b5b-4adf-8995-9c08519a6a94 - - - - - -] DHCP configuration for ports {'ef709d14-bdfb-4122-9587-c257ef31d183'} is completed
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e205 e205: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538515.localdomain dnsmasq[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 0 addresses
Nov 28 10:07:46 np0005538515.localdomain dnsmasq-dhcp[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:46 np0005538515.localdomain dnsmasq-dhcp[320719]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:46 np0005538515.localdomain podman[320776]: 2025-11-28 10:07:46.556131449 +0000 UTC m=+0.049811568 container kill ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: osdmap e204: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: pgmap v405: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 32 KiB/s wr, 172 op/s
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d", "format": "json"}]: dispatch
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: osdmap e205: 6 total, 6 up, 6 in
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:46 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:46.878 2 INFO neutron.agent.securitygroups_rpc [None req-02f5cc80-e4fe-45b5-9f60-d631f95eb2c9 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']
Nov 28 10:07:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:47.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "format": "json"}]: dispatch
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:07:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:07:47.545+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '80ccc338-1d8e-4716-ba2a-1f18e7a6e806' of type subvolume
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '80ccc338-1d8e-4716-ba2a-1f18e7a6e806' of type subvolume
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/80ccc338-1d8e-4716-ba2a-1f18e7a6e806'' moved to trashcan
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:80ccc338-1d8e-4716-ba2a-1f18e7a6e806, vol_name:cephfs) < ""
Nov 28 10:07:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e206 e206: 6 total, 6 up, 6 in
Nov 28 10:07:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "format": "json"}]: dispatch
Nov 28 10:07:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "80ccc338-1d8e-4716-ba2a-1f18e7a6e806", "force": true, "format": "json"}]: dispatch
Nov 28 10:07:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v408: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 142 KiB/s rd, 30 KiB/s wr, 198 op/s
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.200 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.341 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.342 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.377 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:48.378 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:48.379 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.398 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.399 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.399 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.399 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.400 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:07:48 np0005538515.localdomain ceph-mon[301134]: osdmap e206: 6 total, 6 up, 6 in
Nov 28 10:07:48 np0005538515.localdomain ceph-mon[301134]: pgmap v408: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 142 KiB/s rd, 30 KiB/s wr, 198 op/s
Nov 28 10:07:48 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1635609151' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:07:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1383712359' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:48.848 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.075 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.078 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11506MB free_disk=41.70003890991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.078 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.079 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.543 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e207 e207: 6 total, 6 up, 6 in
Nov 28 10:07:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1383712359' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.786 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.786 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:07:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:49.808 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:07:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v410: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 135 KiB/s rd, 29 KiB/s wr, 188 op/s
Nov 28 10:07:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a", "format": "json"}]: dispatch
Nov 28 10:07:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:46e657b3-6b45-48d1-8f94-979e6670605a, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:46e657b3-6b45-48d1-8f94-979e6670605a, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:50 np0005538515.localdomain dnsmasq[320719]: exiting on receipt of SIGTERM
Nov 28 10:07:50 np0005538515.localdomain systemd[1]: libpod-ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d.scope: Deactivated successfully.
Nov 28 10:07:50 np0005538515.localdomain podman[320857]: 2025-11-28 10:07:50.084507859 +0000 UTC m=+0.060270532 container kill ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:07:50 np0005538515.localdomain podman[320870]: 2025-11-28 10:07:50.129952902 +0000 UTC m=+0.031688239 container died ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 10:07:50 np0005538515.localdomain podman[320870]: 2025-11-28 10:07:50.158767421 +0000 UTC m=+0.060502768 container cleanup ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 10:07:50 np0005538515.localdomain systemd[1]: libpod-conmon-ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d.scope: Deactivated successfully.
Nov 28 10:07:50 np0005538515.localdomain podman[320877]: 2025-11-28 10:07:50.197082344 +0000 UTC m=+0.087260984 container remove ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/674541035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:50.257 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:07:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:50.263 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:07:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:50.396 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:07:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:50.399 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:07:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:50.399 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3345109558' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: osdmap e207: 6 total, 6 up, 6 in
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: pgmap v410: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 135 KiB/s rd, 29 KiB/s wr, 188 op/s
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a", "format": "json"}]: dispatch
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1418648319' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/674541035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3345109558' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:07:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:50.851 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:07:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:50.852 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:07:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:50.852 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:07:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-cbb36c4db1eaf4ba887ed8bd7f5126f0d10ffb93de60f53a93db5d592a36f5f2-merged.mount: Deactivated successfully.
Nov 28 10:07:51 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ade539c5587c5a7549a112e7f1f7660e83ca9e8302a21cb1eea33af2732f279d-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:51.297 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:51.298 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:51.298 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:07:51 np0005538515.localdomain podman[320948]: 
Nov 28 10:07:51 np0005538515.localdomain podman[320948]: 2025-11-28 10:07:51.407409807 +0000 UTC m=+0.087524053 container create 25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:07:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e208 e208: 6 total, 6 up, 6 in
Nov 28 10:07:51 np0005538515.localdomain systemd[1]: Started libpod-conmon-25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892.scope.
Nov 28 10:07:51 np0005538515.localdomain podman[320948]: 2025-11-28 10:07:51.365033959 +0000 UTC m=+0.045148255 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 10:07:51 np0005538515.localdomain systemd[1]: Started libcrun container.
Nov 28 10:07:51 np0005538515.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2034494d726fb92689f42123b6bec02a92e71b6d85a0adb34bb7e76ef306caf4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 10:07:51 np0005538515.localdomain podman[320948]: 2025-11-28 10:07:51.503972028 +0000 UTC m=+0.184086274 container init 25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 10:07:51 np0005538515.localdomain podman[320948]: 2025-11-28 10:07:51.511038426 +0000 UTC m=+0.191152672 container start 25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:07:51 np0005538515.localdomain dnsmasq[320966]: started, version 2.85 cachesize 150
Nov 28 10:07:51 np0005538515.localdomain dnsmasq[320966]: DNS service limited to local subnets
Nov 28 10:07:51 np0005538515.localdomain dnsmasq[320966]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 10:07:51 np0005538515.localdomain dnsmasq[320966]: warning: no upstream servers configured
Nov 28 10:07:51 np0005538515.localdomain dnsmasq-dhcp[320966]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 28 10:07:51 np0005538515.localdomain dnsmasq-dhcp[320966]: DHCP, static leases only on 10.100.0.32, lease time 1d
Nov 28 10:07:51 np0005538515.localdomain dnsmasq[320966]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/addn_hosts - 0 addresses
Nov 28 10:07:51 np0005538515.localdomain dnsmasq-dhcp[320966]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/host
Nov 28 10:07:51 np0005538515.localdomain dnsmasq-dhcp[320966]: read /var/lib/neutron/dhcp/744b5a82-3c5c-4b41-ba44-527244a209c4/opts
Nov 28 10:07:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v412: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 185 KiB/s rd, 41 KiB/s wr, 254 op/s
Nov 28 10:07:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1126093872' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:51 np0005538515.localdomain ceph-mon[301134]: osdmap e208: 6 total, 6 up, 6 in
Nov 28 10:07:51 np0005538515.localdomain dnsmasq[320966]: exiting on receipt of SIGTERM
Nov 28 10:07:51 np0005538515.localdomain podman[320984]: 2025-11-28 10:07:51.888335133 +0000 UTC m=+0.065044339 container kill 25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:07:51 np0005538515.localdomain systemd[1]: libpod-25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892.scope: Deactivated successfully.
Nov 28 10:07:51 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:51.951 261346 INFO neutron.agent.dhcp.agent [None req-25e55253-a679-4a11-8582-8826d1bd0a82 - - - - - -] DHCP configuration for ports {'530dc798-1c6a-4c38-a11f-57f3818e5561', 'd42f5d61-afe1-455f-b448-86993094b244', '636c21fa-d6bd-405e-95ed-d59498827d6f'} is completed
Nov 28 10:07:51 np0005538515.localdomain podman[320999]: 2025-11-28 10:07:51.965733242 +0000 UTC m=+0.055952669 container died 25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 10:07:52 np0005538515.localdomain podman[320999]: 2025-11-28 10:07:52.006289744 +0000 UTC m=+0.096509131 container remove 25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-744b5a82-3c5c-4b41-ba44-527244a209c4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:07:52 np0005538515.localdomain systemd[1]: libpod-conmon-25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892.scope: Deactivated successfully.
Nov 28 10:07:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:52.060 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:52 np0005538515.localdomain kernel: device tapd42f5d61-af left promiscuous mode
Nov 28 10:07:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:52Z|00189|binding|INFO|Releasing lport d42f5d61-afe1-455f-b448-86993094b244 from this chassis (sb_readonly=0)
Nov 28 10:07:52 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:07:52Z|00190|binding|INFO|Setting lport d42f5d61-afe1-455f-b448-86993094b244 down in Southbound
Nov 28 10:07:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-2034494d726fb92689f42123b6bec02a92e71b6d85a0adb34bb7e76ef306caf4-merged.mount: Deactivated successfully.
Nov 28 10:07:52 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25d273de97282d954a575896c1c9bb0181ff4d364d50bff62776b0a3d7e26892-userdata-shm.mount: Deactivated successfully.
Nov 28 10:07:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:52.087 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:52.094 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=d42f5d61-afe1-455f-b448-86993094b244) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:07:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:52.096 158530 INFO neutron.agent.ovn.metadata.agent [-] Port d42f5d61-afe1-455f-b448-86993094b244 in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 unbound from our chassis
Nov 28 10:07:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:52.098 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:07:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:52.099 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[db4d9133-6660-43ee-916f-c951d5dce369]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:07:52 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:07:52.380 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:07:52 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2d744b5a82\x2d3c5c\x2d4b41\x2dba44\x2d527244a209c4.mount: Deactivated successfully.
Nov 28 10:07:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:52.411 261346 INFO neutron.agent.dhcp.agent [None req-93d85043-a83d-4963-a680-6331aca86742 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:52 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e209 e209: 6 total, 6 up, 6 in
Nov 28 10:07:52 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:52.459 2 INFO neutron.agent.securitygroups_rpc [None req-6326bf78-d29d-46c0-b3b7-72df824a50bd 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 10:07:52 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:52.491 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:52 np0005538515.localdomain ceph-mon[301134]: pgmap v412: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 185 KiB/s rd, 41 KiB/s wr, 254 op/s
Nov 28 10:07:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:07:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:07:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2095041212' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:07:52 np0005538515.localdomain ceph-mon[301134]: osdmap e209: 6 total, 6 up, 6 in
Nov 28 10:07:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:53.203 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:53 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:07:53.485 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:07:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v414: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 17 KiB/s wr, 96 op/s
Nov 28 10:07:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e210 e210: 6 total, 6 up, 6 in
Nov 28 10:07:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:53.952 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c", "format": "json"}]: dispatch
Nov 28 10:07:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:a5860537-4c07-4bc9-9db5-8ed95b4a5b7c, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:a5860537-4c07-4bc9-9db5-8ed95b4a5b7c, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:54.546 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:54 np0005538515.localdomain ceph-mon[301134]: pgmap v414: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 17 KiB/s wr, 96 op/s
Nov 28 10:07:54 np0005538515.localdomain ceph-mon[301134]: osdmap e210: 6 total, 6 up, 6 in
Nov 28 10:07:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c", "format": "json"}]: dispatch
Nov 28 10:07:54 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e211 e211: 6 total, 6 up, 6 in
Nov 28 10:07:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v417: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 23 KiB/s wr, 131 op/s
Nov 28 10:07:55 np0005538515.localdomain ceph-mon[301134]: osdmap e211: 6 total, 6 up, 6 in
Nov 28 10:07:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e212 e212: 6 total, 6 up, 6 in
Nov 28 10:07:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:07:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:07:56 np0005538515.localdomain podman[321025]: 2025-11-28 10:07:56.975502383 +0000 UTC m=+0.081512217 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter)
Nov 28 10:07:56 np0005538515.localdomain podman[321025]: 2025-11-28 10:07:56.991567838 +0000 UTC m=+0.097577632 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 10:07:57 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:07:57 np0005538515.localdomain ceph-mon[301134]: pgmap v417: 177 pgs: 177 active+clean; 225 MiB data, 1016 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 23 KiB/s wr, 131 op/s
Nov 28 10:07:57 np0005538515.localdomain ceph-mon[301134]: osdmap e212: 6 total, 6 up, 6 in
Nov 28 10:07:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2", "format": "json"}]: dispatch
Nov 28 10:07:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:3e64b882-8f97-4fa5-8cd3-e92a05aac6b2, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:3e64b882-8f97-4fa5-8cd3-e92a05aac6b2, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:07:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:07:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:07:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v419: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 11 KiB/s wr, 73 op/s
Nov 28 10:07:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:58.206 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2", "format": "json"}]: dispatch
Nov 28 10:07:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:07:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:07:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:07:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158154 "" "Go-http-client/1.1"
Nov 28 10:07:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:07:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19685 "" "Go-http-client/1.1"
Nov 28 10:07:59 np0005538515.localdomain ceph-mon[301134]: pgmap v419: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 11 KiB/s wr, 73 op/s
Nov 28 10:07:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:07:59.548 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:07:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v420: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 10 KiB/s wr, 66 op/s
Nov 28 10:07:59 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:07:59.980 2 INFO neutron.agent.securitygroups_rpc [None req-db55947c-11ec-46cf-8b4d-c6d9fdfd5571 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['6089c18b-265b-455e-adb1-d3701c826867']
Nov 28 10:08:00 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:00.376 2 INFO neutron.agent.securitygroups_rpc [None req-c7648c00-3d9b-484d-a8fa-078b96af4727 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['6089c18b-265b-455e-adb1-d3701c826867']
Nov 28 10:08:00 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:00.430 2 INFO neutron.agent.securitygroups_rpc [req-b7f1a51c-049a-4a1b-9cd5-f5d4f2c3744c req-dd3110e0-51f5-4337-ac22-6a0998bcc00c 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group member updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:08:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:08:00 np0005538515.localdomain podman[321065]: 2025-11-28 10:08:00.689774931 +0000 UTC m=+0.052672287 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:08:00 np0005538515.localdomain systemd[1]: tmp-crun.gspECf.mount: Deactivated successfully.
Nov 28 10:08:00 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 1 addresses
Nov 28 10:08:00 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:08:00 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2_36a63ece-b219-4438-a9c8-be964316c461", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3e64b882-8f97-4fa5-8cd3-e92a05aac6b2_36a63ece-b219-4438-a9c8-be964316c461, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3e64b882-8f97-4fa5-8cd3-e92a05aac6b2_36a63ece-b219-4438-a9c8-be964316c461, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3e64b882-8f97-4fa5-8cd3-e92a05aac6b2, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3e64b882-8f97-4fa5-8cd3-e92a05aac6b2, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:01.210 2 INFO neutron.agent.securitygroups_rpc [None req-565e0ad3-78a4-4813-9446-c55fe79fb3b2 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d7b997f9-4b8e-48df-a7bc-cf1a88435b19']
Nov 28 10:08:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:01.339 2 INFO neutron.agent.securitygroups_rpc [None req-51f1f60e-f6ad-4b91-b7b5-2db567d0c6ed 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:01.402 2 INFO neutron.agent.securitygroups_rpc [None req-84e0009b-c65d-4325-9480-9689cb3fdfb2 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d7b997f9-4b8e-48df-a7bc-cf1a88435b19']
Nov 28 10:08:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e213 e213: 6 total, 6 up, 6 in
Nov 28 10:08:01 np0005538515.localdomain ceph-mon[301134]: pgmap v420: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 10 KiB/s wr, 66 op/s
Nov 28 10:08:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2546522305' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:01 np0005538515.localdomain ceph-mon[301134]: osdmap e213: 6 total, 6 up, 6 in
Nov 28 10:08:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:01.557 2 INFO neutron.agent.securitygroups_rpc [None req-f01fa5c9-de49-46e8-bc93-b89c3fed3f57 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:01.779 2 INFO neutron.agent.securitygroups_rpc [None req-05d81c06-d401-4205-8dff-14a86090a368 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v422: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 29 KiB/s wr, 107 op/s
Nov 28 10:08:01 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:01.944 2 INFO neutron.agent.securitygroups_rpc [None req-22968046-714b-4dc6-9b18-f55692eae2e8 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:02.115 2 INFO neutron.agent.securitygroups_rpc [None req-1c9f5803-20cb-4e69-804a-0617a068dfc7 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:02.258 2 INFO neutron.agent.securitygroups_rpc [None req-2bab2459-ad99-4514-94d3-affbaffcf884 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2_36a63ece-b219-4438-a9c8-be964316c461", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "3e64b882-8f97-4fa5-8cd3-e92a05aac6b2", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:02.788 2 INFO neutron.agent.securitygroups_rpc [None req-c50c53a2-fd90-43a9-8f31-8eb2d8fc3f23 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:02 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:02.989 2 INFO neutron.agent.securitygroups_rpc [None req-be6115b0-0729-4a7d-96a1-2edd3462b47c 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.011 2 INFO neutron.agent.securitygroups_rpc [None req-bd38caac-e12b-4351-8e6f-97a983a9711e 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.184 2 INFO neutron.agent.securitygroups_rpc [None req-edb67cac-fd5d-4927-8f94-9c49e0c903d7 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:03.208 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.338 2 INFO neutron.agent.securitygroups_rpc [None req-20312780-c455-466a-b7ab-f675d7eabde1 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.429 2 INFO neutron.agent.securitygroups_rpc [None req-14a5f88b-dc6d-4539-ba50-a6898f3db0a1 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.482 2 INFO neutron.agent.securitygroups_rpc [None req-13772359-a1cd-4764-8f15-d95cc5fef900 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538515.localdomain ceph-mon[301134]: pgmap v422: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 29 KiB/s wr, 107 op/s
Nov 28 10:08:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v423: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 25 KiB/s wr, 93 op/s
Nov 28 10:08:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1986905182' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.875 2 INFO neutron.agent.securitygroups_rpc [None req-6ebeaf50-19b9-453a-b8c6-1888da8279d8 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1986905182' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.894 2 INFO neutron.agent.securitygroups_rpc [None req-13ec4ccc-0950-4875-9a0e-1e0334716892 f56d2237e5b74576a33d9840c9346817 9ce143270a4649669232b53b6a44e4ba - - default default] Security group member updated ['f3e50b86-f5a6-4339-897f-e9e754c264f3']
Nov 28 10:08:03 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:03.898 2 INFO neutron.agent.securitygroups_rpc [None req-a9fad0b5-d55e-463a-8ffe-8066b04b35f2 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:04.239 2 INFO neutron.agent.securitygroups_rpc [None req-179f04c6-9ef2-42e7-8385-4b9a01b4f84f 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:04.280 2 INFO neutron.agent.securitygroups_rpc [None req-703ffdf6-c471-414a-b1d5-827cd1496408 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c_b0090dae-0a99-4947-ac5e-d28013b7f627", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a5860537-4c07-4bc9-9db5-8ed95b4a5b7c_b0090dae-0a99-4947-ac5e-d28013b7f627, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a5860537-4c07-4bc9-9db5-8ed95b4a5b7c_b0090dae-0a99-4947-ac5e-d28013b7f627, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a5860537-4c07-4bc9-9db5-8ed95b4a5b7c, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:a5860537-4c07-4bc9-9db5-8ed95b4a5b7c, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:04.551 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:04.659 2 INFO neutron.agent.securitygroups_rpc [None req-0ac9408a-42d9-4a9d-8ec9-d45f37e64efb 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:04 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:04.683 2 INFO neutron.agent.securitygroups_rpc [None req-1da31546-dc0a-4e96-b9fc-5fecbaaa8ced 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 10:08:04 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1986905182' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:04 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1986905182' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e214 e214: 6 total, 6 up, 6 in
Nov 28 10:08:05 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:05.187 2 INFO neutron.agent.securitygroups_rpc [None req-7ed5809e-b041-4ea1-b265-703b22d08b78 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:08:05
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['backups', 'vms', 'manila_data', 'manila_metadata', 'images', '.mgr', 'volumes']
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v425: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 18 KiB/s wr, 43 op/s
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 2.453674623115578e-06 of space, bias 1.0, pg target 0.0004899170330820771 quantized to 32 (current 32)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 8.17891541038526e-07 of space, bias 1.0, pg target 0.00016276041666666666 quantized to 32 (current 32)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 4.3348251675041876e-05 of space, bias 4.0, pg target 0.034505208333333336 quantized to 16 (current 16)
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:08:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:08:06 np0005538515.localdomain ceph-mon[301134]: pgmap v423: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 25 KiB/s wr, 93 op/s
Nov 28 10:08:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c_b0090dae-0a99-4947-ac5e-d28013b7f627", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "a5860537-4c07-4bc9-9db5-8ed95b4a5b7c", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:06 np0005538515.localdomain ceph-mon[301134]: osdmap e214: 6 total, 6 up, 6 in
Nov 28 10:08:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e215 e215: 6 total, 6 up, 6 in
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:56764117-bba7-4a1d-bc16-3de8a089b757, vol_name:cephfs) < ""
Nov 28 10:08:06 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:06.536 2 INFO neutron.agent.securitygroups_rpc [None req-c3c34998-5ba8-4c09-bfd4-af4362061351 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['77da4666-3c7e-4eb4-bd89-e0f6bc0cfb77']
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/56764117-bba7-4a1d-bc16-3de8a089b757/.meta.tmp'
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/56764117-bba7-4a1d-bc16-3de8a089b757/.meta.tmp' to config b'/volumes/_nogroup/56764117-bba7-4a1d-bc16-3de8a089b757/.meta'
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:56764117-bba7-4a1d-bc16-3de8a089b757, vol_name:cephfs) < ""
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "format": "json"}]: dispatch
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:56764117-bba7-4a1d-bc16-3de8a089b757, vol_name:cephfs) < ""
Nov 28 10:08:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:56764117-bba7-4a1d-bc16-3de8a089b757, vol_name:cephfs) < ""
Nov 28 10:08:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:06 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:06.838 2 INFO neutron.agent.securitygroups_rpc [None req-cf7500dc-e72d-4341-8bc2-ebaed58e7094 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['2521adb0-8644-4922-aaf5-9462c312df8d']
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e216 e216: 6 total, 6 up, 6 in
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: pgmap v425: 177 pgs: 177 active+clean; 146 MiB data, 899 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 18 KiB/s wr, 43 op/s
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: osdmap e215: 6 total, 6 up, 6 in
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1480453872' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1480453872' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a_4c7473fe-0e78-460c-a4b8-b3878dfb39c2", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:46e657b3-6b45-48d1-8f94-979e6670605a_4c7473fe-0e78-460c-a4b8-b3878dfb39c2, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:46e657b3-6b45-48d1-8f94-979e6670605a_4c7473fe-0e78-460c-a4b8-b3878dfb39c2, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:46e657b3-6b45-48d1-8f94-979e6670605a, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:46e657b3-6b45-48d1-8f94-979e6670605a, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: [devicehealth INFO root] Check health
Nov 28 10:08:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v428: 177 pgs: 2 active+clean+snaptrim, 5 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 30 KiB/s wr, 61 op/s
Nov 28 10:08:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:08:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:08:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:08:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:08:07 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:07.949 2 INFO neutron.agent.securitygroups_rpc [None req-410f4b75-8a53-43bd-aeb3-73827c1fb9d5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['cc6c7909-68f3-4243-ad85-ca295b324967']
Nov 28 10:08:07 np0005538515.localdomain podman[321089]: 2025-11-28 10:08:07.976090788 +0000 UTC m=+0.081079274 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 28 10:08:07 np0005538515.localdomain systemd[1]: tmp-crun.cB8lIP.mount: Deactivated successfully.
Nov 28 10:08:08 np0005538515.localdomain podman[321102]: 2025-11-28 10:08:08.001678237 +0000 UTC m=+0.095250981 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:08:08 np0005538515.localdomain podman[321089]: 2025-11-28 10:08:08.03350323 +0000 UTC m=+0.138491726 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Nov 28 10:08:08 np0005538515.localdomain podman[321102]: 2025-11-28 10:08:08.042879659 +0000 UTC m=+0.136452443 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:08:08 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:08:08 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "format": "json"}]: dispatch
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: osdmap e216: 6 total, 6 up, 6 in
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1480453872' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1480453872' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a_4c7473fe-0e78-460c-a4b8-b3878dfb39c2", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3398271039' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3398271039' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:08 np0005538515.localdomain podman[321090]: 2025-11-28 10:08:08.093253174 +0000 UTC m=+0.189752658 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:08:08 np0005538515.localdomain podman[321088]: 2025-11-28 10:08:08.047172492 +0000 UTC m=+0.151917381 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:08:08 np0005538515.localdomain podman[321090]: 2025-11-28 10:08:08.125421557 +0000 UTC m=+0.221921021 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:08:08 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:08:08 np0005538515.localdomain podman[321088]: 2025-11-28 10:08:08.180312161 +0000 UTC m=+0.285057030 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:08:08 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:08:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:08.209 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:08.429 2 INFO neutron.agent.securitygroups_rpc [None req-9e0fb103-8bda-4ed9-896b-20548b225439 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['cc6c7909-68f3-4243-ad85-ca295b324967']
Nov 28 10:08:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:08.643 2 INFO neutron.agent.securitygroups_rpc [None req-5e0e412c-d85c-446d-af13-157bbc4d1b94 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['723b43a6-7c6c-4eb3-a519-023b34d9a2b5']
Nov 28 10:08:08 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:08.900 2 INFO neutron.agent.securitygroups_rpc [None req-aa17c84c-96b9-4f58-abde-f1675a957a10 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['723b43a6-7c6c-4eb3-a519-023b34d9a2b5']
Nov 28 10:08:09 np0005538515.localdomain podman[321187]: 2025-11-28 10:08:09.041471666 +0000 UTC m=+0.057886519 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 10:08:09 np0005538515.localdomain dnsmasq[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/addn_hosts - 0 addresses
Nov 28 10:08:09 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/host
Nov 28 10:08:09 np0005538515.localdomain dnsmasq-dhcp[316830]: read /var/lib/neutron/dhcp/b2c4ac07-8851-40d3-9495-d0489b67c4c3/opts
Nov 28 10:08:09 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "46e657b3-6b45-48d1-8f94-979e6670605a", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:09 np0005538515.localdomain ceph-mon[301134]: pgmap v428: 177 pgs: 2 active+clean+snaptrim, 5 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 30 KiB/s wr, 61 op/s
Nov 28 10:08:09 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:08:09Z|00191|binding|INFO|Releasing lport 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 from this chassis (sb_readonly=0)
Nov 28 10:08:09 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:08:09Z|00192|binding|INFO|Setting lport 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 down in Southbound
Nov 28 10:08:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:09.232 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:09 np0005538515.localdomain kernel: device tap0eb8bf5c-38 left promiscuous mode
Nov 28 10:08:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:09.246 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538515.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp2a75ae47-09f3-5db4-9c67-86b6e0e7c804-b2c4ac07-8851-40d3-9495-d0489b67c4c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c4ac07-8851-40d3-9495-d0489b67c4c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd3c0d1ce8d854a7b9ffc953e88cd2c44', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538515.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=940d6739-e1d9-4dcd-a724-785ba886c2af, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>], logical_port=0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd80e481be0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:08:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:09.248 158530 INFO neutron.agent.ovn.metadata.agent [-] Port 0eb8bf5c-382d-4ca7-a3fe-a3c5250492c6 in datapath b2c4ac07-8851-40d3-9495-d0489b67c4c3 unbound from our chassis
Nov 28 10:08:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:09.251 158530 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c4ac07-8851-40d3-9495-d0489b67c4c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 10:08:09 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:09.252 261619 DEBUG oslo.privsep.daemon [-] privsep: reply[cbadec4e-4b81-4c68-a408-8faa1724be15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 10:08:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:09.261 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:09.262 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:09.556 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:09 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:09.713 2 INFO neutron.agent.securitygroups_rpc [None req-e693dfaa-acf9-4ef6-91f0-fa32111d34d5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['89d85f46-caf0-4632-8f88-6aa2b20ffab5']
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "format": "json"}]: dispatch
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:56764117-bba7-4a1d-bc16-3de8a089b757, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:56764117-bba7-4a1d-bc16-3de8a089b757, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '56764117-bba7-4a1d-bc16-3de8a089b757' of type subvolume
Nov 28 10:08:09 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:09.778+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '56764117-bba7-4a1d-bc16-3de8a089b757' of type subvolume
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:56764117-bba7-4a1d-bc16-3de8a089b757, vol_name:cephfs) < ""
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/56764117-bba7-4a1d-bc16-3de8a089b757'' moved to trashcan
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:56764117-bba7-4a1d-bc16-3de8a089b757, vol_name:cephfs) < ""
Nov 28 10:08:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v429: 177 pgs: 2 active+clean+snaptrim, 5 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 30 KiB/s wr, 61 op/s
Nov 28 10:08:09 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:09.972 2 INFO neutron.agent.securitygroups_rpc [None req-40ac5e44-c334-4848-ad22-bdb0d1d393a9 f56d2237e5b74576a33d9840c9346817 9ce143270a4649669232b53b6a44e4ba - - default default] Security group member updated ['f3e50b86-f5a6-4339-897f-e9e754c264f3']
Nov 28 10:08:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:10.035 2 INFO neutron.agent.securitygroups_rpc [None req-aa8fcbbc-e15f-465d-a79b-54ba8e5d4dfa 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['89d85f46-caf0-4632-8f88-6aa2b20ffab5']
Nov 28 10:08:10 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:10.766 2 INFO neutron.agent.securitygroups_rpc [None req-652590f9-d0df-4db1-90c9-4125d049cb03 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['527f33ea-6583-4033-be5c-a5d3ccd20912']
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d_c59c3d50-8922-4eb8-a42f-dbb4854b184b", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2c0f5770-a785-4c87-af5d-f92c8abce21d_c59c3d50-8922-4eb8-a42f-dbb4854b184b, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:10 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2c0f5770-a785-4c87-af5d-f92c8abce21d_c59c3d50-8922-4eb8-a42f-dbb4854b184b, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2c0f5770-a785-4c87-af5d-f92c8abce21d, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:10 np0005538515.localdomain systemd[1]: tmp-crun.7A9SCX.mount: Deactivated successfully.
Nov 28 10:08:10 np0005538515.localdomain podman[321210]: 2025-11-28 10:08:10.970263247 +0000 UTC m=+0.077726190 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:11 np0005538515.localdomain podman[321210]: 2025-11-28 10:08:11.012449149 +0000 UTC m=+0.119912082 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:08:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:11.265 2 INFO neutron.agent.securitygroups_rpc [None req-8e5272cc-6346-4fdc-ba9b-b2622fce7146 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['527f33ea-6583-4033-be5c-a5d3ccd20912']
Nov 28 10:08:11 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:08:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "format": "json"}]: dispatch
Nov 28 10:08:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "56764117-bba7-4a1d-bc16-3de8a089b757", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:11 np0005538515.localdomain ceph-mon[301134]: pgmap v429: 177 pgs: 2 active+clean+snaptrim, 5 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 30 KiB/s wr, 61 op/s
Nov 28 10:08:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2c0f5770-a785-4c87-af5d-f92c8abce21d, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e217 e217: 6 total, 6 up, 6 in
Nov 28 10:08:11 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:08:11 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:08:11 np0005538515.localdomain podman[321250]: 2025-11-28 10:08:11.373876986 +0000 UTC m=+0.039752498 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:08:11 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:08:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e218 e218: 6 total, 6 up, 6 in
Nov 28 10:08:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:11.533 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:11 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:11.742 2 INFO neutron.agent.securitygroups_rpc [None req-fc93531f-0f28-40b2-be64-5c8ae1a05f2f 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v432: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 76 KiB/s wr, 126 op/s
Nov 28 10:08:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d_c59c3d50-8922-4eb8-a42f-dbb4854b184b", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "2c0f5770-a785-4c87-af5d-f92c8abce21d", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:12 np0005538515.localdomain ceph-mon[301134]: osdmap e217: 6 total, 6 up, 6 in
Nov 28 10:08:12 np0005538515.localdomain ceph-mon[301134]: osdmap e218: 6 total, 6 up, 6 in
Nov 28 10:08:12 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:12.431 2 INFO neutron.agent.securitygroups_rpc [None req-950a4a21-ae98-4893-86d1-a721665d75e4 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:12 np0005538515.localdomain podman[321287]: 2025-11-28 10:08:12.593426903 +0000 UTC m=+0.055072932 container kill a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:08:12 np0005538515.localdomain dnsmasq[316830]: exiting on receipt of SIGTERM
Nov 28 10:08:12 np0005538515.localdomain systemd[1]: libpod-a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1.scope: Deactivated successfully.
Nov 28 10:08:12 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:12.622 2 INFO neutron.agent.securitygroups_rpc [None req-70e65ce9-39cb-464e-830b-530df9be0aa7 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:12 np0005538515.localdomain podman[321300]: 2025-11-28 10:08:12.674885298 +0000 UTC m=+0.063603205 container died a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:08:12 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1-userdata-shm.mount: Deactivated successfully.
Nov 28 10:08:12 np0005538515.localdomain podman[321300]: 2025-11-28 10:08:12.706830414 +0000 UTC m=+0.095548281 container cleanup a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:08:12 np0005538515.localdomain systemd[1]: libpod-conmon-a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1.scope: Deactivated successfully.
Nov 28 10:08:12 np0005538515.localdomain podman[321301]: 2025-11-28 10:08:12.744409974 +0000 UTC m=+0.129221460 container remove a39214a4baa8262623303d314b8ed95b71c01a463bc2eabd06aba05950874fd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c4ac07-8851-40d3-9495-d0489b67c4c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:08:12 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:08:12.839 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:08:12 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:12.934 2 INFO neutron.agent.securitygroups_rpc [None req-75b9de2d-ffc0-4329-a7f5-226897331b9b 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:13.015 2 INFO neutron.agent.securitygroups_rpc [None req-d8139777-22d3-42a1-ac35-78ef0fdf0858 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:13.211 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:13 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:08:13.228 261346 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 10:08:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:13.259 2 INFO neutron.agent.securitygroups_rpc [None req-08ef0569-f479-4e5d-8704-2ba0f20ecb11 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1770151037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1770151037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:13.394 2 INFO neutron.agent.securitygroups_rpc [None req-a98371b3-db2f-4475-8ff8-c8019a0287c5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: pgmap v432: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 76 KiB/s wr, 126 op/s
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1770151037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1770151037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:13.580 2 INFO neutron.agent.securitygroups_rpc [None req-86361f15-5494-4722-ab9a-5dacf7fdc04d 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:13 np0005538515.localdomain systemd[1]: var-lib-containers-storage-overlay-a112523a391658be219e8ca2b94928afac8124141e68c4a75a8a0c64ca4d98f3-merged.mount: Deactivated successfully.
Nov 28 10:08:13 np0005538515.localdomain systemd[1]: run-netns-qdhcp\x2db2c4ac07\x2d8851\x2d40d3\x2d9495\x2dd0489b67c4c3.mount: Deactivated successfully.
Nov 28 10:08:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:13.694 2 INFO neutron.agent.securitygroups_rpc [None req-d46ea504-c7ea-4ca3-916c-fe85715f24c7 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v433: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 39 KiB/s wr, 54 op/s
Nov 28 10:08:13 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:13.895 2 INFO neutron.agent.securitygroups_rpc [None req-865b6df2-96d7-4677-b689-d1af588b7477 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:14 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:14.015 2 INFO neutron.agent.securitygroups_rpc [None req-447348cf-3ac4-498a-af94-5fdba49716f9 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6_a8a8759a-ecad-4b03-8e69-d4e30de4d63c", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1ad625c7-a9e5-4671-b5b8-87ed0dc833b6_a8a8759a-ecad-4b03-8e69-d4e30de4d63c, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1ad625c7-a9e5-4671-b5b8-87ed0dc833b6_a8a8759a-ecad-4b03-8e69-d4e30de4d63c, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1ad625c7-a9e5-4671-b5b8-87ed0dc833b6, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp'
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta.tmp' to config b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486/.meta'
Nov 28 10:08:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1ad625c7-a9e5-4671-b5b8-87ed0dc833b6, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:14 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:14.128 2 INFO neutron.agent.securitygroups_rpc [None req-e5d09537-743d-43a9-bf9b-ecb2832cda1b 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 10:08:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:14.560 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:08:14 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:14.898 2 INFO neutron.agent.securitygroups_rpc [None req-185e5877-2263-42be-9a4e-22f963823be4 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['dae70bc2-83a0-4e05-bc5e-659aa86d0528']
Nov 28 10:08:14 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:14.920 2 INFO neutron.agent.securitygroups_rpc [None req-d7a70732-d57d-42ae-a12a-793909186bfd 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d99d3754-6453-4c3f-8498-8ac20a4744c7']
Nov 28 10:08:14 np0005538515.localdomain podman[321328]: 2025-11-28 10:08:14.972444053 +0000 UTC m=+0.079646890 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 28 10:08:15 np0005538515.localdomain podman[321328]: 2025-11-28 10:08:15.012631343 +0000 UTC m=+0.119834140 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 28 10:08:15 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:08:15 np0005538515.localdomain ceph-mon[301134]: pgmap v433: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 39 KiB/s wr, 54 op/s
Nov 28 10:08:15 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6_a8a8759a-ecad-4b03-8e69-d4e30de4d63c", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:15 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "snap_name": "1ad625c7-a9e5-4671-b5b8-87ed0dc833b6", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e219 e219: 6 total, 6 up, 6 in
Nov 28 10:08:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v435: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 44 KiB/s wr, 61 op/s
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp'
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp' to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta'
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "format": "json"}]: dispatch
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:08:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e220 e220: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:08:16 np0005538515.localdomain ceph-mon[301134]: osdmap e219: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538515.localdomain ceph-mon[301134]: osdmap e220: 6 total, 6 up, 6 in
Nov 28 10:08:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "format": "json"}]: dispatch
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486' of type subvolume
Nov 28 10:08:17 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:17.480+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486' of type subvolume
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486'' moved to trashcan
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486, vol_name:cephfs) < ""
Nov 28 10:08:17 np0005538515.localdomain ceph-mon[301134]: pgmap v435: 177 pgs: 4 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 170 active+clean; 146 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 44 KiB/s wr, 61 op/s
Nov 28 10:08:17 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:17 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "format": "json"}]: dispatch
Nov 28 10:08:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v437: 177 pgs: 177 active+clean; 146 MiB data, 924 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 68 KiB/s wr, 64 op/s
Nov 28 10:08:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:18.240 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "format": "json"}]: dispatch
Nov 28 10:08:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7fc9c0d6-3c27-4cc1-b7bd-1c8604d8f486", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "format": "json"}]: dispatch
Nov 28 10:08:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:08:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:19.599 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:19 np0005538515.localdomain ceph-mon[301134]: pgmap v437: 177 pgs: 177 active+clean; 146 MiB data, 924 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 68 KiB/s wr, 64 op/s
Nov 28 10:08:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:08:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v438: 177 pgs: 177 active+clean; 146 MiB data, 924 MiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 21 KiB/s wr, 5 op/s
Nov 28 10:08:20 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:20.221 2 INFO neutron.agent.securitygroups_rpc [None req-d053b12b-4cea-4da2-a120-7dbb5ec3bd14 b3ad92f082324bf2b498b6ec57fa1994 f4aa6a98849143efbe0d34d745657eb8 - - default default] Security group rule updated ['b905493a-8ebf-4d2f-8822-0b2d1ac4a85c']
Nov 28 10:08:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "format": "json"}]: dispatch
Nov 28 10:08:20 np0005538515.localdomain ceph-mon[301134]: pgmap v438: 177 pgs: 177 active+clean; 146 MiB data, 924 MiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 21 KiB/s wr, 5 op/s
Nov 28 10:08:20 np0005538515.localdomain neutron_sriov_agent[254415]: 2025-11-28 10:08:20.628 2 INFO neutron.agent.securitygroups_rpc [None req-5d5ff3a5-db5c-4c22-9208-8bc209a22601 2d65c21983fa4a008a09c7a8bb7a6484 2603cf17f09846a397a42aba4be9d81b - - default default] Security group rule updated ['90aec1a6-5e99-47c4-8e4c-11b88cdc4ca9']
Nov 28 10:08:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e221 e221: 6 total, 6 up, 6 in
Nov 28 10:08:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v440: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 55 KiB/s wr, 26 op/s
Nov 28 10:08:22 np0005538515.localdomain ceph-mon[301134]: osdmap e221: 6 total, 6 up, 6 in
Nov 28 10:08:22 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:22 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "target_sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, target_sub_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, vol_name:cephfs) < ""
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 273 bytes to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp'
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp' to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta'
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.clone_index] tracking-id c532338b-5ba2-4b23-9b32-ea9746e2ddcb for path b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3'
Nov 28 10:08:23 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 246 bytes to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp'
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp' to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta'
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, target_sub_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, vol_name:cephfs) < ""
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.221+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.221+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.221+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.221+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.221+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] cloning to subvolume path: /volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] starting clone: (cephfs, None, fddc728b-0ca3-4979-8f94-9ca64d777da3)
Nov 28 10:08:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:23.251 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.254+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.254+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.254+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.254+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:23.254+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] Delayed cloning (cephfs, None, fddc728b-0ca3-4979-8f94-9ca64d777da3) -- by 0 seconds
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 277 bytes to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp'
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp' to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta'
Nov 28 10:08:23 np0005538515.localdomain ceph-mon[301134]: pgmap v440: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 55 KiB/s wr, 26 op/s
Nov 28 10:08:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v441: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 44 KiB/s wr, 20 op/s
Nov 28 10:08:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "target_sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:08:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:08:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:24.645 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/439989088' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/439989088' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] copying data from b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.snap/7f4376e7-ac7c-4743-ba07-48e626bc51c1/b41cab7a-8985-4dbd-82ff-5c225419ddcd' to b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/132edee1-c864-490c-8dc8-4b93a2d66413'
Nov 28 10:08:25 np0005538515.localdomain sudo[321372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:08:25 np0005538515.localdomain sudo[321372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:25 np0005538515.localdomain sudo[321372]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 274 bytes to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp'
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp' to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta'
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.clone_index] untracking c532338b-5ba2-4b23-9b32-ea9746e2ddcb
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp'
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp' to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta'
Nov 28 10:08:25 np0005538515.localdomain sudo[321390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 10:08:25 np0005538515.localdomain sudo[321390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 151 bytes to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp'
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta.tmp' to config b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3/.meta'
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] finished clone: (cephfs, None, fddc728b-0ca3-4979-8f94-9ca64d777da3)
Nov 28 10:08:25 np0005538515.localdomain ceph-mon[301134]: pgmap v441: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 44 KiB/s wr, 20 op/s
Nov 28 10:08:25 np0005538515.localdomain ceph-mon[301134]: mgrmap e48: np0005538515.yfkzhl(active, since 10m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:08:25 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/439989088' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:25 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/439989088' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v442: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 37 KiB/s wr, 17 op/s
Nov 28 10:08:26 np0005538515.localdomain systemd[1]: tmp-crun.4lHoHl.mount: Deactivated successfully.
Nov 28 10:08:26 np0005538515.localdomain podman[321481]: 2025-11-28 10:08:26.224986936 +0000 UTC m=+0.105884660 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, vcs-type=git, version=7, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph)
Nov 28 10:08:26 np0005538515.localdomain podman[321481]: 2025-11-28 10:08:26.340529302 +0000 UTC m=+0.221426976 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 10:08:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e222 e222: 6 total, 6 up, 6 in
Nov 28 10:08:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:26 np0005538515.localdomain sudo[321390]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:08:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:08:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:08:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:08:27 np0005538515.localdomain sudo[321601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:08:27 np0005538515.localdomain sudo[321601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:27 np0005538515.localdomain sudo[321601]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:08:27 np0005538515.localdomain sudo[321620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:08:27 np0005538515.localdomain sudo[321620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:27 np0005538515.localdomain podman[321619]: 2025-11-28 10:08:27.227946626 +0000 UTC m=+0.127555138 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 28 10:08:27 np0005538515.localdomain podman[321619]: 2025-11-28 10:08:27.248543032 +0000 UTC m=+0.148151614 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Nov 28 10:08:27 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: pgmap v442: 177 pgs: 177 active+clean; 147 MiB data, 925 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 37 KiB/s wr, 17 op/s
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: osdmap e222: 6 total, 6 up, 6 in
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:08:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:08:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v444: 177 pgs: 177 active+clean; 193 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 131 op/s
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:27 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:08:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:08:27 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:08:27 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:27 np0005538515.localdomain sudo[321620]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 2eb1df9b-02ce-4916-a757-2204b382ab74 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 2eb1df9b-02ce-4916-a757-2204b382ab74 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:08:28 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 2eb1df9b-02ce-4916-a757-2204b382ab74 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:28.303 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:28 np0005538515.localdomain sudo[321691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:08:28 np0005538515.localdomain sudo[321691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:08:28 np0005538515.localdomain sudo[321691]: pam_unix(sudo:session): session closed for user root
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:28 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:08:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:08:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:08:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:08:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:08:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:08:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19213 "" "Go-http-client/1.1"
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e223 e223: 6 total, 6 up, 6 in
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: pgmap v444: 177 pgs: 177 active+clean; 193 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 131 op/s
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1301495600' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1301495600' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:29.651 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v446: 177 pgs: 177 active+clean; 193 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 2.7 MiB/s wr, 115 op/s
Nov 28 10:08:30 np0005538515.localdomain ceph-mon[301134]: osdmap e223: 6 total, 6 up, 6 in
Nov 28 10:08:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:08:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:08:31 np0005538515.localdomain ceph-mon[301134]: pgmap v446: 177 pgs: 177 active+clean; 193 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 2.7 MiB/s wr, 115 op/s
Nov 28 10:08:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:08:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v447: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 123 KiB/s rd, 2.7 MiB/s wr, 183 op/s
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1386400267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1386400267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e224 e224: 6 total, 6 up, 6 in
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1386400267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1386400267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:32 np0005538515.localdomain ceph-mon[301134]: osdmap e224: 6 total, 6 up, 6 in
Nov 28 10:08:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:33.341 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e225 e225: 6 total, 6 up, 6 in
Nov 28 10:08:33 np0005538515.localdomain ceph-mon[301134]: pgmap v447: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 123 KiB/s rd, 2.7 MiB/s wr, 183 op/s
Nov 28 10:08:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v450: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 18 KiB/s wr, 91 op/s
Nov 28 10:08:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:08:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 13K writes, 52K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 13K writes, 4194 syncs, 3.16 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8169 writes, 30K keys, 8169 commit groups, 1.0 writes per commit group, ingest: 23.49 MB, 0.04 MB/s
                                                          Interval WAL: 8169 writes, 3432 syncs, 2.38 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3132541285' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3132541285' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e226 e226: 6 total, 6 up, 6 in
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: osdmap e225: 6 total, 6 up, 6 in
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: pgmap v450: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 18 KiB/s wr, 91 op/s
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3132541285' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3132541285' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:34 np0005538515.localdomain ceph-mon[301134]: osdmap e226: 6 total, 6 up, 6 in
Nov 28 10:08:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:34.682 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e227 e227: 6 total, 6 up, 6 in
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:08:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v453: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:08:36 np0005538515.localdomain ceph-mon[301134]: osdmap e227: 6 total, 6 up, 6 in
Nov 28 10:08:36 np0005538515.localdomain ceph-mon[301134]: pgmap v453: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail
Nov 28 10:08:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e228 e228: 6 total, 6 up, 6 in
Nov 28 10:08:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v455: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 8.2 KiB/s wr, 170 op/s
Nov 28 10:08:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:38.368 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:08:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 17K writes, 66K keys, 17K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s
                                                          Cumulative WAL: 17K writes, 5746 syncs, 3.07 writes per sync, written: 0.05 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 40K keys, 11K commit groups, 1.0 writes per commit group, ingest: 27.02 MB, 0.05 MB/s
                                                          Interval WAL: 11K writes, 4966 syncs, 2.37 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:08:38 np0005538515.localdomain ceph-mon[301134]: osdmap e228: 6 total, 6 up, 6 in
Nov 28 10:08:38 np0005538515.localdomain ceph-mon[301134]: pgmap v455: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 8.2 KiB/s wr, 170 op/s
Nov 28 10:08:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:08:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:08:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:08:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:08:38 np0005538515.localdomain podman[321709]: 2025-11-28 10:08:38.984129846 +0000 UTC m=+0.083894871 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:08:38 np0005538515.localdomain podman[321709]: 2025-11-28 10:08:38.998407447 +0000 UTC m=+0.098172492 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true)
Nov 28 10:08:39 np0005538515.localdomain systemd[1]: tmp-crun.42PQNT.mount: Deactivated successfully.
Nov 28 10:08:39 np0005538515.localdomain podman[321710]: 2025-11-28 10:08:39.045095148 +0000 UTC m=+0.144836382 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 10:08:39 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:08:39 np0005538515.localdomain podman[321710]: 2025-11-28 10:08:39.086573329 +0000 UTC m=+0.186314553 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:08:39 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:08:39 np0005538515.localdomain podman[321712]: 2025-11-28 10:08:39.138348887 +0000 UTC m=+0.231213618 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:08:39 np0005538515.localdomain podman[321712]: 2025-11-28 10:08:39.146793467 +0000 UTC m=+0.239658168 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:08:39 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:08:39 np0005538515.localdomain podman[321711]: 2025-11-28 10:08:39.195385518 +0000 UTC m=+0.291861340 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:08:39 np0005538515.localdomain podman[321711]: 2025-11-28 10:08:39.226012204 +0000 UTC m=+0.322488016 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:08:39 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:08:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:39.719 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e229 e229: 6 total, 6 up, 6 in
Nov 28 10:08:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v457: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 6.7 KiB/s wr, 138 op/s
Nov 28 10:08:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e230 e230: 6 total, 6 up, 6 in
Nov 28 10:08:40 np0005538515.localdomain ceph-mon[301134]: osdmap e229: 6 total, 6 up, 6 in
Nov 28 10:08:40 np0005538515.localdomain ceph-mon[301134]: pgmap v457: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 6.7 KiB/s wr, 138 op/s
Nov 28 10:08:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e231 e231: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538515.localdomain ceph-mon[301134]: osdmap e230: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538515.localdomain ceph-mon[301134]: osdmap e231: 6 total, 6 up, 6 in
Nov 28 10:08:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v460: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 5.3 KiB/s wr, 160 op/s
Nov 28 10:08:41 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:08:41 np0005538515.localdomain podman[321795]: 2025-11-28 10:08:41.965775639 +0000 UTC m=+0.073477000 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:08:41 np0005538515.localdomain podman[321795]: 2025-11-28 10:08:41.977353136 +0000 UTC m=+0.085054487 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:08:41 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/cdb1ad02-8773-41eb-84cf-933676ec61c7/.meta.tmp'
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/cdb1ad02-8773-41eb-84cf-933676ec61c7/.meta.tmp' to config b'/volumes/_nogroup/cdb1ad02-8773-41eb-84cf-933676ec61c7/.meta'
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/53e82a45-a05e-4381-85a3-03f5d3eecad9/.meta.tmp'
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/53e82a45-a05e-4381-85a3-03f5d3eecad9/.meta.tmp' to config b'/volumes/_nogroup/53e82a45-a05e-4381-85a3-03f5d3eecad9/.meta'
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, vol_name:cephfs) < ""
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e232 e232: 6 total, 6 up, 6 in
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: pgmap v460: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 118 KiB/s rd, 5.3 KiB/s wr, 160 op/s
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "format": "json"}]: dispatch
Nov 28 10:08:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:43.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:43 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:08:43Z|00193|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 10:08:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:43.421 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:43 np0005538515.localdomain ceph-mon[301134]: osdmap e232: 6 total, 6 up, 6 in
Nov 28 10:08:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v462: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 119 KiB/s rd, 5.3 KiB/s wr, 161 op/s
Nov 28 10:08:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:44.722 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e233 e233: 6 total, 6 up, 6 in
Nov 28 10:08:44 np0005538515.localdomain ceph-mon[301134]: pgmap v462: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 119 KiB/s rd, 5.3 KiB/s wr, 161 op/s
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta.tmp'
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta.tmp' to config b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta'
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "format": "json"}]: dispatch
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:45 np0005538515.localdomain ceph-mon[301134]: osdmap e233: 6 total, 6 up, 6 in
Nov 28 10:08:45 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:08:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v464: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 96 KiB/s rd, 4.3 KiB/s wr, 130 op/s
Nov 28 10:08:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e234 e234: 6 total, 6 up, 6 in
Nov 28 10:08:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:08:45 np0005538515.localdomain podman[321818]: 2025-11-28 10:08:45.983552056 +0000 UTC m=+0.084221652 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:08:45 np0005538515.localdomain podman[321818]: 2025-11-28 10:08:45.998499067 +0000 UTC m=+0.099168713 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 10:08:46 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '53e82a45-a05e-4381-85a3-03f5d3eecad9' of type subvolume
Nov 28 10:08:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:46.243+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '53e82a45-a05e-4381-85a3-03f5d3eecad9' of type subvolume
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, vol_name:cephfs) < ""
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/53e82a45-a05e-4381-85a3-03f5d3eecad9'' moved to trashcan
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:08:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:53e82a45-a05e-4381-85a3-03f5d3eecad9, vol_name:cephfs) < ""
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e235 e235: 6 total, 6 up, 6 in
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: pgmap v464: 177 pgs: 177 active+clean; 147 MiB data, 928 MiB used, 41 GiB / 42 GiB avail; 96 KiB/s rd, 4.3 KiB/s wr, 130 op/s
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: osdmap e234: 6 total, 6 up, 6 in
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "53e82a45-a05e-4381-85a3-03f5d3eecad9", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:46 np0005538515.localdomain ceph-mon[301134]: osdmap e235: 6 total, 6 up, 6 in
Nov 28 10:08:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:47.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:47.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:47.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:08:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e236 e236: 6 total, 6 up, 6 in
Nov 28 10:08:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v468: 177 pgs: 177 active+clean; 147 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 62 KiB/s wr, 112 op/s
Nov 28 10:08:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:48.241 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:48.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:08:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:48.241 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:08:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:48.262 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:08:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:48.262 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:48.411 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:08:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:48.412 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:08:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:48.446 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:48 np0005538515.localdomain ceph-mon[301134]: osdmap e236: 6 total, 6 up, 6 in
Nov 28 10:08:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343", "format": "json"}]: dispatch
Nov 28 10:08:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:b751b79e-578e-4a27-88ca-ab5a9cb2b343, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:b751b79e-578e-4a27-88ca-ab5a9cb2b343, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.255 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.256 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.256 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.256 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.257 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:08:49 np0005538515.localdomain ceph-mon[301134]: pgmap v468: 177 pgs: 177 active+clean; 147 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 62 KiB/s wr, 112 op/s
Nov 28 10:08:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1503239630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e237 e237: 6 total, 6 up, 6 in
Nov 28 10:08:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:08:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1833255642' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.696 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.762 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v470: 177 pgs: 177 active+clean; 147 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 62 KiB/s wr, 112 op/s
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.955 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.956 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11499MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.957 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:08:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:49.957 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.033 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.034 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.059 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:08:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:50.414 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:08:50 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:08:50 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/936247826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343", "format": "json"}]: dispatch
Nov 28 10:08:50 np0005538515.localdomain ceph-mon[301134]: osdmap e237: 6 total, 6 up, 6 in
Nov 28 10:08:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1833255642' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:50 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/780788281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.573 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.578 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.596 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.597 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:08:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:50.598 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:08:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:50.852 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:08:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:50.853 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:08:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:08:50.854 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e238 e238: 6 total, 6 up, 6 in
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: pgmap v470: 177 pgs: 177 active+clean; 147 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 62 KiB/s wr, 112 op/s
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/936247826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: osdmap e238: 6 total, 6 up, 6 in
Nov 28 10:08:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:51.600 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:51.600 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v472: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 70 KiB/s wr, 164 op/s
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2163301190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:08:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2163301190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:52.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:08:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2163301190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2163301190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:53.449 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:53 np0005538515.localdomain ceph-mon[301134]: pgmap v472: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 70 KiB/s wr, 164 op/s
Nov 28 10:08:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2410619381' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v473: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 21 KiB/s wr, 68 op/s
Nov 28 10:08:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1048746670' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:08:54 np0005538515.localdomain ceph-mon[301134]: pgmap v473: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 21 KiB/s wr, 68 op/s
Nov 28 10:08:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2986300977' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:08:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2986300977' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:08:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:54.806 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343_a9f5af9b-568e-446f-a65a-029b17668afd", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b751b79e-578e-4a27-88ca-ab5a9cb2b343_a9f5af9b-568e-446f-a65a-029b17668afd, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta.tmp'
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta.tmp' to config b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta'
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b751b79e-578e-4a27-88ca-ab5a9cb2b343_a9f5af9b-568e-446f-a65a-029b17668afd, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b751b79e-578e-4a27-88ca-ab5a9cb2b343, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta.tmp'
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta.tmp' to config b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb/.meta'
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b751b79e-578e-4a27-88ca-ab5a9cb2b343, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343_a9f5af9b-568e-446f-a65a-029b17668afd", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "snap_name": "b751b79e-578e-4a27-88ca-ab5a9cb2b343", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v474: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 16 KiB/s wr, 54 op/s
Nov 28 10:08:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e239 e239: 6 total, 6 up, 6 in
Nov 28 10:08:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:08:57 np0005538515.localdomain ceph-mon[301134]: pgmap v474: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 16 KiB/s wr, 54 op/s
Nov 28 10:08:57 np0005538515.localdomain ceph-mon[301134]: osdmap e239: 6 total, 6 up, 6 in
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:08:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:08:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:08:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v476: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 31 KiB/s wr, 83 op/s
Nov 28 10:08:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:08:57 np0005538515.localdomain systemd[1]: tmp-crun.TgvAe0.mount: Deactivated successfully.
Nov 28 10:08:57 np0005538515.localdomain podman[321881]: 2025-11-28 10:08:57.971525861 +0000 UTC m=+0.083085686 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:08:58 np0005538515.localdomain podman[321881]: 2025-11-28 10:08:58.013521068 +0000 UTC m=+0.125080893 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:08:58 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:08:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:58.452 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "format": "json"}]: dispatch
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:08:58 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:08:58.669+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4b827a5f-531c-4609-b695-d7ea3eea20eb' of type subvolume
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4b827a5f-531c-4609-b695-d7ea3eea20eb' of type subvolume
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "force": true, "format": "json"}]: dispatch
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4b827a5f-531c-4609-b695-d7ea3eea20eb'' moved to trashcan
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:08:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4b827a5f-531c-4609-b695-d7ea3eea20eb, vol_name:cephfs) < ""
Nov 28 10:08:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:08:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:08:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:08:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:08:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:08:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19217 "" "Go-http-client/1.1"
Nov 28 10:08:59 np0005538515.localdomain ceph-mon[301134]: pgmap v476: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 31 KiB/s wr, 83 op/s
Nov 28 10:08:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:08:59.808 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:08:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v477: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 29 KiB/s wr, 80 op/s
Nov 28 10:09:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "format": "json"}]: dispatch
Nov 28 10:09:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4b827a5f-531c-4609-b695-d7ea3eea20eb", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e240 e240: 6 total, 6 up, 6 in
Nov 28 10:09:01 np0005538515.localdomain ceph-mon[301134]: pgmap v477: 177 pgs: 177 active+clean; 147 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 29 KiB/s wr, 80 op/s
Nov 28 10:09:01 np0005538515.localdomain ceph-mon[301134]: osdmap e240: 6 total, 6 up, 6 in
Nov 28 10:09:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v479: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 29 KiB/s wr, 65 op/s
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "format": "json"}]: dispatch
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:01 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:01.940+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'cdb1ad02-8773-41eb-84cf-933676ec61c7' of type subvolume
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'cdb1ad02-8773-41eb-84cf-933676ec61c7' of type subvolume
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, vol_name:cephfs) < ""
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/cdb1ad02-8773-41eb-84cf-933676ec61c7'' moved to trashcan
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:cdb1ad02-8773-41eb-84cf-933676ec61c7, vol_name:cephfs) < ""
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4047621474' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2136588701' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2136588701' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp'
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp' to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta'
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "format": "json"}]: dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e241 e241: 6 total, 6 up, 6 in
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3664164282' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3664164282' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4047621474' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2136588701' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2136588701' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:03.497 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: pgmap v479: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 29 KiB/s wr, 65 op/s
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cdb1ad02-8773-41eb-84cf-933676ec61c7", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "format": "json"}]: dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: osdmap e241: 6 total, 6 up, 6 in
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/109290016' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/109290016' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e242 e242: 6 total, 6 up, 6 in
Nov 28 10:09:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v482: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 19 KiB/s wr, 47 op/s
Nov 28 10:09:04 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:04 np0005538515.localdomain ceph-mon[301134]: osdmap e242: 6 total, 6 up, 6 in
Nov 28 10:09:04 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:04 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:04.848 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, vol_name:cephfs) < ""
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, vol_name:cephfs) < ""
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:09:05
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['volumes', 'manila_metadata', 'images', 'vms', 'manila_data', 'backups', '.mgr']
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:09:05 np0005538515.localdomain ceph-mon[301134]: pgmap v482: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 19 KiB/s wr, 47 op/s
Nov 28 10:09:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e243 e243: 6 total, 6 up, 6 in
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952", "format": "json"}]: dispatch
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7121dfd5-9b25-4dc7-904e-3be06d03c952, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7121dfd5-9b25-4dc7-904e-3be06d03c952, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v484: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 22 KiB/s wr, 54 op/s
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 3.453319839940443e-06 of space, bias 1.0, pg target 0.0006895128613747751 quantized to 32 (current 32)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.0905220547180346e-06 of space, bias 1.0, pg target 0.00021701388888888888 quantized to 32 (current 32)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.00014422154173646007 of space, bias 4.0, pg target 0.11480034722222221 quantized to 16 (current 16)
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:09:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:09:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e244 e244: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538515.localdomain ceph-mon[301134]: osdmap e243: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952", "format": "json"}]: dispatch
Nov 28 10:09:06 np0005538515.localdomain ceph-mon[301134]: pgmap v484: 177 pgs: 177 active+clean; 147 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 22 KiB/s wr, 54 op/s
Nov 28 10:09:06 np0005538515.localdomain ceph-mon[301134]: osdmap e244: 6 total, 6 up, 6 in
Nov 28 10:09:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, vol_name:cephfs) < ""
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/fddc728b-0ca3-4979-8f94-9ca64d777da3'' moved to trashcan
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fddc728b-0ca3-4979-8f94-9ca64d777da3, vol_name:cephfs) < ""
Nov 28 10:09:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e245 e245: 6 total, 6 up, 6 in
Nov 28 10:09:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "format": "json"}]: dispatch
Nov 28 10:09:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fddc728b-0ca3-4979-8f94-9ca64d777da3", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v487: 177 pgs: 177 active+clean; 148 MiB data, 944 MiB used, 41 GiB / 42 GiB avail; 142 KiB/s rd, 65 KiB/s wr, 200 op/s
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3077193037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3077193037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:08.525 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: osdmap e245: 6 total, 6 up, 6 in
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: pgmap v487: 177 pgs: 177 active+clean; 148 MiB data, 944 MiB used, 41 GiB / 42 GiB avail; 142 KiB/s rd, 65 KiB/s wr, 200 op/s
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3077193037' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:08 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3077193037' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8", "format": "json"}]: dispatch
Nov 28 10:09:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:24807f53-77c4-4f15-8496-e0fd980f52a8, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:24807f53-77c4-4f15-8496-e0fd980f52a8, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:09 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8", "format": "json"}]: dispatch
Nov 28 10:09:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v488: 177 pgs: 177 active+clean; 148 MiB data, 944 MiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 46 KiB/s wr, 142 op/s
Nov 28 10:09:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:09.891 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:09:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:09:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:09:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:09:10 np0005538515.localdomain podman[321902]: 2025-11-28 10:09:10.014891348 +0000 UTC m=+0.093073824 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 10:09:10 np0005538515.localdomain podman[321903]: 2025-11-28 10:09:10.072905629 +0000 UTC m=+0.144529813 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 10:09:10 np0005538515.localdomain podman[321903]: 2025-11-28 10:09:10.081321489 +0000 UTC m=+0.152945703 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:09:10 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:09:10 np0005538515.localdomain podman[321902]: 2025-11-28 10:09:10.102036508 +0000 UTC m=+0.180218984 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:09:10 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:09:10 np0005538515.localdomain podman[321901]: 2025-11-28 10:09:10.118489136 +0000 UTC m=+0.200317664 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:09:10 np0005538515.localdomain podman[321901]: 2025-11-28 10:09:10.132776177 +0000 UTC m=+0.214604705 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:09:10 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:09:10 np0005538515.localdomain podman[321909]: 2025-11-28 10:09:10.227666296 +0000 UTC m=+0.294131541 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:09:10 np0005538515.localdomain podman[321909]: 2025-11-28 10:09:10.261874452 +0000 UTC m=+0.328339747 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:09:10 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1_2561ce9f-a8b9-43a9-9ce8-c591d939acc0", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1_2561ce9f-a8b9-43a9-9ce8-c591d939acc0, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp'
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp' to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta'
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1_2561ce9f-a8b9-43a9-9ce8-c591d939acc0, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp'
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta.tmp' to config b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf/.meta'
Nov 28 10:09:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7f4376e7-ac7c-4743-ba07-48e626bc51c1, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:09:10 np0005538515.localdomain ceph-mon[301134]: pgmap v488: 177 pgs: 177 active+clean; 148 MiB data, 944 MiB used, 41 GiB / 42 GiB avail; 101 KiB/s rd, 46 KiB/s wr, 142 op/s
Nov 28 10:09:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e246 e246: 6 total, 6 up, 6 in
Nov 28 10:09:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v490: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 85 KiB/s wr, 199 op/s
Nov 28 10:09:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1_2561ce9f-a8b9-43a9-9ce8-c591d939acc0", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "snap_name": "7f4376e7-ac7c-4743-ba07-48e626bc51c1", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:12 np0005538515.localdomain ceph-mon[301134]: osdmap e246: 6 total, 6 up, 6 in
Nov 28 10:09:12 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8_43874ef7-c24e-4d18-b524-79d36cc3d273", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:24807f53-77c4-4f15-8496-e0fd980f52a8_43874ef7-c24e-4d18-b524-79d36cc3d273, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp'
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp' to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta'
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:24807f53-77c4-4f15-8496-e0fd980f52a8_43874ef7-c24e-4d18-b524-79d36cc3d273, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:24807f53-77c4-4f15-8496-e0fd980f52a8, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp'
Nov 28 10:09:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp' to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta'
Nov 28 10:09:12 np0005538515.localdomain systemd[1]: tmp-crun.UYMlDo.mount: Deactivated successfully.
Nov 28 10:09:12 np0005538515.localdomain podman[321986]: 2025-11-28 10:09:12.997426948 +0000 UTC m=+0.104766636 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:24807f53-77c4-4f15-8496-e0fd980f52a8, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:13 np0005538515.localdomain podman[321986]: 2025-11-28 10:09:13.010473381 +0000 UTC m=+0.117813119 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:09:13 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/803796492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/803796492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:13.566 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: pgmap v490: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 85 KiB/s wr, 199 op/s
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/803796492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/803796492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v491: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 69 KiB/s wr, 162 op/s
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "format": "json"}]: dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f08fd6d7-e632-4600-b126-7d3287d90baf, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f08fd6d7-e632-4600-b126-7d3287d90baf, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f08fd6d7-e632-4600-b126-7d3287d90baf' of type subvolume
Nov 28 10:09:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:13.908+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f08fd6d7-e632-4600-b126-7d3287d90baf' of type subvolume
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f08fd6d7-e632-4600-b126-7d3287d90baf'' moved to trashcan
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f08fd6d7-e632-4600-b126-7d3287d90baf, vol_name:cephfs) < ""
Nov 28 10:09:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8_43874ef7-c24e-4d18-b524-79d36cc3d273", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "24807f53-77c4-4f15-8496-e0fd980f52a8", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e247 e247: 6 total, 6 up, 6 in
Nov 28 10:09:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:14.894 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:15 np0005538515.localdomain ceph-mon[301134]: pgmap v491: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 69 KiB/s wr, 162 op/s
Nov 28 10:09:15 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "format": "json"}]: dispatch
Nov 28 10:09:15 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f08fd6d7-e632-4600-b126-7d3287d90baf", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:15 np0005538515.localdomain ceph-mon[301134]: osdmap e247: 6 total, 6 up, 6 in
Nov 28 10:09:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e248 e248: 6 total, 6 up, 6 in
Nov 28 10:09:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v494: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 38 KiB/s wr, 57 op/s
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952_0622dbc9-fa1f-43ca-acfe-551c9624e3ef", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7121dfd5-9b25-4dc7-904e-3be06d03c952_0622dbc9-fa1f-43ca-acfe-551c9624e3ef, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp'
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp' to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta'
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7121dfd5-9b25-4dc7-904e-3be06d03c952_0622dbc9-fa1f-43ca-acfe-551c9624e3ef, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7121dfd5-9b25-4dc7-904e-3be06d03c952, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp'
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta.tmp' to config b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34/.meta'
Nov 28 10:09:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7121dfd5-9b25-4dc7-904e-3be06d03c952, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e249 e249: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: osdmap e248: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: pgmap v494: 177 pgs: 177 active+clean; 148 MiB data, 949 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 38 KiB/s wr, 57 op/s
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952_0622dbc9-fa1f-43ca-acfe-551c9624e3ef", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "snap_name": "7121dfd5-9b25-4dc7-904e-3be06d03c952", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: osdmap e249: 6 total, 6 up, 6 in
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3340760179' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3340760179' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:09:16 np0005538515.localdomain podman[322009]: 2025-11-28 10:09:16.99720563 +0000 UTC m=+0.105468357 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:09:17 np0005538515.localdomain podman[322009]: 2025-11-28 10:09:17.034584214 +0000 UTC m=+0.142846951 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 10:09:17 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:09:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3754312141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3754312141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:09:17.454 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:09:17Z, description=, device_id=ca028b75-13e6-4080-861d-94b31335eb39, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3608e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce360e80>], id=e72cdf7d-f7d9-4850-a363-66db0614dee5, ip_allocation=immediate, mac_address=fa:16:3e:6a:c4:de, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3153, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:09:17Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:09:17 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3754312141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:17 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3754312141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:17 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:09:17 np0005538515.localdomain podman[322045]: 2025-11-28 10:09:17.719870439 +0000 UTC m=+0.059220240 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:09:17 np0005538515.localdomain systemd[1]: tmp-crun.7EPmxP.mount: Deactivated successfully.
Nov 28 10:09:17 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:09:17 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:09:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v496: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 3.6 MiB/s wr, 171 op/s
Nov 28 10:09:17 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:09:17.917 261346 INFO neutron.agent.dhcp.agent [None req-ea411aaf-ab83-42be-ba66-34c6532be3a6 - - - - - -] DHCP configuration for ports {'e72cdf7d-f7d9-4850-a363-66db0614dee5'} is completed
Nov 28 10:09:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:18.569 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:18 np0005538515.localdomain ceph-mon[301134]: pgmap v496: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 117 KiB/s rd, 3.6 MiB/s wr, 171 op/s
Nov 28 10:09:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:18.694 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "format": "json"}]: dispatch
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:775dc90b-162d-43b3-b906-8b2d7da70c34, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:775dc90b-162d-43b3-b906-8b2d7da70c34, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:19 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:19.481+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '775dc90b-162d-43b3-b906-8b2d7da70c34' of type subvolume
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '775dc90b-162d-43b3-b906-8b2d7da70c34' of type subvolume
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/775dc90b-162d-43b3-b906-8b2d7da70c34'' moved to trashcan
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:775dc90b-162d-43b3-b906-8b2d7da70c34, vol_name:cephfs) < ""
Nov 28 10:09:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e250 e250: 6 total, 6 up, 6 in
Nov 28 10:09:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "format": "json"}]: dispatch
Nov 28 10:09:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "775dc90b-162d-43b3-b906-8b2d7da70c34", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v498: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 4.1 MiB/s wr, 195 op/s
Nov 28 10:09:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:19.908 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:20 np0005538515.localdomain ceph-mon[301134]: osdmap e250: 6 total, 6 up, 6 in
Nov 28 10:09:20 np0005538515.localdomain ceph-mon[301134]: pgmap v498: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 4.1 MiB/s wr, 195 op/s
Nov 28 10:09:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e251 e251: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e252 e252: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: osdmap e251: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/382002900' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/382002900' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: osdmap e252: 6 total, 6 up, 6 in
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.760736) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561760847, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2894, "num_deletes": 281, "total_data_size": 4761968, "memory_usage": 4840160, "flush_reason": "Manual Compaction"}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561783515, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3108429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23971, "largest_seqno": 26860, "table_properties": {"data_size": 3096625, "index_size": 7669, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27914, "raw_average_key_size": 22, "raw_value_size": 3072092, "raw_average_value_size": 2520, "num_data_blocks": 320, "num_entries": 1219, "num_filter_entries": 1219, "num_deletions": 281, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324437, "oldest_key_time": 1764324437, "file_creation_time": 1764324561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 22835 microseconds, and 7691 cpu microseconds.
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.783578) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3108429 bytes OK
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.783609) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.786217) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.786251) EVENT_LOG_v1 {"time_micros": 1764324561786242, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.786279) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4748509, prev total WAL file size 4748509, number of live WAL files 2.
Nov 28 10:09:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:21.787 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.789635) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3035KB)], [36(17MB)]
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561789706, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 20953714, "oldest_snapshot_seqno": -1}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v501: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 234 KiB/s rd, 4.1 MiB/s wr, 332 op/s
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 13409 keys, 19616539 bytes, temperature: kUnknown
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561892509, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 19616539, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19535806, "index_size": 46136, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33541, "raw_key_size": 357758, "raw_average_key_size": 26, "raw_value_size": 19303583, "raw_average_value_size": 1439, "num_data_blocks": 1754, "num_entries": 13409, "num_filter_entries": 13409, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.892843) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 19616539 bytes
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.911609) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.6 rd, 190.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 17.0 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(13.1) write-amplify(6.3) OK, records in: 13975, records dropped: 566 output_compression: NoCompression
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.911641) EVENT_LOG_v1 {"time_micros": 1764324561911627, "job": 20, "event": "compaction_finished", "compaction_time_micros": 102924, "compaction_time_cpu_micros": 39439, "output_level": 6, "num_output_files": 1, "total_output_size": 19616539, "num_input_records": 13975, "num_output_records": 13409, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561912260, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561914576, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.789514) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.914673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.914679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.914682) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.914685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:21 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:21.914687) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:22 np0005538515.localdomain ceph-mon[301134]: pgmap v501: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 234 KiB/s rd, 4.1 MiB/s wr, 332 op/s
Nov 28 10:09:22 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1047822351' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:22 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1047822351' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e253 e253: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:23.572 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e254 e254: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538515.localdomain ceph-mon[301134]: osdmap e253: 6 total, 6 up, 6 in
Nov 28 10:09:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v504: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 139 KiB/s rd, 43 KiB/s wr, 189 op/s
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/314747336' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/314747336' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: osdmap e254: 6 total, 6 up, 6 in
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: pgmap v504: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 139 KiB/s rd, 43 KiB/s wr, 189 op/s
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/314747336' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/314747336' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:24.959 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 10:09:25 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3883756345' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:25 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3883756345' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v505: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 34 KiB/s wr, 147 op/s
Nov 28 10:09:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e255 e255: 6 total, 6 up, 6 in
Nov 28 10:09:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:27 np0005538515.localdomain ceph-mon[301134]: pgmap v505: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 34 KiB/s wr, 147 op/s
Nov 28 10:09:27 np0005538515.localdomain ceph-mon[301134]: osdmap e255: 6 total, 6 up, 6 in
Nov 28 10:09:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:09:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:09:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:09:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v507: 177 pgs: 177 active+clean; 251 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 9.4 MiB/s wr, 140 op/s
Nov 28 10:09:28 np0005538515.localdomain sudo[322066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:09:28 np0005538515.localdomain sudo[322066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:09:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:09:28 np0005538515.localdomain sudo[322066]: pam_unix(sudo:session): session closed for user root
Nov 28 10:09:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:28.574 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:28 np0005538515.localdomain sudo[322090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:09:28 np0005538515.localdomain sudo[322090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:09:28 np0005538515.localdomain podman[322083]: 2025-11-28 10:09:28.614234154 +0000 UTC m=+0.103454715 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Nov 28 10:09:28 np0005538515.localdomain podman[322083]: 2025-11-28 10:09:28.650204033 +0000 UTC m=+0.139424544 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:09:28 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:09:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:09:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:09:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:09:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:09:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:09:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19211 "" "Go-http-client/1.1"
Nov 28 10:09:29 np0005538515.localdomain sudo[322090]: pam_unix(sudo:session): session closed for user root
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:09:29 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 4fe8d159-29c7-4ceb-bfb8-9a34d8ee2669 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:09:29 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 4fe8d159-29c7-4ceb-bfb8-9a34d8ee2669 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:09:29 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 4fe8d159-29c7-4ceb-bfb8-9a34d8ee2669 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: pgmap v507: 177 pgs: 177 active+clean; 251 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 9.4 MiB/s wr, 140 op/s
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:09:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:09:29 np0005538515.localdomain sudo[322153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:09:29 np0005538515.localdomain sudo[322153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:09:29 np0005538515.localdomain sudo[322153]: pam_unix(sudo:session): session closed for user root
Nov 28 10:09:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v508: 177 pgs: 177 active+clean; 251 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 7.9 MiB/s wr, 119 op/s
Nov 28 10:09:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:29.964 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta.tmp'
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta.tmp' to config b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta'
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "format": "json"}]: dispatch
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:09:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e256 e256: 6 total, 6 up, 6 in
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: pgmap v508: 177 pgs: 177 active+clean; 251 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 7.9 MiB/s wr, 119 op/s
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "format": "json"}]: dispatch
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: osdmap e256: 6 total, 6 up, 6 in
Nov 28 10:09:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v510: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 164 KiB/s rd, 62 MiB/s wr, 258 op/s
Nov 28 10:09:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8", "format": "json"}]: dispatch
Nov 28 10:09:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7ef5a977-4548-46f4-a4ab-f17fcf8c22a8, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7ef5a977-4548-46f4-a4ab-f17fcf8c22a8, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:33.577 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:33 np0005538515.localdomain ceph-mon[301134]: pgmap v510: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 164 KiB/s rd, 62 MiB/s wr, 258 op/s
Nov 28 10:09:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:33 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8", "format": "json"}]: dispatch
Nov 28 10:09:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v511: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 164 KiB/s rd, 62 MiB/s wr, 258 op/s
Nov 28 10:09:34 np0005538515.localdomain ceph-mon[301134]: pgmap v511: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 164 KiB/s rd, 62 MiB/s wr, 258 op/s
Nov 28 10:09:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:34.993 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, vol_name:cephfs) < ""
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc91c76190>), ('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc6bac1070>)]
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d1cf0525-f942-48d6-9b6c-f05643be68cd/.meta.tmp'
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d1cf0525-f942-48d6-9b6c-f05643be68cd/.meta.tmp' to config b'/volumes/_nogroup/d1cf0525-f942-48d6-9b6c-f05643be68cd/.meta'
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, vol_name:cephfs) < ""
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "format": "json"}]: dispatch
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, vol_name:cephfs) < ""
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, vol_name:cephfs) < ""
Nov 28 10:09:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e257 e257: 6 total, 6 up, 6 in
Nov 28 10:09:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v513: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 90 KiB/s rd, 55 MiB/s wr, 153 op/s
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8_bfb7e13e-e3ef-4568-80b6-62eac53f1d23", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7ef5a977-4548-46f4-a4ab-f17fcf8c22a8_bfb7e13e-e3ef-4568-80b6-62eac53f1d23, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta.tmp'
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta.tmp' to config b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta'
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7ef5a977-4548-46f4-a4ab-f17fcf8c22a8_bfb7e13e-e3ef-4568-80b6-62eac53f1d23, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7ef5a977-4548-46f4-a4ab-f17fcf8c22a8, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta.tmp'
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta.tmp' to config b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb/.meta'
Nov 28 10:09:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7ef5a977-4548-46f4-a4ab-f17fcf8c22a8, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: osdmap e257: 6 total, 6 up, 6 in
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: pgmap v513: 177 pgs: 177 active+clean; 687 MiB data, 2.3 GiB used, 40 GiB / 42 GiB avail; 90 KiB/s rd, 55 MiB/s wr, 153 op/s
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8_bfb7e13e-e3ef-4568-80b6-62eac53f1d23", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:36 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "snap_name": "7ef5a977-4548-46f4-a4ab-f17fcf8c22a8", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v514: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 178 KiB/s rd, 105 MiB/s wr, 304 op/s
Nov 28 10:09:37 np0005538515.localdomain ceph-mon[301134]: mgrmap e49: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:09:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:38.579 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, vol_name:cephfs) < ""
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7/.meta.tmp'
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7/.meta.tmp' to config b'/volumes/_nogroup/f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7/.meta'
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, vol_name:cephfs) < ""
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "format": "json"}]: dispatch
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, vol_name:cephfs) < ""
Nov 28 10:09:38 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, vol_name:cephfs) < ""
Nov 28 10:09:39 np0005538515.localdomain ceph-mon[301134]: pgmap v514: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 178 KiB/s rd, 105 MiB/s wr, 304 op/s
Nov 28 10:09:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "format": "json"}]: dispatch
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:06689989-6341-4053-b4cd-67b6bbd3acbb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:06689989-6341-4053-b4cd-67b6bbd3acbb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:39 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:39.659+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '06689989-6341-4053-b4cd-67b6bbd3acbb' of type subvolume
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '06689989-6341-4053-b4cd-67b6bbd3acbb' of type subvolume
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/06689989-6341-4053-b4cd-67b6bbd3acbb'' moved to trashcan
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:06689989-6341-4053-b4cd-67b6bbd3acbb, vol_name:cephfs) < ""
Nov 28 10:09:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v515: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 171 KiB/s rd, 101 MiB/s wr, 292 op/s
Nov 28 10:09:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:40.048 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "format": "json"}]: dispatch
Nov 28 10:09:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1662012236' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1662012236' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:09:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:09:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:09:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:09:40 np0005538515.localdomain systemd[1]: tmp-crun.oURLOR.mount: Deactivated successfully.
Nov 28 10:09:41 np0005538515.localdomain podman[322172]: 2025-11-28 10:09:41.001897007 +0000 UTC m=+0.104957841 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true)
Nov 28 10:09:41 np0005538515.localdomain podman[322171]: 2025-11-28 10:09:41.044497933 +0000 UTC m=+0.151595581 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:09:41 np0005538515.localdomain podman[322172]: 2025-11-28 10:09:41.06581257 +0000 UTC m=+0.168873434 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "format": "json"}]: dispatch
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "06689989-6341-4053-b4cd-67b6bbd3acbb", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: pgmap v515: 177 pgs: 177 active+clean; 1.1 GiB data, 3.5 GiB used, 38 GiB / 42 GiB avail; 171 KiB/s rd, 101 MiB/s wr, 292 op/s
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e258 e258: 6 total, 6 up, 6 in
Nov 28 10:09:41 np0005538515.localdomain podman[322173]: 2025-11-28 10:09:41.110524701 +0000 UTC m=+0.207939260 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 28 10:09:41 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:09:41 np0005538515.localdomain podman[322171]: 2025-11-28 10:09:41.135018777 +0000 UTC m=+0.242116475 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 28 10:09:41 np0005538515.localdomain podman[322173]: 2025-11-28 10:09:41.145546932 +0000 UTC m=+0.242961491 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:09:41 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:09:41 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1184348018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1184348018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:41 np0005538515.localdomain podman[322178]: 2025-11-28 10:09:41.209576429 +0000 UTC m=+0.296869505 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:09:41 np0005538515.localdomain podman[322178]: 2025-11-28 10:09:41.240665868 +0000 UTC m=+0.327958944 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:09:41 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:09:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v517: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 200 KiB/s rd, 67 MiB/s wr, 343 op/s
Nov 28 10:09:41 np0005538515.localdomain systemd[1]: tmp-crun.oxXDaq.mount: Deactivated successfully.
Nov 28 10:09:42 np0005538515.localdomain ceph-mon[301134]: osdmap e258: 6 total, 6 up, 6 in
Nov 28 10:09:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1184348018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1184348018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "format": "json"}]: dispatch
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:42 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:42.262+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd1cf0525-f942-48d6-9b6c-f05643be68cd' of type subvolume
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd1cf0525-f942-48d6-9b6c-f05643be68cd' of type subvolume
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, vol_name:cephfs) < ""
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/d1cf0525-f942-48d6-9b6c-f05643be68cd'' moved to trashcan
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d1cf0525-f942-48d6-9b6c-f05643be68cd, vol_name:cephfs) < ""
Nov 28 10:09:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e259 e259: 6 total, 6 up, 6 in
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6640fd55-9a20-4243-a340-7d5b72774834, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/6640fd55-9a20-4243-a340-7d5b72774834/.meta.tmp'
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/6640fd55-9a20-4243-a340-7d5b72774834/.meta.tmp' to config b'/volumes/_nogroup/6640fd55-9a20-4243-a340-7d5b72774834/.meta'
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6640fd55-9a20-4243-a340-7d5b72774834, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6640fd55-9a20-4243-a340-7d5b72774834, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6640fd55-9a20-4243-a340-7d5b72774834, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: pgmap v517: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 200 KiB/s rd, 67 MiB/s wr, 343 op/s
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d1cf0525-f942-48d6-9b6c-f05643be68cd", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: osdmap e259: 6 total, 6 up, 6 in
Nov 28 10:09:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:43.326+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7' of type subvolume
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7' of type subvolume
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7'' moved to trashcan
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7, vol_name:cephfs) < ""
Nov 28 10:09:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:09:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:43.582 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:43 np0005538515.localdomain podman[322254]: 2025-11-28 10:09:43.63983946 +0000 UTC m=+0.083842590 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:09:43 np0005538515.localdomain podman[322254]: 2025-11-28 10:09:43.674012455 +0000 UTC m=+0.118015565 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:09:43 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:09:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v519: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 200 KiB/s rd, 67 MiB/s wr, 343 op/s
Nov 28 10:09:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f220ee3f-2e2c-40b6-a8f5-63ee9a01d3d7", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e260 e260: 6 total, 6 up, 6 in
Nov 28 10:09:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:45.093 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:45 np0005538515.localdomain ceph-mon[301134]: pgmap v519: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 200 KiB/s rd, 67 MiB/s wr, 343 op/s
Nov 28 10:09:45 np0005538515.localdomain ceph-mon[301134]: osdmap e260: 6 total, 6 up, 6 in
Nov 28 10:09:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:45.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v521: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 149 KiB/s rd, 21 MiB/s wr, 256 op/s
Nov 28 10:09:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:46.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6640fd55-9a20-4243-a340-7d5b72774834", "format": "json"}]: dispatch
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:6640fd55-9a20-4243-a340-7d5b72774834, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:6640fd55-9a20-4243-a340-7d5b72774834, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:46.383+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6640fd55-9a20-4243-a340-7d5b72774834' of type subvolume
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6640fd55-9a20-4243-a340-7d5b72774834' of type subvolume
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6640fd55-9a20-4243-a340-7d5b72774834, vol_name:cephfs) < ""
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/6640fd55-9a20-4243-a340-7d5b72774834'' moved to trashcan
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6640fd55-9a20-4243-a340-7d5b72774834, vol_name:cephfs) < ""
Nov 28 10:09:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e261 e261: 6 total, 6 up, 6 in
Nov 28 10:09:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:47.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:47 np0005538515.localdomain ceph-mon[301134]: pgmap v521: 177 pgs: 177 active+clean; 195 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 149 KiB/s rd, 21 MiB/s wr, 256 op/s
Nov 28 10:09:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6640fd55-9a20-4243-a340-7d5b72774834", "format": "json"}]: dispatch
Nov 28 10:09:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6640fd55-9a20-4243-a340-7d5b72774834", "force": true, "format": "json"}]: dispatch
Nov 28 10:09:47 np0005538515.localdomain ceph-mon[301134]: osdmap e261: 6 total, 6 up, 6 in
Nov 28 10:09:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:09:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v523: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 93 KiB/s wr, 72 op/s
Nov 28 10:09:47 np0005538515.localdomain podman[322277]: 2025-11-28 10:09:47.997314524 +0000 UTC m=+0.098938895 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:09:48 np0005538515.localdomain podman[322277]: 2025-11-28 10:09:48.014640279 +0000 UTC m=+0.116264640 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:09:48 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:09:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:48.585 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:48.701 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:09:48.703 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:09:48 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:09:48.704 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:09:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:49.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:49.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:09:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:49.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:09:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:49.279 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:09:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:49.279 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:49.280 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1498043172' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1498043172' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: pgmap v523: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 93 KiB/s wr, 72 op/s
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1498043172' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:49 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1498043172' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:09:49.706 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:09:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v524: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 77 KiB/s wr, 59 op/s
Nov 28 10:09:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:50.141 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:50.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp'
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp' to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta'
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:09:50 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e262 e262: 6 total, 6 up, 6 in
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "format": "json"}]: dispatch
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:09:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:09:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:09:50.853 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:09:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:09:50.853 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:09:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:09:50.854 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.257 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.258 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.259 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e263 e263: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.594598) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591594637, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 810, "num_deletes": 255, "total_data_size": 1350948, "memory_usage": 1366608, "flush_reason": "Manual Compaction"}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591601350, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 812869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26865, "largest_seqno": 27670, "table_properties": {"data_size": 809169, "index_size": 1427, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10137, "raw_average_key_size": 21, "raw_value_size": 801282, "raw_average_value_size": 1734, "num_data_blocks": 62, "num_entries": 462, "num_filter_entries": 462, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324562, "oldest_key_time": 1764324562, "file_creation_time": 1764324591, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 6787 microseconds, and 1961 cpu microseconds.
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.601383) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 812869 bytes OK
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.601402) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.602877) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.602892) EVENT_LOG_v1 {"time_micros": 1764324591602888, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.602907) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1346523, prev total WAL file size 1346523, number of live WAL files 2.
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.603305) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303131' seq:72057594037927935, type:22 .. '6D6772737461740034323632' seq:0, type:0; will stop at (end)
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(793KB)], [39(18MB)]
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591603337, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 20429408, "oldest_snapshot_seqno": -1}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: pgmap v524: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 77 KiB/s wr, 59 op/s
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: osdmap e262: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "format": "json"}]: dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: osdmap e263: 6 total, 6 up, 6 in
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 13350 keys, 18386783 bytes, temperature: kUnknown
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591714545, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 18386783, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18310465, "index_size": 41849, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33413, "raw_key_size": 357110, "raw_average_key_size": 26, "raw_value_size": 18083398, "raw_average_value_size": 1354, "num_data_blocks": 1573, "num_entries": 13350, "num_filter_entries": 13350, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324591, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.714819) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 18386783 bytes
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.717283) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.5 rd, 165.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 18.7 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(47.8) write-amplify(22.6) OK, records in: 13871, records dropped: 521 output_compression: NoCompression
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.717312) EVENT_LOG_v1 {"time_micros": 1764324591717299, "job": 22, "event": "compaction_finished", "compaction_time_micros": 111305, "compaction_time_cpu_micros": 26681, "output_level": 6, "num_output_files": 1, "total_output_size": 18386783, "num_input_records": 13871, "num_output_records": 13350, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591717624, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591720350, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.603264) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.720442) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.720448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.720451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.720454) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:09:51.720457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:09:51 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:09:51Z|00194|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2570637429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:51.805 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:4294967296, sub_name:11696cb6-6ed6-4708-887e-84f5b86051e8, vol_name:cephfs) < ""
Nov 28 10:09:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/11696cb6-6ed6-4708-887e-84f5b86051e8/.meta.tmp'
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/11696cb6-6ed6-4708-887e-84f5b86051e8/.meta.tmp' to config b'/volumes/_nogroup/11696cb6-6ed6-4708-887e-84f5b86051e8/.meta'
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:4294967296, sub_name:11696cb6-6ed6-4708-887e-84f5b86051e8, vol_name:cephfs) < ""
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "format": "json"}]: dispatch
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:11696cb6-6ed6-4708-887e-84f5b86051e8, vol_name:cephfs) < ""
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:11696cb6-6ed6-4708-887e-84f5b86051e8, vol_name:cephfs) < ""
Nov 28 10:09:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v527: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.7 MiB/s wr, 283 op/s
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.021 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.023 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11492MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.023 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.024 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.079 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.080 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.102 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3487273525' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.548 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.555 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2570637429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1135745932' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3487273525' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.647 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.651 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:09:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:52.651 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1704733216' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1704733216' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:53.588 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "format": "json"}]: dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: pgmap v527: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.7 MiB/s wr, 283 op/s
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2713521814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1704733216' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1704733216' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "format": "json"}]: dispatch
Nov 28 10:09:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:09:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:09:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v528: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.0 MiB/s wr, 232 op/s
Nov 28 10:09:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:54.654 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:09:54 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e264 e264: 6 total, 6 up, 6 in
Nov 28 10:09:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "format": "json"}]: dispatch
Nov 28 10:09:54 np0005538515.localdomain ceph-mon[301134]: pgmap v528: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 3.0 MiB/s wr, 232 op/s
Nov 28 10:09:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:55.152 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:3221225472, sub_name:584bbedb-3694-4e91-aa03-2dfba40587ca, vol_name:cephfs) < ""
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/584bbedb-3694-4e91-aa03-2dfba40587ca/.meta.tmp'
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/584bbedb-3694-4e91-aa03-2dfba40587ca/.meta.tmp' to config b'/volumes/_nogroup/584bbedb-3694-4e91-aa03-2dfba40587ca/.meta'
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:3221225472, sub_name:584bbedb-3694-4e91-aa03-2dfba40587ca, vol_name:cephfs) < ""
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "format": "json"}]: dispatch
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:584bbedb-3694-4e91-aa03-2dfba40587ca, vol_name:cephfs) < ""
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:584bbedb-3694-4e91-aa03-2dfba40587ca, vol_name:cephfs) < ""
Nov 28 10:09:55 np0005538515.localdomain ceph-mon[301134]: osdmap e264: 6 total, 6 up, 6 in
Nov 28 10:09:55 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/729085219' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "format": "json"}]: dispatch
Nov 28 10:09:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v530: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 151 KiB/s rd, 3.6 MiB/s wr, 211 op/s
Nov 28 10:09:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e265 e265: 6 total, 6 up, 6 in
Nov 28 10:09:56 np0005538515.localdomain ceph-mon[301134]: pgmap v530: 177 pgs: 177 active+clean; 242 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 151 KiB/s rd, 3.6 MiB/s wr, 211 op/s
Nov 28 10:09:56 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/584414566' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:09:56 np0005538515.localdomain ceph-mon[301134]: osdmap e265: 6 total, 6 up, 6 in
Nov 28 10:09:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/.meta.tmp'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/.meta.tmp' to config b'/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/.meta'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "target_sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, target_sub_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 273 bytes to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp' to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.clone_index] tracking-id 53c12f79-a432-40cd-ba04-128957beb093 for path b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 246 bytes to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp' to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, target_sub_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.567+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.567+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.567+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.567+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.567+0000 7fcc8c452640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:09:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:09:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] cloning to subvolume path: /volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] starting clone: (cephfs, None, b9fd2bda-5757-4a94-8506-9ee54ffddc78)
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.604+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.604+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.605+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.605+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:09:57.605+0000 7fcc8d454640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: client.0 error registering admin socket command: (17) File exists
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] Delayed cloning (cephfs, None, b9fd2bda-5757-4a94-8506-9ee54ffddc78) -- by 0 seconds
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 277 bytes to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp'
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp' to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta'
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "target_sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/782691183' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/782691183' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:09:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v532: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 194 KiB/s rd, 3.4 MiB/s wr, 274 op/s
Nov 28 10:09:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "format": "json"}]: dispatch
Nov 28 10:09:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:11696cb6-6ed6-4708-887e-84f5b86051e8, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:09:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:09:58.592 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:09:58 np0005538515.localdomain ceph-mon[301134]: pgmap v532: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 194 KiB/s rd, 3.4 MiB/s wr, 274 op/s
Nov 28 10:09:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "format": "json"}]: dispatch
Nov 28 10:09:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e266 e266: 6 total, 6 up, 6 in
Nov 28 10:09:58 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:09:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:09:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:09:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:09:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:09:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:09:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19225 "" "Go-http-client/1.1"
Nov 28 10:09:59 np0005538515.localdomain podman[322364]: 2025-11-28 10:09:59.023355756 +0000 UTC m=+0.133066459 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Nov 28 10:09:59 np0005538515.localdomain podman[322364]: 2025-11-28 10:09:59.037395189 +0000 UTC m=+0.147105912 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 10:09:59 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:09:59 np0005538515.localdomain ceph-mon[301134]: mgrmap e50: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:09:59 np0005538515.localdomain ceph-mon[301134]: osdmap e266: 6 total, 6 up, 6 in
Nov 28 10:09:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v534: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 52 KiB/s wr, 77 op/s
Nov 28 10:10:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:00.187 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:11696cb6-6ed6-4708-887e-84f5b86051e8, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '11696cb6-6ed6-4708-887e-84f5b86051e8' of type subvolume
Nov 28 10:10:00 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:00.238+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '11696cb6-6ed6-4708-887e-84f5b86051e8' of type subvolume
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] copying data from b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.snap/1f3bf784-193d-4af9-98c3-6a3e9518c295/22b0f8fa-9094-41fb-91cd-d34c00640de0' to b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/7237c419-af52-4e4f-b121-86fdcc449f6e'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:11696cb6-6ed6-4708-887e-84f5b86051e8, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/11696cb6-6ed6-4708-887e-84f5b86051e8'' moved to trashcan
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:11696cb6-6ed6-4708-887e-84f5b86051e8, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 274 bytes to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp' to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.clone_index] untracking 53c12f79-a432-40cd-ba04-128957beb093
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp' to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 151 bytes to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta.tmp' to config b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78/.meta'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] finished clone: (cephfs, None, b9fd2bda-5757-4a94-8506-9ee54ffddc78)
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c7b9c682-dd90-4895-89e5-edc4a14b470f/.meta.tmp'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c7b9c682-dd90-4895-89e5-edc4a14b470f/.meta.tmp' to config b'/volumes/_nogroup/c7b9c682-dd90-4895-89e5-edc4a14b470f/.meta'
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:10:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve49, format:json, prefix:fs subvolume authorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, tenant_id:ed59ec099bfe470982dfd8309e19126f, vol_name:cephfs) < ""
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve49", "format": "json"} v 0)
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID eve49 with tenant ed59ec099bfe470982dfd8309e19126f
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: pgmap v534: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 52 KiB/s wr, 77 op/s
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3825568988' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3825568988' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: overall HEALTH_OK
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "11696cb6-6ed6-4708-887e-84f5b86051e8", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "format": "json"}]: dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve49, format:json, prefix:fs subvolume authorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, tenant_id:ed59ec099bfe470982dfd8309e19126f, vol_name:cephfs) < ""
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "format": "json"}]: dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:584bbedb-3694-4e91-aa03-2dfba40587ca, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:584bbedb-3694-4e91-aa03-2dfba40587ca, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:01 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:01.803+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '584bbedb-3694-4e91-aa03-2dfba40587ca' of type subvolume
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '584bbedb-3694-4e91-aa03-2dfba40587ca' of type subvolume
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:584bbedb-3694-4e91-aa03-2dfba40587ca, vol_name:cephfs) < ""
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/584bbedb-3694-4e91-aa03-2dfba40587ca'' moved to trashcan
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:584bbedb-3694-4e91-aa03-2dfba40587ca, vol_name:cephfs) < ""
Nov 28 10:10:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v535: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 128 KiB/s wr, 160 op/s
Nov 28 10:10:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "format": "json"}]: dispatch
Nov 28 10:10:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "584bbedb-3694-4e91-aa03-2dfba40587ca", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:02 np0005538515.localdomain ceph-mon[301134]: pgmap v535: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 128 KiB/s wr, 160 op/s
Nov 28 10:10:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:03.594 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:03.596 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 10:10:03 np0005538515.localdomain podman[322399]: 2025-11-28 10:10:03.608905949 +0000 UTC m=+0.070628981 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 10:10:03 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:10:03 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:10:03 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:10:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v536: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 100 KiB/s rd, 115 KiB/s wr, 144 op/s
Nov 28 10:10:04 np0005538515.localdomain ceph-mon[301134]: pgmap v536: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 100 KiB/s rd, 115 KiB/s wr, 144 op/s
Nov 28 10:10:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:05.233 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:df1f50b2-116f-4913-b966-9e6fb632edd2, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/df1f50b2-116f-4913-b966-9e6fb632edd2/.meta.tmp'
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/df1f50b2-116f-4913-b966-9e6fb632edd2/.meta.tmp' to config b'/volumes/_nogroup/df1f50b2-116f-4913-b966-9e6fb632edd2/.meta'
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:df1f50b2-116f-4913-b966-9e6fb632edd2, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:df1f50b2-116f-4913-b966-9e6fb632edd2, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:df1f50b2-116f-4913-b966-9e6fb632edd2, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:05.572+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c7b9c682-dd90-4895-89e5-edc4a14b470f' of type subvolume
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c7b9c682-dd90-4895-89e5-edc4a14b470f' of type subvolume
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/c7b9c682-dd90-4895-89e5-edc4a14b470f'' moved to trashcan
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c7b9c682-dd90-4895-89e5-edc4a14b470f, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:10:05
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['images', 'manila_metadata', 'manila_data', 'volumes', 'backups', '.mgr', 'vms']
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve48, format:json, prefix:fs subvolume authorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, tenant_id:ed59ec099bfe470982dfd8309e19126f, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve48", "format": "json"} v 0)
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID eve48 with tenant ed59ec099bfe470982dfd8309e19126f
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c7b9c682-dd90-4895-89e5-edc4a14b470f", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:05 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v537: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 99 KiB/s wr, 124 op/s
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve48, format:json, prefix:fs subvolume authorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, tenant_id:ed59ec099bfe470982dfd8309e19126f, vol_name:cephfs) < ""
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 3.271566164154104e-06 of space, bias 1.0, pg target 0.0006510416666666666 quantized to 32 (current 32)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0003465133828866555 of space, bias 4.0, pg target 0.2758246527777778 quantized to 16 (current 16)
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:10:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:10:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e267 e267: 6 total, 6 up, 6 in
Nov 28 10:10:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, vol_name:cephfs) < ""
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/b9fd2bda-5757-4a94-8506-9ee54ffddc78'' moved to trashcan
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b9fd2bda-5757-4a94-8506-9ee54ffddc78, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/503f8ba3-8dec-4f60-af76-593096ff9b7e/.meta.tmp'
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/503f8ba3-8dec-4f60-af76-593096ff9b7e/.meta.tmp' to config b'/volumes/_nogroup/503f8ba3-8dec-4f60-af76-593096ff9b7e/.meta'
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "format": "json"}]: dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:eve48, format:json, prefix:fs subvolume deauthorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve48", "format": "json"} v 0)
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0)
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:eve48, format:json, prefix:fs subvolume deauthorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:eve48, format:json, prefix:fs subvolume evict, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=eve48, client_metadata.root=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:eve48, format:json, prefix:fs subvolume evict, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: pgmap v537: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 99 KiB/s wr, 124 op/s
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: osdmap e267: 6 total, 6 up, 6 in
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Nov 28 10:10:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Nov 28 10:10:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v539: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 123 KiB/s wr, 81 op/s
Nov 28 10:10:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:08.597 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b9fd2bda-5757-4a94-8506-9ee54ffddc78", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 28 10:10:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve48", "format": "json"}]: dispatch
Nov 28 10:10:09 np0005538515.localdomain ceph-mon[301134]: pgmap v539: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 123 KiB/s wr, 81 op/s
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, vol_name:cephfs) < ""
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v540: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 112 KiB/s wr, 74 op/s
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4122e8d3-d0ce-4fba-8b4f-9622dd23c08b/.meta.tmp'
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4122e8d3-d0ce-4fba-8b4f-9622dd23c08b/.meta.tmp' to config b'/volumes/_nogroup/4122e8d3-d0ce-4fba-8b4f-9622dd23c08b/.meta'
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, vol_name:cephfs) < ""
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "format": "json"}]: dispatch
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, vol_name:cephfs) < ""
Nov 28 10:10:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295_c90956cb-9363-45be-b7cc-986dc86d427c", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295_c90956cb-9363-45be-b7cc-986dc86d427c, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp'
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp' to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta'
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295_c90956cb-9363-45be-b7cc-986dc86d427c, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp'
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta.tmp' to config b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179/.meta'
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:1f3bf784-193d-4af9-98c3-6a3e9518c295, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:10.260 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve47, format:json, prefix:fs subvolume authorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, tenant_id:ed59ec099bfe470982dfd8309e19126f, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve47", "format": "json"} v 0)
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID eve47 with tenant ed59ec099bfe470982dfd8309e19126f
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve47, format:json, prefix:fs subvolume authorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, tenant_id:ed59ec099bfe470982dfd8309e19126f, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "format": "json"}]: dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '503f8ba3-8dec-4f60-af76-593096ff9b7e' of type subvolume
Nov 28 10:10:10 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:10.853+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '503f8ba3-8dec-4f60-af76-593096ff9b7e' of type subvolume
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, vol_name:cephfs) < ""
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/503f8ba3-8dec-4f60-af76-593096ff9b7e'' moved to trashcan
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:503f8ba3-8dec-4f60-af76-593096ff9b7e, vol_name:cephfs) < ""
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: pgmap v540: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 112 KiB/s wr, 74 op/s
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295_c90956cb-9363-45be-b7cc-986dc86d427c", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "snap_name": "1f3bf784-193d-4af9-98c3-6a3e9518c295", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "tenant_id": "ed59ec099bfe470982dfd8309e19126f", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:10:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:10:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:10:11 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:10:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v541: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 122 KiB/s wr, 13 op/s
Nov 28 10:10:11 np0005538515.localdomain podman[322420]: 2025-11-28 10:10:11.966336761 +0000 UTC m=+0.078705120 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:10:12 np0005538515.localdomain podman[322421]: 2025-11-28 10:10:11.989347501 +0000 UTC m=+0.092886808 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 28 10:10:12 np0005538515.localdomain podman[322428]: 2025-11-28 10:10:12.053652107 +0000 UTC m=+0.147942669 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:10:12 np0005538515.localdomain podman[322428]: 2025-11-28 10:10:12.0615481 +0000 UTC m=+0.155838682 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:10:12 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:10:12 np0005538515.localdomain podman[322421]: 2025-11-28 10:10:12.074692356 +0000 UTC m=+0.178231693 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 28 10:10:12 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:10:12 np0005538515.localdomain podman[322427]: 2025-11-28 10:10:12.025161497 +0000 UTC m=+0.122682418 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:10:12 np0005538515.localdomain podman[322427]: 2025-11-28 10:10:12.158661778 +0000 UTC m=+0.256182689 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:10:12 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:10:12 np0005538515.localdomain podman[322420]: 2025-11-28 10:10:12.179607795 +0000 UTC m=+0.291976154 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:10:12 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:10:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "format": "json"}]: dispatch
Nov 28 10:10:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "503f8ba3-8dec-4f60-af76-593096ff9b7e", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:12 np0005538515.localdomain systemd[1]: tmp-crun.g49Pzs.mount: Deactivated successfully.
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:13.291+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3469c6cd-cd93-4e38-8faa-549a0ddf9179' of type subvolume
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3469c6cd-cd93-4e38-8faa-549a0ddf9179' of type subvolume
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/3469c6cd-cd93-4e38-8faa-549a0ddf9179'' moved to trashcan
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3469c6cd-cd93-4e38-8faa-549a0ddf9179, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:13.600 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:13 np0005538515.localdomain ceph-mon[301134]: pgmap v541: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 122 KiB/s wr, 13 op/s
Nov 28 10:10:13 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3469c6cd-cd93-4e38-8faa-549a0ddf9179", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:13.888+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4122e8d3-d0ce-4fba-8b4f-9622dd23c08b' of type subvolume
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4122e8d3-d0ce-4fba-8b4f-9622dd23c08b' of type subvolume
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v542: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 122 KiB/s wr, 13 op/s
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4122e8d3-d0ce-4fba-8b4f-9622dd23c08b'' moved to trashcan
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4122e8d3-d0ce-4fba-8b4f-9622dd23c08b, vol_name:cephfs) < ""
Nov 28 10:10:13 np0005538515.localdomain podman[322502]: 2025-11-28 10:10:13.990305881 +0000 UTC m=+0.094893871 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:10:14 np0005538515.localdomain podman[322502]: 2025-11-28 10:10:14.024875158 +0000 UTC m=+0.129463128 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:10:14 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a26a558f-6d92-48d1-91b7-33af52872ba0, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/a26a558f-6d92-48d1-91b7-33af52872ba0/.meta.tmp'
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a26a558f-6d92-48d1-91b7-33af52872ba0/.meta.tmp' to config b'/volumes/_nogroup/a26a558f-6d92-48d1-91b7-33af52872ba0/.meta'
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a26a558f-6d92-48d1-91b7-33af52872ba0, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a26a558f-6d92-48d1-91b7-33af52872ba0, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a26a558f-6d92-48d1-91b7-33af52872ba0, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta.tmp'
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta.tmp' to config b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta'
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:eve47, format:json, prefix:fs subvolume deauthorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve47", "format": "json"} v 0)
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0)
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:eve47, format:json, prefix:fs subvolume deauthorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:eve47, format:json, prefix:fs subvolume evict, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=eve47, client_metadata.root=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:14 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:eve47, format:json, prefix:fs subvolume evict, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4122e8d3-d0ce-4fba-8b4f-9622dd23c08b", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: pgmap v542: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 122 KiB/s wr, 13 op/s
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Nov 28 10:10:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve47", "format": "json"}]: dispatch
Nov 28 10:10:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:15.291 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e268 e268: 6 total, 6 up, 6 in
Nov 28 10:10:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v544: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 989 B/s rd, 131 KiB/s wr, 14 op/s
Nov 28 10:10:16 np0005538515.localdomain ceph-mon[301134]: osdmap e268: 6 total, 6 up, 6 in
Nov 28 10:10:16 np0005538515.localdomain ceph-mon[301134]: pgmap v544: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 989 B/s rd, 131 KiB/s wr, 14 op/s
Nov 28 10:10:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3", "format": "json"}]: dispatch
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:3d88a300-3326-420b-858f-9b926d8029b3, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:3d88a300-3326-420b-858f-9b926d8029b3, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:17 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3", "format": "json"}]: dispatch
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v545: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 125 KiB/s wr, 14 op/s
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "format": "json"}]: dispatch
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a26a558f-6d92-48d1-91b7-33af52872ba0, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a26a558f-6d92-48d1-91b7-33af52872ba0, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:17 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:17.959+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a26a558f-6d92-48d1-91b7-33af52872ba0' of type subvolume
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a26a558f-6d92-48d1-91b7-33af52872ba0' of type subvolume
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a26a558f-6d92-48d1-91b7-33af52872ba0, vol_name:cephfs) < ""
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/a26a558f-6d92-48d1-91b7-33af52872ba0'' moved to trashcan
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a26a558f-6d92-48d1-91b7-33af52872ba0, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:eve49, format:json, prefix:fs subvolume deauthorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve49", "format": "json"} v 0)
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0)
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:eve49, format:json, prefix:fs subvolume deauthorize, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:eve49, format:json, prefix:fs subvolume evict, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=eve49, client_metadata.root=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:eve49, format:json, prefix:fs subvolume evict, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:18.481+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'aee96a4c-0a14-47e6-b8e5-ce0279118ec9' of type subvolume
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'aee96a4c-0a14-47e6-b8e5-ce0279118ec9' of type subvolume
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9'' moved to trashcan
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:aee96a4c-0a14-47e6-b8e5-ce0279118ec9, vol_name:cephfs) < ""
Nov 28 10:10:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:18.601 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: pgmap v545: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 125 KiB/s wr, 14 op/s
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a26a558f-6d92-48d1-91b7-33af52872ba0", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "auth_id": "eve49", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:18 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:10:18 np0005538515.localdomain podman[322527]: 2025-11-28 10:10:18.974976336 +0000 UTC m=+0.083519119 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:10:18 np0005538515.localdomain podman[322527]: 2025-11-28 10:10:18.989455053 +0000 UTC m=+0.097997856 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 10:10:19 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:10:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v546: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 125 KiB/s wr, 14 op/s
Nov 28 10:10:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:20.326 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:20 np0005538515.localdomain ceph-mon[301134]: pgmap v546: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 125 KiB/s wr, 14 op/s
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3_22da53bc-1abc-41d9-a746-84c48ea05590", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3d88a300-3326-420b-858f-9b926d8029b3_22da53bc-1abc-41d9-a746-84c48ea05590, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta.tmp'
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta.tmp' to config b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta'
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3d88a300-3326-420b-858f-9b926d8029b3_22da53bc-1abc-41d9-a746-84c48ea05590, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3d88a300-3326-420b-858f-9b926d8029b3, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta.tmp'
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta.tmp' to config b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f/.meta'
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:3d88a300-3326-420b-858f-9b926d8029b3, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d8f7d257-b798-4b3d-88f0-c0bbfa330aa3/.meta.tmp'
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d8f7d257-b798-4b3d-88f0-c0bbfa330aa3/.meta.tmp' to config b'/volumes/_nogroup/d8f7d257-b798-4b3d-88f0-c0bbfa330aa3/.meta'
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, vol_name:cephfs) < ""
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e269 e269: 6 total, 6 up, 6 in
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v548: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 141 KiB/s wr, 15 op/s
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3_22da53bc-1abc-41d9-a746-84c48ea05590", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "snap_name": "3d88a300-3326-420b-858f-9b926d8029b3", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "format": "json"}]: dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:21 np0005538515.localdomain ceph-mon[301134]: osdmap e269: 6 total, 6 up, 6 in
Nov 28 10:10:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e270 e270: 6 total, 6 up, 6 in
Nov 28 10:10:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:23.604 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:23 np0005538515.localdomain ceph-mon[301134]: pgmap v548: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 141 KiB/s wr, 15 op/s
Nov 28 10:10:23 np0005538515.localdomain ceph-mon[301134]: osdmap e270: 6 total, 6 up, 6 in
Nov 28 10:10:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v550: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 141 KiB/s wr, 15 op/s
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "format": "json"}]: dispatch
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:24.310+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f' of type subvolume
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f' of type subvolume
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f'' moved to trashcan
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "format": "json"}]: dispatch
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:24.527+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd8f7d257-b798-4b3d-88f0-c0bbfa330aa3' of type subvolume
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd8f7d257-b798-4b3d-88f0-c0bbfa330aa3' of type subvolume
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, vol_name:cephfs) < ""
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/d8f7d257-b798-4b3d-88f0-c0bbfa330aa3'' moved to trashcan
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d8f7d257-b798-4b3d-88f0-c0bbfa330aa3, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:25.365 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/.meta.tmp'
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/.meta.tmp' to config b'/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/.meta'
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp'
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp' to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta'
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: pgmap v550: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 141 KiB/s wr, 15 op/s
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fd57705e-1bdf-4eac-bcf9-3bad5ad20c9f", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d8f7d257-b798-4b3d-88f0-c0bbfa330aa3", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v551: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 73 KiB/s wr, 6 op/s
Nov 28 10:10:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "format": "json"}]: dispatch
Nov 28 10:10:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:10:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:10:27 np0005538515.localdomain ceph-mon[301134]: pgmap v551: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 73 KiB/s wr, 6 op/s
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:b6df0b61-7d52-4368-906d-590e5295b08d, vol_name:cephfs) < ""
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b6df0b61-7d52-4368-906d-590e5295b08d/.meta.tmp'
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b6df0b61-7d52-4368-906d-590e5295b08d/.meta.tmp' to config b'/volumes/_nogroup/b6df0b61-7d52-4368-906d-590e5295b08d/.meta'
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:b6df0b61-7d52-4368-906d-590e5295b08d, vol_name:cephfs) < ""
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "format": "json"}]: dispatch
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b6df0b61-7d52-4368-906d-590e5295b08d, vol_name:cephfs) < ""
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b6df0b61-7d52-4368-906d-590e5295b08d, vol_name:cephfs) < ""
Nov 28 10:10:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v552: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 128 KiB/s wr, 11 op/s
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/.meta.tmp'
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/.meta.tmp' to config b'/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/.meta'
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:28.605 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: pgmap v552: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 128 KiB/s wr, 11 op/s
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:28 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:10:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:10:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:10:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:10:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:10:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19248 "" "Go-http-client/1.1"
Nov 28 10:10:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "format": "json"}]: dispatch
Nov 28 10:10:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:29 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:10:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v553: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 862 B/s rd, 123 KiB/s wr, 11 op/s
Nov 28 10:10:29 np0005538515.localdomain sudo[322547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:10:29 np0005538515.localdomain sudo[322547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:10:29 np0005538515.localdomain sudo[322547]: pam_unix(sudo:session): session closed for user root
Nov 28 10:10:29 np0005538515.localdomain podman[322556]: 2025-11-28 10:10:29.983199797 +0000 UTC m=+0.084557441 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 28 10:10:30 np0005538515.localdomain podman[322556]: 2025-11-28 10:10:30.001632746 +0000 UTC m=+0.102990390 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 28 10:10:30 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:10:30 np0005538515.localdomain sudo[322578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:10:30 np0005538515.localdomain sudo[322578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, vol_name:cephfs) < ""
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/204dc313-c4b7-4b7f-a2b9-7d12dcbb771e/.meta.tmp'
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/204dc313-c4b7-4b7f-a2b9-7d12dcbb771e/.meta.tmp' to config b'/volumes/_nogroup/204dc313-c4b7-4b7f-a2b9-7d12dcbb771e/.meta'
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, vol_name:cephfs) < ""
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "format": "json"}]: dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, vol_name:cephfs) < ""
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, vol_name:cephfs) < ""
Nov 28 10:10:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:30.405 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:30 np0005538515.localdomain sudo[322578]: pam_unix(sudo:session): session closed for user root
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: pgmap v553: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 862 B/s rd, 123 KiB/s wr, 11 op/s
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "format": "json"}]: dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev a8d100a9-a4f7-40c5-b992-9fa7b68fd68f (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev a8d100a9-a4f7-40c5-b992-9fa7b68fd68f (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:10:30 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event a8d100a9-a4f7-40c5-b992-9fa7b68fd68f (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:10:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:10:31 np0005538515.localdomain sudo[322635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:10:31 np0005538515.localdomain sudo[322635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:10:31 np0005538515.localdomain sudo[322635]: pam_unix(sudo:session): session closed for user root
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 e271: 6 total, 6 up, 6 in
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: osdmap e271: 6 total, 6 up, 6 in
Nov 28 10:10:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "tenant_id": "a65552de119e4309a43e9e85b3f7e533", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-254686751, format:json, prefix:fs subvolume authorize, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, tenant_id:a65552de119e4309a43e9e85b3f7e533, vol_name:cephfs) < ""
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} v 0)
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-254686751 with tenant a65552de119e4309a43e9e85b3f7e533
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-254686751, format:json, prefix:fs subvolume authorize, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, tenant_id:a65552de119e4309a43e9e85b3f7e533, vol_name:cephfs) < ""
Nov 28 10:10:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v555: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 441 B/s rd, 100 KiB/s wr, 8 op/s
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:b6df0b61-7d52-4368-906d-590e5295b08d, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:b6df0b61-7d52-4368-906d-590e5295b08d, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:32.267+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'b6df0b61-7d52-4368-906d-590e5295b08d' of type subvolume
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'b6df0b61-7d52-4368-906d-590e5295b08d' of type subvolume
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b6df0b61-7d52-4368-906d-590e5295b08d, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/b6df0b61-7d52-4368-906d-590e5295b08d'' moved to trashcan
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b6df0b61-7d52-4368-906d-590e5295b08d, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "target_sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, target_sub_name:96f781a9-18ad-48a6-a288-5b831718a338, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 273 bytes to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp' to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.clone_index] tracking-id d904628f-63ce-454f-9e8e-ced6a9a698f0 for path b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 246 bytes to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp' to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, target_sub_name:96f781a9-18ad-48a6-a288-5b831718a338, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] cloning to subvolume path: /volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] starting clone: (cephfs, None, 96f781a9-18ad-48a6-a288-5b831718a338)
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:96f781a9-18ad-48a6-a288-5b831718a338, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "tenant_id": "a65552de119e4309a43e9e85b3f7e533", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: pgmap v555: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 441 B/s rd, 100 KiB/s wr, 8 op/s
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b6df0b61-7d52-4368-906d-590e5295b08d", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "target_sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:10:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:10:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:33.608 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v556: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 93 KiB/s wr, 7 op/s
Nov 28 10:10:34 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:10:34Z|00195|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 10:10:34 np0005538515.localdomain ceph-mon[301134]: pgmap v556: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 93 KiB/s wr, 7 op/s
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] Delayed cloning (cephfs, None, 96f781a9-18ad-48a6-a288-5b831718a338) -- by 0 seconds
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 277 bytes to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp'
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp' to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta'
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:96f781a9-18ad-48a6-a288-5b831718a338, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:35.450 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:10:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v557: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 93 KiB/s wr, 7 op/s
Nov 28 10:10:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:37 np0005538515.localdomain ceph-mon[301134]: pgmap v557: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 93 KiB/s wr, 7 op/s
Nov 28 10:10:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v558: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 97 KiB/s wr, 8 op/s
Nov 28 10:10:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:38.611 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:39 np0005538515.localdomain ceph-mon[301134]: pgmap v558: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 97 KiB/s wr, 8 op/s
Nov 28 10:10:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v559: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 97 KiB/s wr, 8 op/s
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] copying data from b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.snap/cdc57bbb-b20f-407d-9c9a-d134937d596e/c18e34d9-3414-43d5-bf95-d8340acbe301' to b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/05021c8f-ee51-4139-ac5a-e9446a1dbb7b'
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 274 bytes to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp' to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.clone_index] untracking d904628f-63ce-454f-9e8e-ced6a9a698f0
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp' to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 151 bytes to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta.tmp' to config b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338/.meta'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_cloner] finished clone: (cephfs, None, 96f781a9-18ad-48a6-a288-5b831718a338)
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:40.484 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "format": "json"}]: dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:40.690+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '204dc313-c4b7-4b7f-a2b9-7d12dcbb771e' of type subvolume
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '204dc313-c4b7-4b7f-a2b9-7d12dcbb771e' of type subvolume
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/204dc313-c4b7-4b7f-a2b9-7d12dcbb771e'' moved to trashcan
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:204dc313-c4b7-4b7f-a2b9-7d12dcbb771e, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/68e2b06e-b3f4-47c6-ba92-1f283cfd85db/.meta.tmp'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/68e2b06e-b3f4-47c6-ba92-1f283cfd85db/.meta.tmp' to config b'/volumes/_nogroup/68e2b06e-b3f4-47c6-ba92-1f283cfd85db/.meta'
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "format": "json"}]: dispatch
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, vol_name:cephfs) < ""
Nov 28 10:10:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-254686751, format:json, prefix:fs subvolume deauthorize, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} v 0)
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} v 0)
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-254686751, format:json, prefix:fs subvolume deauthorize, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-254686751, format:json, prefix:fs subvolume evict, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-254686751, client_metadata.root=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-254686751, format:json, prefix:fs subvolume evict, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: pgmap v559: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 97 KiB/s wr, 8 op/s
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"}]': finished
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:41.869+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3c672c69-0cef-413e-aa2f-cebe487d9fad' of type subvolume
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3c672c69-0cef-413e-aa2f-cebe487d9fad' of type subvolume
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v560: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 695 B/s rd, 193 KiB/s wr, 16 op/s
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad'' moved to trashcan
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3c672c69-0cef-413e-aa2f-cebe487d9fad, vol_name:cephfs) < ""
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, vol_name:cephfs) < ""
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/9f812932-0755-4fbf-a6f3-aea9e3a38b58/.meta.tmp'
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9f812932-0755-4fbf-a6f3-aea9e3a38b58/.meta.tmp' to config b'/volumes/_nogroup/9f812932-0755-4fbf-a6f3-aea9e3a38b58/.meta'
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, vol_name:cephfs) < ""
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, vol_name:cephfs) < ""
Nov 28 10:10:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, vol_name:cephfs) < ""
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "204dc313-c4b7-4b7f-a2b9-7d12dcbb771e", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "auth_id": "tempest-cephx-id-254686751", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3c672c69-0cef-413e-aa2f-cebe487d9fad", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: pgmap v560: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 695 B/s rd, 193 KiB/s wr, 16 op/s
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "format": "json"}]: dispatch
Nov 28 10:10:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:10:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:10:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:10:42 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:10:43 np0005538515.localdomain podman[322658]: 2025-11-28 10:10:43.045488986 +0000 UTC m=+0.136569817 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:10:43 np0005538515.localdomain podman[322658]: 2025-11-28 10:10:43.058518149 +0000 UTC m=+0.149599030 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:10:43 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:10:43 np0005538515.localdomain systemd[1]: tmp-crun.PsOkRQ.mount: Deactivated successfully.
Nov 28 10:10:43 np0005538515.localdomain podman[322655]: 2025-11-28 10:10:43.109259255 +0000 UTC m=+0.206877348 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:10:43 np0005538515.localdomain podman[322657]: 2025-11-28 10:10:43.010117314 +0000 UTC m=+0.103425994 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:10:43 np0005538515.localdomain podman[322657]: 2025-11-28 10:10:43.145429422 +0000 UTC m=+0.238738142 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:10:43 np0005538515.localdomain podman[322656]: 2025-11-28 10:10:43.155055048 +0000 UTC m=+0.249662938 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:10:43 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:10:43 np0005538515.localdomain podman[322655]: 2025-11-28 10:10:43.176638375 +0000 UTC m=+0.274256478 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:10:43 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:10:43 np0005538515.localdomain podman[322656]: 2025-11-28 10:10:43.203571756 +0000 UTC m=+0.298179696 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 10:10:43 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:10:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:43.614 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v561: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 596 B/s rd, 125 KiB/s wr, 10 op/s
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, vol_name:cephfs) < ""
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4afb8271-18a6-4a0b-8cb1-7414aa7c5267/.meta.tmp'
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4afb8271-18a6-4a0b-8cb1-7414aa7c5267/.meta.tmp' to config b'/volumes/_nogroup/4afb8271-18a6-4a0b-8cb1-7414aa7c5267/.meta'
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, vol_name:cephfs) < ""
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "format": "json"}]: dispatch
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, vol_name:cephfs) < ""
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, vol_name:cephfs) < ""
Nov 28 10:10:44 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:10:44 np0005538515.localdomain podman[322741]: 2025-11-28 10:10:44.982690466 +0000 UTC m=+0.079208756 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta.tmp'
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta.tmp' to config b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta'
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:10:44 np0005538515.localdomain ceph-mon[301134]: pgmap v561: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 596 B/s rd, 125 KiB/s wr, 10 op/s
Nov 28 10:10:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "format": "json"}]: dispatch
Nov 28 10:10:44 np0005538515.localdomain podman[322741]: 2025-11-28 10:10:44.994865543 +0000 UTC m=+0.091383823 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:10:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:10:45 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:10:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:10:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:45.522 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v562: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 125 KiB/s wr, 10 op/s
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:46.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:46.915+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9f812932-0755-4fbf-a6f3-aea9e3a38b58' of type subvolume
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9f812932-0755-4fbf-a6f3-aea9e3a38b58' of type subvolume
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, vol_name:cephfs) < ""
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/9f812932-0755-4fbf-a6f3-aea9e3a38b58'' moved to trashcan
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9f812932-0755-4fbf-a6f3-aea9e3a38b58, vol_name:cephfs) < ""
Nov 28 10:10:47 np0005538515.localdomain ceph-mon[301134]: pgmap v562: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 125 KiB/s wr, 10 op/s
Nov 28 10:10:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:47.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:47.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:47.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:10:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:47.259 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "format": "json"}]: dispatch
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:47 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:47.737+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4afb8271-18a6-4a0b-8cb1-7414aa7c5267' of type subvolume
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4afb8271-18a6-4a0b-8cb1-7414aa7c5267' of type subvolume
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, vol_name:cephfs) < ""
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4afb8271-18a6-4a0b-8cb1-7414aa7c5267'' moved to trashcan
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4afb8271-18a6-4a0b-8cb1-7414aa7c5267, vol_name:cephfs) < ""
Nov 28 10:10:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v563: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 167 KiB/s wr, 16 op/s
Nov 28 10:10:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "format": "json"}]: dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9f812932-0755-4fbf-a6f3-aea9e3a38b58", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05", "format": "json"}]: dispatch
Nov 28 10:10:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:feab8b53-fb10-403a-9e5e-ef10a5640c05, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:10:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:feab8b53-fb10-403a-9e5e-ef10a5640c05, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:10:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:48.617 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:49 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4afb8271-18a6-4a0b-8cb1-7414aa7c5267", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538515.localdomain ceph-mon[301134]: pgmap v563: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 167 KiB/s wr, 16 op/s
Nov 28 10:10:49 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05", "format": "json"}]: dispatch
Nov 28 10:10:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:49.257 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:49.258 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:10:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:10:49.593 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:10:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:49.594 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:10:49.595 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:10:49 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:10:49.595 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:10:49 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:10:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v564: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 128 KiB/s wr, 12 op/s
Nov 28 10:10:49 np0005538515.localdomain podman[322765]: 2025-11-28 10:10:49.956347802 +0000 UTC m=+0.067520895 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:10:49 np0005538515.localdomain podman[322765]: 2025-11-28 10:10:49.99874276 +0000 UTC m=+0.109915793 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 10:10:50 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:10:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:50.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:50.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:10:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:50.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:10:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:50.265 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:10:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:50.265 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:50.570 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:10:50.854 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:10:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:10:50.855 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:10:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:10:50.855 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "format": "json"}]: dispatch
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:50.967+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '68e2b06e-b3f4-47c6-ba92-1f283cfd85db' of type subvolume
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '68e2b06e-b3f4-47c6-ba92-1f283cfd85db' of type subvolume
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, vol_name:cephfs) < ""
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/68e2b06e-b3f4-47c6-ba92-1f283cfd85db'' moved to trashcan
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:68e2b06e-b3f4-47c6-ba92-1f283cfd85db, vol_name:cephfs) < ""
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta.tmp'
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta.tmp' to config b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta'
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "format": "json"}]: dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: pgmap v564: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 128 KiB/s wr, 12 op/s
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:51.260 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "format": "json"}]: dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:df1f50b2-116f-4913-b966-9e6fb632edd2, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:df1f50b2-116f-4913-b966-9e6fb632edd2, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:51.681+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'df1f50b2-116f-4913-b966-9e6fb632edd2' of type subvolume
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'df1f50b2-116f-4913-b966-9e6fb632edd2' of type subvolume
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:df1f50b2-116f-4913-b966-9e6fb632edd2, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/df1f50b2-116f-4913-b966-9e6fb632edd2'' moved to trashcan
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:df1f50b2-116f-4913-b966-9e6fb632edd2, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:619d1399-3e67-47c1-b13e-c8a98f88c137, vol_name:cephfs) < ""
Nov 28 10:10:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v565: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 205 KiB/s wr, 18 op/s
Nov 28 10:10:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/619d1399-3e67-47c1-b13e-c8a98f88c137/.meta.tmp'
Nov 28 10:10:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/619d1399-3e67-47c1-b13e-c8a98f88c137/.meta.tmp' to config b'/volumes/_nogroup/619d1399-3e67-47c1-b13e-c8a98f88c137/.meta'
Nov 28 10:10:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:619d1399-3e67-47c1-b13e-c8a98f88c137, vol_name:cephfs) < ""
Nov 28 10:10:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:619d1399-3e67-47c1-b13e-c8a98f88c137, vol_name:cephfs) < ""
Nov 28 10:10:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:619d1399-3e67-47c1-b13e-c8a98f88c137, vol_name:cephfs) < ""
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "68e2b06e-b3f4-47c6-ba92-1f283cfd85db", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1097277396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "df1f50b2-116f-4913-b966-9e6fb632edd2", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: pgmap v565: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 205 KiB/s wr, 18 op/s
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "format": "json"}]: dispatch
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4265101259' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.264 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.264 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.265 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.265 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.266 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.620 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:10:53 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1858069606' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.733 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:10:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v566: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 120 KiB/s wr, 11 op/s
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.944 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.945 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11487MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.945 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:10:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:53.946 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.184 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.185 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:10:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1858069606' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace", "format": "json"}]: dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:8728644f-8752-4fbc-9dba-996bfb595ace, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.379 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:8728644f-8752-4fbc-9dba-996bfb595ace, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f288b089-d265-4559-9acd-a03615016513, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f288b089-d265-4559-9acd-a03615016513/.meta.tmp'
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f288b089-d265-4559-9acd-a03615016513/.meta.tmp' to config b'/volumes/_nogroup/f288b089-d265-4559-9acd-a03615016513/.meta'
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f288b089-d265-4559-9acd-a03615016513, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "format": "json"}]: dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f288b089-d265-4559-9acd-a03615016513, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f288b089-d265-4559-9acd-a03615016513, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/.meta.tmp'
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/.meta.tmp' to config b'/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/.meta'
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "format": "json"}]: dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:10:54 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2557878891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.890 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.895 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.910 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.911 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.912 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:10:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:54.912 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:54 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:10:54 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:54 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: pgmap v566: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 120 KiB/s wr, 11 op/s
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2557878891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:619d1399-3e67-47c1-b13e-c8a98f88c137, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:619d1399-3e67-47c1-b13e-c8a98f88c137, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:55 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:55.310+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '619d1399-3e67-47c1-b13e-c8a98f88c137' of type subvolume
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '619d1399-3e67-47c1-b13e-c8a98f88c137' of type subvolume
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:619d1399-3e67-47c1-b13e-c8a98f88c137, vol_name:cephfs) < ""
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/619d1399-3e67-47c1-b13e-c8a98f88c137'' moved to trashcan
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:619d1399-3e67-47c1-b13e-c8a98f88c137, vol_name:cephfs) < ""
Nov 28 10:10:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:55.623 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v567: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 120 KiB/s wr, 11 op/s
Nov 28 10:10:56 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "619d1399-3e67-47c1-b13e-c8a98f88c137", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:10:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:56.925 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:10:57 np0005538515.localdomain ceph-mon[301134]: pgmap v567: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 120 KiB/s wr, 11 op/s
Nov 28 10:10:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/646999688' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:10:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:10:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:10:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:10:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f288b089-d265-4559-9acd-a03615016513", "format": "json"}]: dispatch
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f288b089-d265-4559-9acd-a03615016513, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f288b089-d265-4559-9acd-a03615016513, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f288b089-d265-4559-9acd-a03615016513' of type subvolume
Nov 28 10:10:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:10:57.694+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f288b089-d265-4559-9acd-a03615016513' of type subvolume
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f288b089-d265-4559-9acd-a03615016513, vol_name:cephfs) < ""
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f288b089-d265-4559-9acd-a03615016513'' moved to trashcan
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f288b089-d265-4559-9acd-a03615016513, vol_name:cephfs) < ""
Nov 28 10:10:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v568: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 171 KiB/s wr, 16 op/s
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace_609d618f-0ee5-41e8-beae-89d616159cd4", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8728644f-8752-4fbc-9dba-996bfb595ace_609d618f-0ee5-41e8-beae-89d616159cd4, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta.tmp'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta.tmp' to config b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8728644f-8752-4fbc-9dba-996bfb595ace_609d618f-0ee5-41e8-beae-89d616159cd4, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8728644f-8752-4fbc-9dba-996bfb595ace, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta.tmp'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta.tmp' to config b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95/.meta'
Nov 28 10:10:58 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2010999762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8728644f-8752-4fbc-9dba-996bfb595ace, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 28 10:10:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:10:58 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:10:58 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:10:58.622 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/.meta.tmp'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/.meta.tmp' to config b'/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/.meta'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/47d6dddf-401e-4ff2-980a-f34a0aa62099/.meta.tmp'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/47d6dddf-401e-4ff2-980a-f34a0aa62099/.meta.tmp' to config b'/volumes/_nogroup/47d6dddf-401e-4ff2-980a-f34a0aa62099/.meta'
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:10:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "format": "json"}]: dispatch
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, vol_name:cephfs) < ""
Nov 28 10:10:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:10:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:10:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:10:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19241 "" "Go-http-client/1.1"
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f288b089-d265-4559-9acd-a03615016513", "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f288b089-d265-4559-9acd-a03615016513", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: pgmap v568: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 171 KiB/s wr, 16 op/s
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace_609d618f-0ee5-41e8-beae-89d616159cd4", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "snap_name": "8728644f-8752-4fbc-9dba-996bfb595ace", "force": true, "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:10:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v569: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 128 KiB/s wr, 10 op/s
Nov 28 10:11:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e272 e272: 6 total, 6 up, 6 in
Nov 28 10:11:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "format": "json"}]: dispatch
Nov 28 10:11:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:00.646 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:11:00 np0005538515.localdomain podman[322831]: 2025-11-28 10:11:00.981487083 +0000 UTC m=+0.092488016 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git)
Nov 28 10:11:00 np0005538515.localdomain podman[322831]: 2025-11-28 10:11:00.992805953 +0000 UTC m=+0.103806916 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:11:01 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: pgmap v569: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 128 KiB/s wr, 10 op/s
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: osdmap e272: 6 total, 6 up, 6 in
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "format": "json"}]: dispatch
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:01 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:01.475+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'e15fa73c-5a04-4afe-898a-d761ebf88b95' of type subvolume
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'e15fa73c-5a04-4afe-898a-d761ebf88b95' of type subvolume
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/e15fa73c-5a04-4afe-898a-d761ebf88b95'' moved to trashcan
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:e15fa73c-5a04-4afe-898a-d761ebf88b95, vol_name:cephfs) < ""
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:01 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v571: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:02.214+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '47d6dddf-401e-4ff2-980a-f34a0aa62099' of type subvolume
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '47d6dddf-401e-4ff2-980a-f34a0aa62099' of type subvolume
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, vol_name:cephfs) < ""
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/47d6dddf-401e-4ff2-980a-f34a0aa62099'' moved to trashcan
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:47d6dddf-401e-4ff2-980a-f34a0aa62099, vol_name:cephfs) < ""
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e15fa73c-5a04-4afe-898a-d761ebf88b95", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:02 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:03 np0005538515.localdomain ceph-mon[301134]: pgmap v571: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "format": "json"}]: dispatch
Nov 28 10:11:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "47d6dddf-401e-4ff2-980a-f34a0aa62099", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:96f781a9-18ad-48a6-a288-5b831718a338, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:03.624 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v572: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:04.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:04.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:11:04 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:96f781a9-18ad-48a6-a288-5b831718a338, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:96f781a9-18ad-48a6-a288-5b831718a338, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:96f781a9-18ad-48a6-a288-5b831718a338, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mon[301134]: pgmap v572: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fef85cbb-0b69-4eb4-8353-b0bae82d0d83/.meta.tmp'
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fef85cbb-0b69-4eb4-8353-b0bae82d0d83/.meta.tmp' to config b'/volumes/_nogroup/fef85cbb-0b69-4eb4-8353-b0bae82d0d83/.meta'
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "format": "json"}]: dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:11:05
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_metadata', 'manila_data', '.mgr', 'volumes', 'images', 'vms', 'backups']
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:11:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:05.686 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:05 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v573: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 2.9989356504745952e-06 of space, bias 1.0, pg target 0.0005967881944444444 quantized to 32 (current 32)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0008502436953284943 of space, bias 4.0, pg target 0.6767939814814814 quantized to 16 (current 16)
Nov 28 10:11:05 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/eb97c54f-5603-4ed1-8403-ba162352ea4f/.meta.tmp'
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/eb97c54f-5603-4ed1-8403-ba162352ea4f/.meta.tmp' to config b'/volumes/_nogroup/eb97c54f-5603-4ed1-8403-ba162352ea4f/.meta'
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "format": "json"}]: dispatch
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e273 e273: 6 total, 6 up, 6 in
Nov 28 10:11:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: pgmap v573: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 159 KiB/s wr, 14 op/s
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: osdmap e273: 6 total, 6 up, 6 in
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:07.644+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9342293c-12c1-4a10-bd1e-8eba9e15cd79' of type subvolume
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9342293c-12c1-4a10-bd1e-8eba9e15cd79' of type subvolume
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79'' moved to trashcan
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9342293c-12c1-4a10-bd1e-8eba9e15cd79, vol_name:cephfs) < ""
Nov 28 10:11:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v575: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 199 KiB/s wr, 18 op/s
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9342293c-12c1-4a10-bd1e-8eba9e15cd79", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:08.625 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:08 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/.meta.tmp'
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/.meta.tmp' to config b'/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/.meta'
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:08.980+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fef85cbb-0b69-4eb4-8353-b0bae82d0d83' of type subvolume
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fef85cbb-0b69-4eb4-8353-b0bae82d0d83' of type subvolume
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, vol_name:cephfs) < ""
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/fef85cbb-0b69-4eb4-8353-b0bae82d0d83'' moved to trashcan
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fef85cbb-0b69-4eb4-8353-b0bae82d0d83, vol_name:cephfs) < ""
Nov 28 10:11:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:11:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: pgmap v575: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 199 KiB/s wr, 18 op/s
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:09 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v576: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 167 KiB/s wr, 15 op/s
Nov 28 10:11:10 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fef85cbb-0b69-4eb4-8353-b0bae82d0d83", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "new_size": 2147483648, "format": "json"}]: dispatch
Nov 28 10:11:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:10.737 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:11 np0005538515.localdomain ceph-mon[301134]: pgmap v576: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 167 KiB/s wr, 15 op/s
Nov 28 10:11:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v577: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'eb97c54f-5603-4ed1-8403-ba162352ea4f' of type subvolume
Nov 28 10:11:12 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:12.449+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'eb97c54f-5603-4ed1-8403-ba162352ea4f' of type subvolume
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/eb97c54f-5603-4ed1-8403-ba162352ea4f'' moved to trashcan
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:eb97c54f-5603-4ed1-8403-ba162352ea4f, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0b7976e6-bdd9-4983-8639-f0b5b8a68920/.meta.tmp'
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0b7976e6-bdd9-4983-8639-f0b5b8a68920/.meta.tmp' to config b'/volumes/_nogroup/0b7976e6-bdd9-4983-8639-f0b5b8a68920/.meta'
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, vol_name:cephfs) < ""
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: pgmap v577: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "eb97c54f-5603-4ed1-8403-ba162352ea4f", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:11:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:13.628 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:13 np0005538515.localdomain podman[322855]: 2025-11-28 10:11:13.633390465 +0000 UTC m=+0.074926774 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: tmp-crun.O6uj6M.mount: Deactivated successfully.
Nov 28 10:11:13 np0005538515.localdomain podman[322866]: 2025-11-28 10:11:13.690808517 +0000 UTC m=+0.119209530 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:11:13 np0005538515.localdomain podman[322855]: 2025-11-28 10:11:13.701480727 +0000 UTC m=+0.143017036 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:11:13 np0005538515.localdomain podman[322856]: 2025-11-28 10:11:13.665482436 +0000 UTC m=+0.096996575 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 10:11:13 np0005538515.localdomain podman[322854]: 2025-11-28 10:11:13.746957191 +0000 UTC m=+0.186241210 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:11:13 np0005538515.localdomain podman[322856]: 2025-11-28 10:11:13.750724508 +0000 UTC m=+0.182238677 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:11:13 np0005538515.localdomain podman[322866]: 2025-11-28 10:11:13.774443289 +0000 UTC m=+0.202844312 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:11:13 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "format": "json"}]: dispatch
Nov 28 10:11:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/281311810' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:11:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/281311810' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:11:13 np0005538515.localdomain podman[322854]: 2025-11-28 10:11:13.785441289 +0000 UTC m=+0.224725298 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:11:13 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:11:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v578: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:14 np0005538515.localdomain ceph-mon[301134]: pgmap v578: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:15.768 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:15 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:15 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:11:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v579: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:15 np0005538515.localdomain podman[322935]: 2025-11-28 10:11:15.981564063 +0000 UTC m=+0.087392229 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:11:15 np0005538515.localdomain podman[322935]: 2025-11-28 10:11:15.990287342 +0000 UTC m=+0.096115498 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:11:16 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0b7976e6-bdd9-4983-8639-f0b5b8a68920' of type subvolume
Nov 28 10:11:16 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:16.102+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0b7976e6-bdd9-4983-8639-f0b5b8a68920' of type subvolume
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/0b7976e6-bdd9-4983-8639-f0b5b8a68920'' moved to trashcan
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0b7976e6-bdd9-4983-8639-f0b5b8a68920, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1d0b6ffa-5038-4546-af4f-2ad9a9443222' of type subvolume
Nov 28 10:11:16 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:16.530+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1d0b6ffa-5038-4546-af4f-2ad9a9443222' of type subvolume
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222'' moved to trashcan
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1d0b6ffa-5038-4546-af4f-2ad9a9443222, vol_name:cephfs) < ""
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: pgmap v579: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 142 KiB/s wr, 13 op/s
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0b7976e6-bdd9-4983-8639-f0b5b8a68920", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "format": "json"}]: dispatch
Nov 28 10:11:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1d0b6ffa-5038-4546-af4f-2ad9a9443222", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v580: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 640 B/s rd, 170 KiB/s wr, 14 op/s
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/671f3f7b-b41c-49c8-8917-acdf4c0a35f5/.meta.tmp'
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/671f3f7b-b41c-49c8-8917-acdf4c0a35f5/.meta.tmp' to config b'/volumes/_nogroup/671f3f7b-b41c-49c8-8917-acdf4c0a35f5/.meta'
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "format": "json"}]: dispatch
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:18.629 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:11:18 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:18 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:11:18 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: pgmap v580: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 640 B/s rd, 170 KiB/s wr, 14 op/s
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/.meta.tmp'
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/.meta.tmp' to config b'/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/.meta'
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:09b6c71b-6f5f-4037-840b-757a404b81fe, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/09b6c71b-6f5f-4037-840b-757a404b81fe/.meta.tmp'
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/09b6c71b-6f5f-4037-840b-757a404b81fe/.meta.tmp' to config b'/volumes/_nogroup/09b6c71b-6f5f-4037-840b-757a404b81fe/.meta'
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:09b6c71b-6f5f-4037-840b-757a404b81fe, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "format": "json"}]: dispatch
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:09b6c71b-6f5f-4037-840b-757a404b81fe, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:09b6c71b-6f5f-4037-840b-757a404b81fe, vol_name:cephfs) < ""
Nov 28 10:11:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v581: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 158 KiB/s wr, 13 op/s
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "format": "json"}]: dispatch
Nov 28 10:11:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:20.798 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:20 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:11:20 np0005538515.localdomain podman[322959]: 2025-11-28 10:11:20.974277816 +0000 UTC m=+0.081567890 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Nov 28 10:11:20 np0005538515.localdomain podman[322959]: 2025-11-28 10:11:20.984952255 +0000 UTC m=+0.092242339 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:11:20 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:11:21 np0005538515.localdomain ceph-mon[301134]: pgmap v581: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 158 KiB/s wr, 13 op/s
Nov 28 10:11:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Nov 28 10:11:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:1073741824, no_shrink:True, prefix:fs subvolume resize, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:1073741824, no_shrink:True, prefix:fs subvolume resize, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v582: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 209 KiB/s wr, 18 op/s
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:22 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:09b6c71b-6f5f-4037-840b-757a404b81fe, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:09b6c71b-6f5f-4037-840b-757a404b81fe, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:23.030+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '09b6c71b-6f5f-4037-840b-757a404b81fe' of type subvolume
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '09b6c71b-6f5f-4037-840b-757a404b81fe' of type subvolume
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:09b6c71b-6f5f-4037-840b-757a404b81fe, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/09b6c71b-6f5f-4037-840b-757a404b81fe'' moved to trashcan
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:09b6c71b-6f5f-4037-840b-757a404b81fe, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ea36305f-cfab-4e87-868a-2ee1b584f374, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: pgmap v582: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 209 KiB/s wr, 18 op/s
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ea36305f-cfab-4e87-868a-2ee1b584f374/.meta.tmp'
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ea36305f-cfab-4e87-868a-2ee1b584f374/.meta.tmp' to config b'/volumes/_nogroup/ea36305f-cfab-4e87-868a-2ee1b584f374/.meta'
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ea36305f-cfab-4e87-868a-2ee1b584f374, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "format": "json"}]: dispatch
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ea36305f-cfab-4e87-868a-2ee1b584f374, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ea36305f-cfab-4e87-868a-2ee1b584f374, vol_name:cephfs) < ""
Nov 28 10:11:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:23.632 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v583: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 143 KiB/s wr, 12 op/s
Nov 28 10:11:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "09b6c71b-6f5f-4037-840b-757a404b81fe", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:24 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:24.622+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '671f3f7b-b41c-49c8-8917-acdf4c0a35f5' of type subvolume
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '671f3f7b-b41c-49c8-8917-acdf4c0a35f5' of type subvolume
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/671f3f7b-b41c-49c8-8917-acdf4c0a35f5'' moved to trashcan
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:671f3f7b-b41c-49c8-8917-acdf4c0a35f5, vol_name:cephfs) < ""
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: pgmap v583: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 143 KiB/s wr, 12 op/s
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "format": "json"}]: dispatch
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "671f3f7b-b41c-49c8-8917-acdf4c0a35f5", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:25.833 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v584: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 143 KiB/s wr, 12 op/s
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:11:25 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05_4a989d3d-660f-4fdb-879a-341eac23ba00", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:feab8b53-fb10-403a-9e5e-ef10a5640c05_4a989d3d-660f-4fdb-879a-341eac23ba00, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta.tmp'
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta.tmp' to config b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta'
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:feab8b53-fb10-403a-9e5e-ef10a5640c05_4a989d3d-660f-4fdb-879a-341eac23ba00, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:feab8b53-fb10-403a-9e5e-ef10a5640c05, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta.tmp'
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta.tmp' to config b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68/.meta'
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:feab8b53-fb10-403a-9e5e-ef10a5640c05, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1daffb16-bcf6-4808-941d-7da7540d99dc, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1daffb16-bcf6-4808-941d-7da7540d99dc, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:26.695+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1daffb16-bcf6-4808-941d-7da7540d99dc' of type subvolume
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1daffb16-bcf6-4808-941d-7da7540d99dc' of type subvolume
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc'' moved to trashcan
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1daffb16-bcf6-4808-941d-7da7540d99dc, vol_name:cephfs) < ""
Nov 28 10:11:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ea36305f-cfab-4e87-868a-2ee1b584f374, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: pgmap v584: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 143 KiB/s wr, 12 op/s
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05_4a989d3d-660f-4fdb-879a-341eac23ba00", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "snap_name": "feab8b53-fb10-403a-9e5e-ef10a5640c05", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ea36305f-cfab-4e87-868a-2ee1b584f374, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:27 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:27.143+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ea36305f-cfab-4e87-868a-2ee1b584f374' of type subvolume
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ea36305f-cfab-4e87-868a-2ee1b584f374' of type subvolume
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ea36305f-cfab-4e87-868a-2ee1b584f374, vol_name:cephfs) < ""
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ea36305f-cfab-4e87-868a-2ee1b584f374'' moved to trashcan
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ea36305f-cfab-4e87-868a-2ee1b584f374, vol_name:cephfs) < ""
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:11:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:11:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v585: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 241 KiB/s wr, 20 op/s
Nov 28 10:11:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1daffb16-bcf6-4808-941d-7da7540d99dc", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea36305f-cfab-4e87-868a-2ee1b584f374", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, vol_name:cephfs) < ""
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ac978b2f-998a-4925-a071-fbc679b9c4b6/.meta.tmp'
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ac978b2f-998a-4925-a071-fbc679b9c4b6/.meta.tmp' to config b'/volumes/_nogroup/ac978b2f-998a-4925-a071-fbc679b9c4b6/.meta'
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, vol_name:cephfs) < ""
Nov 28 10:11:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:28.633 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "format": "json"}]: dispatch
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, vol_name:cephfs) < ""
Nov 28 10:11:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, vol_name:cephfs) < ""
Nov 28 10:11:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:11:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:11:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:11:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:11:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:11:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19245 "" "Go-http-client/1.1"
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: pgmap v585: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 241 KiB/s wr, 20 op/s
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9861e523-796e-4848-a7e0-e4ce88058d68, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9861e523-796e-4848-a7e0-e4ce88058d68, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:29.855+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9861e523-796e-4848-a7e0-e4ce88058d68' of type subvolume
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9861e523-796e-4848-a7e0-e4ce88058d68' of type subvolume
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/9861e523-796e-4848-a7e0-e4ce88058d68'' moved to trashcan
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9861e523-796e-4848-a7e0-e4ce88058d68, vol_name:cephfs) < ""
Nov 28 10:11:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v586: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 149 KiB/s wr, 13 op/s
Nov 28 10:11:30 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:11:30.070 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:11:30 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:11:30.072 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:11:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:30.071 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.231206) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690231251, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 255, "total_data_size": 3139472, "memory_usage": 3192704, "flush_reason": "Manual Compaction"}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690248709, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 2054274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27675, "largest_seqno": 30028, "table_properties": {"data_size": 2044761, "index_size": 5574, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25919, "raw_average_key_size": 22, "raw_value_size": 2023509, "raw_average_value_size": 1764, "num_data_blocks": 240, "num_entries": 1147, "num_filter_entries": 1147, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324591, "oldest_key_time": 1764324591, "file_creation_time": 1764324690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 17557 microseconds, and 6603 cpu microseconds.
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.248759) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 2054274 bytes OK
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.248787) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.251202) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.251226) EVENT_LOG_v1 {"time_micros": 1764324690251220, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.251249) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 3127799, prev total WAL file size 3127799, number of live WAL files 2.
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.252329) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(2006KB)], [42(17MB)]
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690252402, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 20441057, "oldest_snapshot_seqno": -1}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 13960 keys, 18920322 bytes, temperature: kUnknown
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690394984, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 18920322, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18839438, "index_size": 44879, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34949, "raw_key_size": 372407, "raw_average_key_size": 26, "raw_value_size": 18601287, "raw_average_value_size": 1332, "num_data_blocks": 1690, "num_entries": 13960, "num_filter_entries": 13960, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.395357) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 18920322 bytes
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.397605) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 143.2 rd, 132.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 17.5 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(19.2) write-amplify(9.2) OK, records in: 14497, records dropped: 537 output_compression: NoCompression
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.397633) EVENT_LOG_v1 {"time_micros": 1764324690397620, "job": 24, "event": "compaction_finished", "compaction_time_micros": 142745, "compaction_time_cpu_micros": 52295, "output_level": 6, "num_output_files": 1, "total_output_size": 18920322, "num_input_records": 14497, "num_output_records": 13960, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690398034, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690400473, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.252181) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.400597) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.400604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.400608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.400611) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:30.400613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:783fcf38-4096-452b-b965-78e606bf4fa1, vol_name:cephfs) < ""
Nov 28 10:11:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:30.886 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/783fcf38-4096-452b-b965-78e606bf4fa1/.meta.tmp'
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/783fcf38-4096-452b-b965-78e606bf4fa1/.meta.tmp' to config b'/volumes/_nogroup/783fcf38-4096-452b-b965-78e606bf4fa1/.meta'
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:783fcf38-4096-452b-b965-78e606bf4fa1, vol_name:cephfs) < ""
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "format": "json"}]: dispatch
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:783fcf38-4096-452b-b965-78e606bf4fa1, vol_name:cephfs) < ""
Nov 28 10:11:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:783fcf38-4096-452b-b965-78e606bf4fa1, vol_name:cephfs) < ""
Nov 28 10:11:31 np0005538515.localdomain sudo[322981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:11:31 np0005538515.localdomain sudo[322981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:11:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e274 e274: 6 total, 6 up, 6 in
Nov 28 10:11:31 np0005538515.localdomain sudo[322981]: pam_unix(sudo:session): session closed for user root
Nov 28 10:11:31 np0005538515.localdomain sudo[323005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:11:31 np0005538515.localdomain sudo[323005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:11:31 np0005538515.localdomain podman[322998]: 2025-11-28 10:11:31.340155207 +0000 UTC m=+0.088615756 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git)
Nov 28 10:11:31 np0005538515.localdomain podman[322998]: 2025-11-28 10:11:31.353305383 +0000 UTC m=+0.101765922 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as 
a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Nov 28 10:11:31 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "format": "json"}]: dispatch
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9861e523-796e-4848-a7e0-e4ce88058d68", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: pgmap v586: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 149 KiB/s wr, 13 op/s
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: osdmap e274: 6 total, 6 up, 6 in
Nov 28 10:11:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v588: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:31 np0005538515.localdomain sudo[323005]: pam_unix(sudo:session): session closed for user root
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 752b3d70-42ae-49ea-a3fa-11cef9b83bd0 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 752b3d70-42ae-49ea-a3fa-11cef9b83bd0 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 752b3d70-42ae-49ea-a3fa-11cef9b83bd0 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain sudo[323067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:11:32 np0005538515.localdomain sudo[323067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:11:32 np0005538515.localdomain sudo[323067]: pam_unix(sudo:session): session closed for user root
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:11:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: pgmap v588: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:11:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:33.636 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v589: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:34 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:11:34.073 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:11:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "format": "json"}]: dispatch
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:34 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:34.523+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ac978b2f-998a-4925-a071-fbc679b9c4b6' of type subvolume
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ac978b2f-998a-4925-a071-fbc679b9c4b6' of type subvolume
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, vol_name:cephfs) < ""
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ac978b2f-998a-4925-a071-fbc679b9c4b6'' moved to trashcan
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ac978b2f-998a-4925-a071-fbc679b9c4b6, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:783fcf38-4096-452b-b965-78e606bf4fa1, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:783fcf38-4096-452b-b965-78e606bf4fa1, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:35.076+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '783fcf38-4096-452b-b965-78e606bf4fa1' of type subvolume
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '783fcf38-4096-452b-b965-78e606bf4fa1' of type subvolume
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:783fcf38-4096-452b-b965-78e606bf4fa1, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/783fcf38-4096-452b-b965-78e606bf4fa1'' moved to trashcan
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:783fcf38-4096-452b-b965-78e606bf4fa1, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: pgmap v589: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ac978b2f-998a-4925-a071-fbc679b9c4b6", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc6baa3820>)]
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:11:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:35.918 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v590: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:36 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "format": "json"}]: dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "783fcf38-4096-452b-b965-78e606bf4fa1", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770544) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696770607, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 443, "num_deletes": 259, "total_data_size": 282092, "memory_usage": 292024, "flush_reason": "Manual Compaction"}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696775323, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 185011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30033, "largest_seqno": 30471, "table_properties": {"data_size": 182545, "index_size": 513, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6728, "raw_average_key_size": 18, "raw_value_size": 177115, "raw_average_value_size": 498, "num_data_blocks": 23, "num_entries": 355, "num_filter_entries": 355, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324690, "oldest_key_time": 1764324690, "file_creation_time": 1764324696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 4844 microseconds, and 2011 cpu microseconds.
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.775392) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 185011 bytes OK
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.775425) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.778160) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.778189) EVENT_LOG_v1 {"time_micros": 1764324696778182, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.778218) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 279188, prev total WAL file size 287987, number of live WAL files 2.
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.779656) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323734' seq:72057594037927935, type:22 .. '6C6F676D0034353239' seq:0, type:0; will stop at (end)
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(180KB)], [45(18MB)]
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696779710, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 19105333, "oldest_snapshot_seqno": -1}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 13774 keys, 18715457 bytes, temperature: kUnknown
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696907199, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 18715457, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18635965, "index_size": 43935, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34501, "raw_key_size": 369648, "raw_average_key_size": 26, "raw_value_size": 18401186, "raw_average_value_size": 1335, "num_data_blocks": 1643, "num_entries": 13774, "num_filter_entries": 13774, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.907693) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 18715457 bytes
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.909956) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.6 rd, 146.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.0 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(204.4) write-amplify(101.2) OK, records in: 14315, records dropped: 541 output_compression: NoCompression
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.909995) EVENT_LOG_v1 {"time_micros": 1764324696909979, "job": 26, "event": "compaction_finished", "compaction_time_micros": 127706, "compaction_time_cpu_micros": 56630, "output_level": 6, "num_output_files": 1, "total_output_size": 18715457, "num_input_records": 14315, "num_output_records": 13774, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696910228, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696913773, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.779567) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.913816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.913824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.913828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.913832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:36 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:11:36.913836) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:11:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:37 np0005538515.localdomain ceph-mon[301134]: pgmap v590: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 211 KiB/s wr, 17 op/s
Nov 28 10:11:37 np0005538515.localdomain ceph-mon[301134]: mgrmap e51: np0005538515.yfkzhl(active, since 13m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:96f781a9-18ad-48a6-a288-5b831718a338, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:96f781a9-18ad-48a6-a288-5b831718a338, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:96f781a9-18ad-48a6-a288-5b831718a338, vol_name:cephfs) < ""
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/96f781a9-18ad-48a6-a288-5b831718a338'' moved to trashcan
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:96f781a9-18ad-48a6-a288-5b831718a338, vol_name:cephfs) < ""
Nov 28 10:11:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v591: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 210 KiB/s wr, 17 op/s
Nov 28 10:11:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:38.638 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96f781a9-18ad-48a6-a288-5b831718a338", "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "96f781a9-18ad-48a6-a288-5b831718a338", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: pgmap v591: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 210 KiB/s wr, 17 op/s
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:11:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:11:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v592: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 210 KiB/s wr, 17 op/s
Nov 28 10:11:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:40.957 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e_d5b0525c-04f8-4742-a7f8-f2e936812088", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e_d5b0525c-04f8-4742-a7f8-f2e936812088, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp'
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp' to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta'
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e_d5b0525c-04f8-4742-a7f8-f2e936812088, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:11:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp'
Nov 28 10:11:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta.tmp' to config b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a/.meta'
Nov 28 10:11:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:cdc57bbb-b20f-407d-9c9a-d134937d596e, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:11:41 np0005538515.localdomain ceph-mon[301134]: pgmap v592: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 210 KiB/s wr, 17 op/s
Nov 28 10:11:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e275 e275: 6 total, 6 up, 6 in
Nov 28 10:11:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v594: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 189 KiB/s wr, 16 op/s
Nov 28 10:11:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e_d5b0525c-04f8-4742-a7f8-f2e936812088", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "snap_name": "cdc57bbb-b20f-407d-9c9a-d134937d596e", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: osdmap e275: 6 total, 6 up, 6 in
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:43 np0005538515.localdomain ceph-mon[301134]: pgmap v594: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 189 KiB/s wr, 16 op/s
Nov 28 10:11:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:43.640 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:11:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:11:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:11:43 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:11:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v595: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 189 KiB/s wr, 16 op/s
Nov 28 10:11:43 np0005538515.localdomain podman[323089]: 2025-11-28 10:11:43.994781303 +0000 UTC m=+0.094162298 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 28 10:11:44 np0005538515.localdomain podman[323089]: 2025-11-28 10:11:44.005389929 +0000 UTC m=+0.104770954 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 28 10:11:44 np0005538515.localdomain podman[323091]: 2025-11-28 10:11:44.038397839 +0000 UTC m=+0.133009707 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:11:44 np0005538515.localdomain podman[323091]: 2025-11-28 10:11:44.043538537 +0000 UTC m=+0.138150505 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:11:44 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:11:44 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:11:44 np0005538515.localdomain systemd[1]: tmp-crun.7JTEMD.mount: Deactivated successfully.
Nov 28 10:11:44 np0005538515.localdomain podman[323090]: 2025-11-28 10:11:44.10126588 +0000 UTC m=+0.201166051 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:11:44 np0005538515.localdomain podman[323092]: 2025-11-28 10:11:44.159191787 +0000 UTC m=+0.249243165 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:11:44 np0005538515.localdomain podman[323090]: 2025-11-28 10:11:44.164195942 +0000 UTC m=+0.264096063 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 10:11:44 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:11:44 np0005538515.localdomain podman[323092]: 2025-11-28 10:11:44.22242757 +0000 UTC m=+0.312478948 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:11:44 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "format": "json"}]: dispatch
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:223d1419-7407-477a-a3da-e408c7b6c43a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:223d1419-7407-477a-a3da-e408c7b6c43a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '223d1419-7407-477a-a3da-e408c7b6c43a' of type subvolume
Nov 28 10:11:44 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:44.262+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '223d1419-7407-477a-a3da-e408c7b6c43a' of type subvolume
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/223d1419-7407-477a-a3da-e408c7b6c43a'' moved to trashcan
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:223d1419-7407-477a-a3da-e408c7b6c43a, vol_name:cephfs) < ""
Nov 28 10:11:44 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e276 e276: 6 total, 6 up, 6 in
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: pgmap v595: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 189 KiB/s wr, 16 op/s
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "223d1419-7407-477a-a3da-e408c7b6c43a", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v597: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 89 KiB/s wr, 8 op/s
Nov 28 10:11:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:45.989 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "format": "json"}]: dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: osdmap e276: 6 total, 6 up, 6 in
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:46 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:11:46 np0005538515.localdomain systemd[1]: tmp-crun.Dus8Mw.mount: Deactivated successfully.
Nov 28 10:11:46 np0005538515.localdomain podman[323178]: 2025-11-28 10:11:46.990673154 +0000 UTC m=+0.093632271 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:11:47 np0005538515.localdomain podman[323178]: 2025-11-28 10:11:47.001715125 +0000 UTC m=+0.104674222 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:11:47 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:11:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:47.387 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:47.408 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta.tmp'
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta.tmp' to config b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta'
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: pgmap v597: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 89 KiB/s wr, 8 op/s
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:11:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v598: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 222 KiB/s wr, 20 op/s
Nov 28 10:11:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:11:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "format": "json"}]: dispatch
Nov 28 10:11:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:48.642 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:48 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:48 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:48 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:48 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:49.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9", "format": "json"}]: dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:f5b4c96d-e791-4ea7-b052-7e31c47893d9, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:f5b4c96d-e791-4ea7-b052-7e31c47893d9, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-459400664 with tenant 1562d9ae673b4a5ea5a1a571bd0ea2c8
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: pgmap v598: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 222 KiB/s wr, 20 op/s
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:49 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume authorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, tenant_id:1562d9ae673b4a5ea5a1a571bd0ea2c8, vol_name:cephfs) < ""
Nov 28 10:11:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v599: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 626 B/s rd, 130 KiB/s wr, 12 op/s
Nov 28 10:11:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "tenant_id": "1562d9ae673b4a5ea5a1a571bd0ea2c8", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4", "format": "json"}]: dispatch
Nov 28 10:11:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:640bfb92-ee5b-44bb-be51-eab8b7a02ed4, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:640bfb92-ee5b-44bb-be51-eab8b7a02ed4, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:50 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:11:50.800 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:11:50Z, description=, device_id=4350f7ee-36aa-45f0-add8-5b47287dce0e, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5fb790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce5fbd90>], id=3e97e68b-041f-44d0-baf0-5db6fd023bf3, ip_allocation=immediate, mac_address=fa:16:3e:27:bf:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3681, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:11:50Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:11:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:11:50.855 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:11:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:11:50.855 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:11:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:11:50.855 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:11:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:51.034 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:51 np0005538515.localdomain podman[323216]: 2025-11-28 10:11:51.073768899 +0000 UTC m=+0.066635428 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:11:51 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:11:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:11:51 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:11:51 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:11:51 np0005538515.localdomain podman[323231]: 2025-11-28 10:11:51.195196957 +0000 UTC m=+0.094306163 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:11:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:51.233 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:51 np0005538515.localdomain podman[323231]: 2025-11-28 10:11:51.235688406 +0000 UTC m=+0.134797642 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:11:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:51.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:51.237 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:11:51 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:11:51 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:11:51.345 261346 INFO neutron.agent.dhcp.agent [None req-bf598181-cfc2-4864-910a-99b921982202 - - - - - -] DHCP configuration for ports {'3e97e68b-041f-44d0-baf0-5db6fd023bf3'} is completed
Nov 28 10:11:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:51.471 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:51 np0005538515.localdomain ceph-mon[301134]: pgmap v599: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 626 B/s rd, 130 KiB/s wr, 12 op/s
Nov 28 10:11:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4", "format": "json"}]: dispatch
Nov 28 10:11:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e277 e277: 6 total, 6 up, 6 in
Nov 28 10:11:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v601: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 233 KiB/s wr, 20 op/s
Nov 28 10:11:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:52.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:52.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:11:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:52.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:11:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:52.261 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:11:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:52.262 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7b81adcd-a38a-49d6-b81f-5b50e8780854, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7b81adcd-a38a-49d6-b81f-5b50e8780854, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: osdmap e277: 6 total, 6 up, 6 in
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: pgmap v601: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 895 B/s rd, 233 KiB/s wr, 20 op/s
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} v 0)
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0)
Nov 28 10:11:52 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume deauthorize, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-459400664, client_metadata.root=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5
Nov 28 10:11:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-459400664, format:json, prefix:fs subvolume evict, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:53.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:53.644 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "auth_id": "tempest-cephx-id-459400664", "format": "json"}]: dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2162333142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v602: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 858 B/s rd, 223 KiB/s wr, 19 op/s
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4_ff125526-c3a4-469c-b6b2-27c868491137", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:640bfb92-ee5b-44bb-be51-eab8b7a02ed4_ff125526-c3a4-469c-b6b2-27c868491137, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta.tmp'
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta.tmp' to config b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta'
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:640bfb92-ee5b-44bb-be51-eab8b7a02ed4_ff125526-c3a4-469c-b6b2-27c868491137, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:640bfb92-ee5b-44bb-be51-eab8b7a02ed4, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.263 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.263 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.264 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.264 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.264 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta.tmp'
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta.tmp' to config b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80/.meta'
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.299 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:640bfb92-ee5b-44bb-be51-eab8b7a02ed4, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1945970975' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.776 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: pgmap v602: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 858 B/s rd, 223 KiB/s wr, 19 op/s
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2922934843' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4_ff125526-c3a4-469c-b6b2-27c868491137", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "snap_name": "640bfb92-ee5b-44bb-be51-eab8b7a02ed4", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1945970975' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.997 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.998 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11467MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:11:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.999 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:54.999 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.064 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.065 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.285 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.833 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.834 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.850 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.895 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:11:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:55.919 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:11:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v603: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 186 KiB/s wr, 16 op/s
Nov 28 10:11:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:56.073 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:56 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/515171710' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:11:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:56.389 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:11:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:56.396 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:11:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:56.418 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:11:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:56.421 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:11:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:56.421 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "format": "json"}]: dispatch
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:3fe15641-5409-4db6-8856-5687ded3c0e8, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:3fe15641-5409-4db6-8856-5687ded3c0e8, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:56 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:56.581+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3fe15641-5409-4db6-8856-5687ded3c0e8' of type subvolume
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3fe15641-5409-4db6-8856-5687ded3c0e8' of type subvolume
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8'' moved to trashcan
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3fe15641-5409-4db6-8856-5687ded3c0e8, vol_name:cephfs) < ""
Nov 28 10:11:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: pgmap v603: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 186 KiB/s wr, 16 op/s
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/515171710' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3fe15641-5409-4db6-8856-5687ded3c0e8", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854_1a1adf00-c956-42a5-9ff9-ac2b4556603f", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7b81adcd-a38a-49d6-b81f-5b50e8780854_1a1adf00-c956-42a5-9ff9-ac2b4556603f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7b81adcd-a38a-49d6-b81f-5b50e8780854_1a1adf00-c956-42a5-9ff9-ac2b4556603f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7b81adcd-a38a-49d6-b81f-5b50e8780854, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7b81adcd-a38a-49d6-b81f-5b50e8780854, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:11:57.469+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '88c7260b-39e4-485d-ba81-9fc5fc185b80' of type subvolume
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '88c7260b-39e4-485d-ba81-9fc5fc185b80' of type subvolume
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/88c7260b-39e4-485d-ba81-9fc5fc185b80'' moved to trashcan
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:88c7260b-39e4-485d-ba81-9fc5fc185b80, vol_name:cephfs) < ""
Nov 28 10:11:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:11:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:11:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:11:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:11:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:11:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v604: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 179 KiB/s wr, 14 op/s
Nov 28 10:11:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854_1a1adf00-c956-42a5-9ff9-ac2b4556603f", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "7b81adcd-a38a-49d6-b81f-5b50e8780854", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "88c7260b-39e4-485d-ba81-9fc5fc185b80", "force": true, "format": "json"}]: dispatch
Nov 28 10:11:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:11:58.391 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:11:58Z, description=, device_id=c2d60d6f-83fc-4648-964a-020aeb44c54e, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce4928b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce492e50>], id=298c366b-f0f4-42f6-86e1-3759a03f1daf, ip_allocation=immediate, mac_address=fa:16:3e:a3:e3:6d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3695, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:11:58Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:11:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:58.653 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:58 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:11:58 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:11:58 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:11:58 np0005538515.localdomain podman[323320]: 2025-11-28 10:11:58.661646804 +0000 UTC m=+0.113473844 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:11:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:11:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:11:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:11:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:11:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:58.938 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:11:58 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:11:58.948 261346 INFO neutron.agent.dhcp.agent [None req-d62fd418-4170-4c1c-9799-adb0dffddf4b - - - - - -] DHCP configuration for ports {'298c366b-f0f4-42f6-86e1-3759a03f1daf'} is completed
Nov 28 10:11:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:11:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19253 "" "Go-http-client/1.1"
Nov 28 10:11:59 np0005538515.localdomain ceph-mon[301134]: pgmap v604: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 179 KiB/s wr, 14 op/s
Nov 28 10:11:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:11:59.423 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:11:59 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:11:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:11:59 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:11:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v605: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 179 KiB/s wr, 14 op/s
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1145513864' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2356686647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:12:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta.tmp'
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta.tmp' to config b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta'
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "format": "json"}]: dispatch
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e278 e278: 6 total, 6 up, 6 in
Nov 28 10:12:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:01.107 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: pgmap v605: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 179 KiB/s wr, 14 op/s
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f", "format": "json"}]: dispatch
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: osdmap e278: 6 total, 6 up, 6 in
Nov 28 10:12:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:01.763 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:01 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:12:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v607: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 13 op/s
Nov 28 10:12:01 np0005538515.localdomain podman[323342]: 2025-11-28 10:12:01.975555313 +0000 UTC m=+0.083278122 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 28 10:12:02 np0005538515.localdomain podman[323342]: 2025-11-28 10:12:02.011863464 +0000 UTC m=+0.119586273 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6)
Nov 28 10:12:02 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:12:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "format": "json"}]: dispatch
Nov 28 10:12:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:12:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:02 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:02 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:02 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:02 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:03 np0005538515.localdomain ceph-mon[301134]: pgmap v607: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 13 op/s
Nov 28 10:12:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:03.692 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750", "format": "json"}]: dispatch
Nov 28 10:12:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:15a14553-6d16-4d08-bf64-a6b8c8a6c750, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:15a14553-6d16-4d08-bf64-a6b8c8a6c750, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v608: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 13 op/s
Nov 28 10:12:04 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f_69807d00-efbf-4837-b2f8-58c7e5436c8f", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f_69807d00-efbf-4837-b2f8-58c7e5436c8f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f_69807d00-efbf-4837-b2f8-58c7e5436c8f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:04 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:12:05
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['.mgr', 'volumes', 'vms', 'manila_metadata', 'backups', 'images', 'manila_data']
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:12:05 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750", "format": "json"}]: dispatch
Nov 28 10:12:05 np0005538515.localdomain ceph-mon[301134]: pgmap v608: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 170 KiB/s wr, 13 op/s
Nov 28 10:12:05 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e279 e279: 6 total, 6 up, 6 in
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc91c71b50>)]
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:12:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v610: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 89 KiB/s wr, 7 op/s
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:12:06 np0005538515.localdomain systemd-journald[48427]: Data hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Nov 28 10:12:06 np0005538515.localdomain systemd-journald[48427]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 10:12:06 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.635783082077052e-06 of space, bias 1.0, pg target 0.0003255208333333333 quantized to 32 (current 32)
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.00154799605667225 of space, bias 4.0, pg target 1.232204861111111 quantized to 16 (current 16)
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:12:06 np0005538515.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:06.162 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:06 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f_69807d00-efbf-4837-b2f8-58c7e5436c8f", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "11d8e4ee-62d3-434d-8e1b-ad29c40b9e5f", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: osdmap e279: 6 total, 6 up, 6 in
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e280 e280: 6 total, 6 up, 6 in
Nov 28 10:12:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:07 np0005538515.localdomain ceph-mon[301134]: pgmap v610: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 89 KiB/s wr, 7 op/s
Nov 28 10:12:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:07 np0005538515.localdomain ceph-mon[301134]: osdmap e280: 6 total, 6 up, 6 in
Nov 28 10:12:07 np0005538515.localdomain ceph-mon[301134]: mgrmap e52: np0005538515.yfkzhl(active, since 14m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:12:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v612: 177 pgs: 177 active+clean; 235 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.2 MiB/s wr, 57 op/s
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750_fac779c6-92c9-42a2-b987-c4a675360397", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:15a14553-6d16-4d08-bf64-a6b8c8a6c750_fac779c6-92c9-42a2-b987-c4a675360397, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta.tmp'
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta.tmp' to config b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta'
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:15a14553-6d16-4d08-bf64-a6b8c8a6c750_fac779c6-92c9-42a2-b987-c4a675360397, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:15a14553-6d16-4d08-bf64-a6b8c8a6c750, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta.tmp'
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta.tmp' to config b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93/.meta'
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:15a14553-6d16-4d08-bf64-a6b8c8a6c750, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa", "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:fff24b7d-2f6c-4225-9289-9551d74f54fa, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:08 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:fff24b7d-2f6c-4225-9289-9551d74f54fa, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:08.746 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:08 np0005538515.localdomain ceph-mon[301134]: pgmap v612: 177 pgs: 177 active+clean; 235 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.2 MiB/s wr, 57 op/s
Nov 28 10:12:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750_fac779c6-92c9-42a2-b987-c4a675360397", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "snap_name": "15a14553-6d16-4d08-bf64-a6b8c8a6c750", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa", "format": "json"}]: dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:09 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v613: 177 pgs: 177 active+clean; 235 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 2.7 MiB/s wr, 41 op/s
Nov 28 10:12:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e281 e281: 6 total, 6 up, 6 in
Nov 28 10:12:10 np0005538515.localdomain ceph-mon[301134]: pgmap v613: 177 pgs: 177 active+clean; 235 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 2.7 MiB/s wr, 41 op/s
Nov 28 10:12:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:11.225 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7c140f95-3467-4116-9762-d08cfc663c93", "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7c140f95-3467-4116-9762-d08cfc663c93, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7c140f95-3467-4116-9762-d08cfc663c93, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:12:11.561+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7c140f95-3467-4116-9762-d08cfc663c93' of type subvolume
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7c140f95-3467-4116-9762-d08cfc663c93' of type subvolume
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7c140f95-3467-4116-9762-d08cfc663c93'' moved to trashcan
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7c140f95-3467-4116-9762-d08cfc663c93, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e282 e282: 6 total, 6 up, 6 in
Nov 28 10:12:11 np0005538515.localdomain ceph-mon[301134]: osdmap e281: 6 total, 6 up, 6 in
Nov 28 10:12:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7c140f95-3467-4116-9762-d08cfc663c93", "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7c140f95-3467-4116-9762-d08cfc663c93", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538515.localdomain ceph-mon[301134]: osdmap e282: 6 total, 6 up, 6 in
Nov 28 10:12:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa_ead2690c-f6f8-4b05-b16e-a6295cb33d61", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fff24b7d-2f6c-4225-9289-9551d74f54fa_ead2690c-f6f8-4b05-b16e-a6295cb33d61, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fff24b7d-2f6c-4225-9289-9551d74f54fa_ead2690c-f6f8-4b05-b16e-a6295cb33d61, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fff24b7d-2f6c-4225-9289-9551d74f54fa, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v616: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 3.7 MiB/s wr, 92 op/s
Nov 28 10:12:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fff24b7d-2f6c-4225-9289-9551d74f54fa, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:12.133 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:12 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:12:12 np0005538515.localdomain podman[323379]: 2025-11-28 10:12:12.156737544 +0000 UTC m=+0.072563440 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:12:12 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:12:12 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa_ead2690c-f6f8-4b05-b16e-a6295cb33d61", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fff24b7d-2f6c-4225-9289-9551d74f54fa", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: pgmap v616: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 3.7 MiB/s wr, 92 op/s
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:12:12 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:12 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1973228268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1973228268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:13.789 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1973228268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1973228268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:12:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v617: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 226 KiB/s wr, 42 op/s
Nov 28 10:12:14 np0005538515.localdomain ceph-mon[301134]: pgmap v617: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 226 KiB/s wr, 42 op/s
Nov 28 10:12:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:12:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:12:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:12:14 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:12:15 np0005538515.localdomain podman[323404]: 2025-11-28 10:12:15.012481521 +0000 UTC m=+0.100466573 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:12:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1", "format": "json"}]: dispatch
Nov 28 10:12:15 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:fde565fd-d88b-4da4-8499-bcfdb95820a1, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:15 np0005538515.localdomain podman[323404]: 2025-11-28 10:12:15.018663241 +0000 UTC m=+0.106648313 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:12:15 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:12:15 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:fde565fd-d88b-4da4-8499-bcfdb95820a1, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:15 np0005538515.localdomain podman[323401]: 2025-11-28 10:12:15.071179873 +0000 UTC m=+0.169741461 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 10:12:15 np0005538515.localdomain podman[323403]: 2025-11-28 10:12:15.125974274 +0000 UTC m=+0.218269799 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:12:15 np0005538515.localdomain podman[323402]: 2025-11-28 10:12:15.179412753 +0000 UTC m=+0.274022799 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:12:15 np0005538515.localdomain podman[323401]: 2025-11-28 10:12:15.207825021 +0000 UTC m=+0.306386649 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 28 10:12:15 np0005538515.localdomain podman[323402]: 2025-11-28 10:12:15.216261722 +0000 UTC m=+0.310871828 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:12:15 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:12:15 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:12:15 np0005538515.localdomain podman[323403]: 2025-11-28 10:12:15.26157896 +0000 UTC m=+0.353874525 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 10:12:15 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:12:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:15.466 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:15 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:12:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:12:15 np0005538515.localdomain podman[323502]: 2025-11-28 10:12:15.501715134 +0000 UTC m=+0.060281743 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:12:15 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:12:15 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1", "format": "json"}]: dispatch
Nov 28 10:12:15 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e283 e283: 6 total, 6 up, 6 in
Nov 28 10:12:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v619: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 141 KiB/s wr, 36 op/s
Nov 28 10:12:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:16 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:16 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:16.229 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e284 e284: 6 total, 6 up, 6 in
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: osdmap e283: 6 total, 6 up, 6 in
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: pgmap v619: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 141 KiB/s wr, 36 op/s
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:16 np0005538515.localdomain ceph-mon[301134]: osdmap e284: 6 total, 6 up, 6 in
Nov 28 10:12:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:12:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v621: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 831 B/s rd, 130 KiB/s wr, 11 op/s
Nov 28 10:12:17 np0005538515.localdomain podman[323520]: 2025-11-28 10:12:17.982182814 +0000 UTC m=+0.088296077 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:12:17 np0005538515.localdomain podman[323520]: 2025-11-28 10:12:17.992576465 +0000 UTC m=+0.098689758 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:12:18 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:12:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:18.826 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:19 np0005538515.localdomain ceph-mon[301134]: pgmap v621: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 831 B/s rd, 130 KiB/s wr, 11 op/s
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:19 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:19 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:12:19 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1_3ca8ffc5-e61a-4a5f-afb0-e3f73884eb70", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fde565fd-d88b-4da4-8499-bcfdb95820a1_3ca8ffc5-e61a-4a5f-afb0-e3f73884eb70, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fde565fd-d88b-4da4-8499-bcfdb95820a1_3ca8ffc5-e61a-4a5f-afb0-e3f73884eb70, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fde565fd-d88b-4da4-8499-bcfdb95820a1, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:fde565fd-d88b-4da4-8499-bcfdb95820a1, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v622: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 100 KiB/s wr, 8 op/s
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1_3ca8ffc5-e61a-4a5f-afb0-e3f73884eb70", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:20 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "fde565fd-d88b-4da4-8499-bcfdb95820a1", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:21.266 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:21 np0005538515.localdomain ceph-mon[301134]: pgmap v622: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 100 KiB/s wr, 8 op/s
Nov 28 10:12:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e285 e285: 6 total, 6 up, 6 in
Nov 28 10:12:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:12:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v624: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 224 KiB/s wr, 18 op/s
Nov 28 10:12:21 np0005538515.localdomain podman[323544]: 2025-11-28 10:12:21.975243269 +0000 UTC m=+0.083619452 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:12:22 np0005538515.localdomain podman[323544]: 2025-11-28 10:12:22.016553565 +0000 UTC m=+0.124929728 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:12:22 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:12:22 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:12:22.540 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:22Z, description=, device_id=a2e46648-c204-4aa4-852a-22559e830378, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3aabb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3aab80>], id=07485490-1606-4c57-81f4-a8dc7bf2ee54, ip_allocation=immediate, mac_address=fa:16:3e:fa:9a:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3765, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:12:22Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:12:22 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:12:22 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:12:22 np0005538515.localdomain podman[323580]: 2025-11-28 10:12:22.762192703 +0000 UTC m=+0.063382677 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:12:22 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: osdmap e285: 6 total, 6 up, 6 in
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: pgmap v624: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 224 KiB/s wr, 18 op/s
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e286 e286: 6 total, 6 up, 6 in
Nov 28 10:12:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:22 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:22 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:22 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:23 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:12:23.022 261346 INFO neutron.agent.dhcp.agent [None req-f72fcaf0-cdef-4bff-a079-719b622765fd - - - - - -] DHCP configuration for ports {'07485490-1606-4c57-81f4-a8dc7bf2ee54'} is completed
Nov 28 10:12:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:23.353 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3", "format": "json"}]: dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:91be3195-e4e7-4f0e-9446-564a14aa6ee3, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:91be3195-e4e7-4f0e-9446-564a14aa6ee3, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:23.828 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: osdmap e286: 6 total, 6 up, 6 in
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3", "format": "json"}]: dispatch
Nov 28 10:12:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v626: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 286 B/s rd, 77 KiB/s wr, 5 op/s
Nov 28 10:12:24 np0005538515.localdomain ceph-mon[301134]: pgmap v626: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 286 B/s rd, 77 KiB/s wr, 5 op/s
Nov 28 10:12:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v627: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 69 KiB/s wr, 5 op/s
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:12:26 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:12:26 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:26.312 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:26.652 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: pgmap v627: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 69 KiB/s wr, 5 op/s
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:12:27 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:12:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:12:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a152cb30-8073-4c75-8e60-d299f2f23f89, vol_name:cephfs) < ""
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/a152cb30-8073-4c75-8e60-d299f2f23f89/.meta.tmp'
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a152cb30-8073-4c75-8e60-d299f2f23f89/.meta.tmp' to config b'/volumes/_nogroup/a152cb30-8073-4c75-8e60-d299f2f23f89/.meta'
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a152cb30-8073-4c75-8e60-d299f2f23f89, vol_name:cephfs) < ""
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "format": "json"}]: dispatch
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a152cb30-8073-4c75-8e60-d299f2f23f89, vol_name:cephfs) < ""
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a152cb30-8073-4c75-8e60-d299f2f23f89, vol_name:cephfs) < ""
Nov 28 10:12:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v628: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 151 KiB/s wr, 10 op/s
Nov 28 10:12:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3_e616749c-df0b-458b-9f06-7163c80539a0", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:91be3195-e4e7-4f0e-9446-564a14aa6ee3_e616749c-df0b-458b-9f06-7163c80539a0, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:91be3195-e4e7-4f0e-9446-564a14aa6ee3_e616749c-df0b-458b-9f06-7163c80539a0, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:91be3195-e4e7-4f0e-9446-564a14aa6ee3, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:91be3195-e4e7-4f0e-9446-564a14aa6ee3, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:28.854 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:12:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:12:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:12:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:12:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:12:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19241 "" "Go-http-client/1.1"
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: pgmap v628: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 151 KiB/s wr, 10 op/s
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3_e616749c-df0b-458b-9f06-7163c80539a0", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "91be3195-e4e7-4f0e-9446-564a14aa6ee3", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v629: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 80 KiB/s wr, 5 op/s
Nov 28 10:12:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:30 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:535c32b0-23e4-4a2c-bbae-552340590760, vol_name:cephfs) < ""
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/535c32b0-23e4-4a2c-bbae-552340590760/.meta.tmp'
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/535c32b0-23e4-4a2c-bbae-552340590760/.meta.tmp' to config b'/volumes/_nogroup/535c32b0-23e4-4a2c-bbae-552340590760/.meta'
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:535c32b0-23e4-4a2c-bbae-552340590760, vol_name:cephfs) < ""
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "format": "json"}]: dispatch
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:535c32b0-23e4-4a2c-bbae-552340590760, vol_name:cephfs) < ""
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:535c32b0-23e4-4a2c-bbae-552340590760, vol_name:cephfs) < ""
Nov 28 10:12:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e287 e287: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:31.354 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:31 np0005538515.localdomain ceph-mon[301134]: pgmap v629: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 80 KiB/s wr, 5 op/s
Nov 28 10:12:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:31 np0005538515.localdomain ceph-mon[301134]: osdmap e287: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e288 e288: 6 total, 6 up, 6 in
Nov 28 10:12:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v632: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 178 KiB/s wr, 12 op/s
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "format": "json"}]: dispatch
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: osdmap e288: 6 total, 6 up, 6 in
Nov 28 10:12:32 np0005538515.localdomain sudo[323602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:12:32 np0005538515.localdomain sudo[323602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:12:32 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:12:32 np0005538515.localdomain sudo[323602]: pam_unix(sudo:session): session closed for user root
Nov 28 10:12:32 np0005538515.localdomain sudo[323621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:12:32 np0005538515.localdomain sudo[323621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:12:32 np0005538515.localdomain podman[323620]: 2025-11-28 10:12:32.553355274 +0000 UTC m=+0.084215431 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter)
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:32 np0005538515.localdomain podman[323620]: 2025-11-28 10:12:32.566237892 +0000 UTC m=+0.097098059 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Nov 28 10:12:32 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:12:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:33 np0005538515.localdomain sudo[323621]: pam_unix(sudo:session): session closed for user root
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9_0d92dab8-838f-4c0e-9a76-ed8d13f07bb1", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f5b4c96d-e791-4ea7-b052-7e31c47893d9_0d92dab8-838f-4c0e-9a76-ed8d13f07bb1, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f5b4c96d-e791-4ea7-b052-7e31c47893d9_0d92dab8-838f-4c0e-9a76-ed8d13f07bb1, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f5b4c96d-e791-4ea7-b052-7e31c47893d9, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp'
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta.tmp' to config b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee/.meta'
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 0af48f44-05b6-4c2e-a412-8b8a1114462d (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 0af48f44-05b6-4c2e-a412-8b8a1114462d (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 0af48f44-05b6-4c2e-a412-8b8a1114462d (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f5b4c96d-e791-4ea7-b052-7e31c47893d9, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: pgmap v632: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 178 KiB/s wr, 12 op/s
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:12:33 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:12:33 np0005538515.localdomain sudo[323691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:12:33 np0005538515.localdomain sudo[323691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:12:33 np0005538515.localdomain sudo[323691]: pam_unix(sudo:session): session closed for user root
Nov 28 10:12:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:33.858 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v633: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 178 KiB/s wr, 12 op/s
Nov 28 10:12:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9_0d92dab8-838f-4c0e-9a76-ed8d13f07bb1", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "snap_name": "f5b4c96d-e791-4ea7-b052-7e31c47893d9", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "535c32b0-23e4-4a2c-bbae-552340590760", "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:535c32b0-23e4-4a2c-bbae-552340590760, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:535c32b0-23e4-4a2c-bbae-552340590760, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:34 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:12:34.567+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '535c32b0-23e4-4a2c-bbae-552340590760' of type subvolume
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '535c32b0-23e4-4a2c-bbae-552340590760' of type subvolume
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:535c32b0-23e4-4a2c-bbae-552340590760, vol_name:cephfs) < ""
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/535c32b0-23e4-4a2c-bbae-552340590760'' moved to trashcan
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:12:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:535c32b0-23e4-4a2c-bbae-552340590760, vol_name:cephfs) < ""
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: pgmap v633: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 178 KiB/s wr, 12 op/s
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "535c32b0-23e4-4a2c-bbae-552340590760", "format": "json"}]: dispatch
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "535c32b0-23e4-4a2c-bbae-552340590760", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e289 e289: 6 total, 6 up, 6 in
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:35 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:12:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:35.957 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:35 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:12:35.958 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:12:35 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:12:35.959 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:12:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v635: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 127 KiB/s wr, 9 op/s
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "format": "json"}]: dispatch
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:c66addb9-39a7-4478-8dc0-78ac198152ee, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:36.397 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:c66addb9-39a7-4478-8dc0-78ac198152ee, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:12:36.402+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c66addb9-39a7-4478-8dc0-78ac198152ee' of type subvolume
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c66addb9-39a7-4478-8dc0-78ac198152ee' of type subvolume
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/c66addb9-39a7-4478-8dc0-78ac198152ee'' moved to trashcan
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:12:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c66addb9-39a7-4478-8dc0-78ac198152ee, vol_name:cephfs) < ""
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: osdmap e289: 6 total, 6 up, 6 in
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:12:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:37 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:12:37 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:12:37 np0005538515.localdomain podman[323726]: 2025-11-28 10:12:37.399192403 +0000 UTC m=+0.064026907 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:12:37 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:12:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538515.localdomain ceph-mon[301134]: pgmap v635: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 127 KiB/s wr, 9 op/s
Nov 28 10:12:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c66addb9-39a7-4478-8dc0-78ac198152ee", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:37.835 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a152cb30-8073-4c75-8e60-d299f2f23f89, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a152cb30-8073-4c75-8e60-d299f2f23f89, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:37 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:12:37.896+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a152cb30-8073-4c75-8e60-d299f2f23f89' of type subvolume
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a152cb30-8073-4c75-8e60-d299f2f23f89' of type subvolume
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a152cb30-8073-4c75-8e60-d299f2f23f89, vol_name:cephfs) < ""
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/a152cb30-8073-4c75-8e60-d299f2f23f89'' moved to trashcan
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a152cb30-8073-4c75-8e60-d299f2f23f89, vol_name:cephfs) < ""
Nov 28 10:12:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v636: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 280 KiB/s wr, 18 op/s
Nov 28 10:12:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:38.893 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "format": "json"}]: dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a152cb30-8073-4c75-8e60-d299f2f23f89", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: pgmap v636: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 280 KiB/s wr, 18 op/s
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v637: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 502 B/s rd, 142 KiB/s wr, 8 op/s
Nov 28 10:12:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:40 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5", "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_create(format:json, group_name:a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Nov 28 10:12:41 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_create(format:json, group_name:a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Nov 28 10:12:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:41.453 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:41 np0005538515.localdomain ceph-mon[301134]: pgmap v637: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 502 B/s rd, 142 KiB/s wr, 8 op/s
Nov 28 10:12:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 e290: 6 total, 6 up, 6 in
Nov 28 10:12:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:41 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:12:41.962 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:12:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v639: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 228 KiB/s wr, 13 op/s
Nov 28 10:12:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5", "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: osdmap e290: 6 total, 6 up, 6 in
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:43 np0005538515.localdomain ceph-mon[301134]: pgmap v639: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 228 KiB/s wr, 13 op/s
Nov 28 10:12:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:43.936 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v640: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 963 B/s rd, 215 KiB/s wr, 12 op/s
Nov 28 10:12:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Nov 28 10:12:44 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Nov 28 10:12:45 np0005538515.localdomain ceph-mon[301134]: pgmap v640: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 963 B/s rd, 215 KiB/s wr, 12 op/s
Nov 28 10:12:45 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "a1a9bb3f-1d01-4004-8dd6-66f4f23a71c5", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Nov 28 10:12:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Nov 28 10:12:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:12:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:12:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:12:45 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:12:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v641: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 183 KiB/s wr, 10 op/s
Nov 28 10:12:45 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:12:45.984 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:45Z, description=, device_id=4b3fe2b0-a7cf-43ab-948e-4143df334636, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3166d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3167f0>], id=78d57a6d-83f8-4c5d-bd32-76f006a72d19, ip_allocation=immediate, mac_address=fa:16:3e:81:fc:b6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3793, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:12:45Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:12:45 np0005538515.localdomain systemd[1]: tmp-crun.BjPi7Y.mount: Deactivated successfully.
Nov 28 10:12:46 np0005538515.localdomain podman[323750]: 2025-11-28 10:12:46.001860665 +0000 UTC m=+0.101678309 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 10:12:46 np0005538515.localdomain podman[323751]: 2025-11-28 10:12:46.043437239 +0000 UTC m=+0.139569360 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:12:46 np0005538515.localdomain podman[323749]: 2025-11-28 10:12:46.098273451 +0000 UTC m=+0.199389136 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 10:12:46 np0005538515.localdomain podman[323752]: 2025-11-28 10:12:46.145510489 +0000 UTC m=+0.237180872 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:12:46 np0005538515.localdomain podman[323752]: 2025-11-28 10:12:46.158602074 +0000 UTC m=+0.250272417 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:12:46 np0005538515.localdomain podman[323749]: 2025-11-28 10:12:46.165806297 +0000 UTC m=+0.266921962 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 10:12:46 np0005538515.localdomain podman[323750]: 2025-11-28 10:12:46.175040841 +0000 UTC m=+0.274858525 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:12:46 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:12:46 np0005538515.localdomain podman[323751]: 2025-11-28 10:12:46.176142225 +0000 UTC m=+0.272274346 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:12:46 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:12:46 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:12:46 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:12:46 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:12:46 np0005538515.localdomain podman[323848]: 2025-11-28 10:12:46.29550357 +0000 UTC m=+0.122318197 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:12:46 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:12:46 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:12:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:46.492 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:46 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:12:46.491 261346 INFO neutron.agent.dhcp.agent [None req-2470b25d-957b-48cf-a593-e8ac202053bf - - - - - -] DHCP configuration for ports {'78d57a6d-83f8-4c5d-bd32-76f006a72d19'} is completed
Nov 28 10:12:46 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Nov 28 10:12:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Nov 28 10:12:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Nov 28 10:12:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:47.258 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "32cdd268-82b3-4d40-9663-e0bea21b28c3", "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_create(format:json, group_name:32cdd268-82b3-4d40-9663-e0bea21b28c3, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Nov 28 10:12:47 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_create(format:json, group_name:32cdd268-82b3-4d40-9663-e0bea21b28c3, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Nov 28 10:12:47 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice", "format": "json"}]: dispatch
Nov 28 10:12:47 np0005538515.localdomain ceph-mon[301134]: pgmap v641: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 183 KiB/s wr, 10 op/s
Nov 28 10:12:47 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v642: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 140 KiB/s wr, 8 op/s
Nov 28 10:12:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "32cdd268-82b3-4d40-9663-e0bea21b28c3", "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:12:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:48.968 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:49 np0005538515.localdomain podman[323869]: 2025-11-28 10:12:49.001822253 +0000 UTC m=+0.105842678 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:12:49 np0005538515.localdomain podman[323869]: 2025-11-28 10:12:49.011492521 +0000 UTC m=+0.115512886 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:12:49 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:12:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:49.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:49.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:49 np0005538515.localdomain ceph-mon[301134]: pgmap v642: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 140 KiB/s wr, 8 op/s
Nov 28 10:12:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:49 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:49 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:49 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:49 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:49 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v643: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 140 KiB/s wr, 8 op/s
Nov 28 10:12:50 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:12:50.146 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:49Z, description=, device_id=c6a3c308-ab33-40a1-9933-91a047698d13, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce3965b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce396fa0>], id=aeec12f0-7000-43ed-81b1-6c08563e6d70, ip_allocation=immediate, mac_address=fa:16:3e:eb:bc:41, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3806, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:12:50Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:12:50 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:12:50 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:12:50 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:12:50 np0005538515.localdomain podman[323909]: 2025-11-28 10:12:50.462300238 +0000 UTC m=+0.049313663 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:12:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "32cdd268-82b3-4d40-9663-e0bea21b28c3", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:32cdd268-82b3-4d40-9663-e0bea21b28c3, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Nov 28 10:12:50 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:12:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:50 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:32cdd268-82b3-4d40-9663-e0bea21b28c3, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Nov 28 10:12:50 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:12:50.718 261346 INFO neutron.agent.dhcp.agent [None req-6f2fcf49-eb70-479b-bb2d-79344f63fe33 - - - - - -] DHCP configuration for ports {'aeec12f0-7000-43ed-81b1-6c08563e6d70'} is completed
Nov 28 10:12:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:12:50.855 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:12:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:12:50.858 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:12:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:12:50.858 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:12:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:51.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:51.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:2f14e343-1793-4cb4-b8c5-3069f088f109, vol_name:cephfs) < ""
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/2f14e343-1793-4cb4-b8c5-3069f088f109/.meta.tmp'
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/2f14e343-1793-4cb4-b8c5-3069f088f109/.meta.tmp' to config b'/volumes/_nogroup/2f14e343-1793-4cb4-b8c5-3069f088f109/.meta'
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:2f14e343-1793-4cb4-b8c5-3069f088f109, vol_name:cephfs) < ""
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "format": "json"}]: dispatch
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:2f14e343-1793-4cb4-b8c5-3069f088f109, vol_name:cephfs) < ""
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:2f14e343-1793-4cb4-b8c5-3069f088f109, vol_name:cephfs) < ""
Nov 28 10:12:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:51.477 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:51.494 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:51 np0005538515.localdomain ceph-mon[301134]: pgmap v643: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 140 KiB/s wr, 8 op/s
Nov 28 10:12:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "32cdd268-82b3-4d40-9663-e0bea21b28c3", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:51.775 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:51 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v644: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 302 B/s rd, 130 KiB/s wr, 8 op/s
Nov 28 10:12:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:52.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:52.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:12:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:52.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:12:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:52.449 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:12:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:52 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "format": "json"}]: dispatch
Nov 28 10:12:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:12:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:52 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:52 np0005538515.localdomain podman[323929]: 2025-11-28 10:12:52.979885445 +0000 UTC m=+0.087264255 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:53 np0005538515.localdomain podman[323929]: 2025-11-28 10:12:53.018557679 +0000 UTC m=+0.125936439 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 10:12:53 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:53.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:53.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: pgmap v644: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 302 B/s rd, 130 KiB/s wr, 8 op/s
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:12:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2706457592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v645: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 110 KiB/s wr, 6 op/s
Nov 28 10:12:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:54.011 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:54.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "format": "json"}]: dispatch
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:2f14e343-1793-4cb4-b8c5-3069f088f109, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:2f14e343-1793-4cb4-b8c5-3069f088f109, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:12:54 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:12:54.667+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '2f14e343-1793-4cb4-b8c5-3069f088f109' of type subvolume
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '2f14e343-1793-4cb4-b8c5-3069f088f109' of type subvolume
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:2f14e343-1793-4cb4-b8c5-3069f088f109, vol_name:cephfs) < ""
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/2f14e343-1793-4cb4-b8c5-3069f088f109'' moved to trashcan
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:12:54 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:2f14e343-1793-4cb4-b8c5-3069f088f109, vol_name:cephfs) < ""
Nov 28 10:12:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2576893307' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.274 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.274 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.274 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.275 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.275 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:12:55 np0005538515.localdomain ceph-mon[301134]: pgmap v645: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 110 KiB/s wr, 6 op/s
Nov 28 10:12:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "format": "json"}]: dispatch
Nov 28 10:12:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2f14e343-1793-4cb4-b8c5-3069f088f109", "force": true, "format": "json"}]: dispatch
Nov 28 10:12:55 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:12:55 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1162585651' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.779 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.971 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.973 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11448MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.973 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:12:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:55.973 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:12:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v646: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 110 KiB/s wr, 6 op/s
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.105 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.106 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.121 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:12:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice_bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.536 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/44064503' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.586 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.592 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.606 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.609 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.609 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1162585651' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: pgmap v646: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 110 KiB/s wr, 6 op/s
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/44064503' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:12:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:12:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:56.861 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:12:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:12:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:12:57 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v647: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 180 KiB/s wr, 11 op/s
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:028817fc-1877-4dba-8e2f-8daf5bd38000, vol_name:cephfs) < ""
Nov 28 10:12:58 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/028817fc-1877-4dba-8e2f-8daf5bd38000/.meta.tmp'
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/028817fc-1877-4dba-8e2f-8daf5bd38000/.meta.tmp' to config b'/volumes/_nogroup/028817fc-1877-4dba-8e2f-8daf5bd38000/.meta'
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:028817fc-1877-4dba-8e2f-8daf5bd38000, vol_name:cephfs) < ""
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "format": "json"}]: dispatch
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:028817fc-1877-4dba-8e2f-8daf5bd38000, vol_name:cephfs) < ""
Nov 28 10:12:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:028817fc-1877-4dba-8e2f-8daf5bd38000, vol_name:cephfs) < ""
Nov 28 10:12:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:12:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:12:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:12:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:12:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:12:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19255 "" "Go-http-client/1.1"
Nov 28 10:12:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:59.042 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:12:59 np0005538515.localdomain ceph-mon[301134]: pgmap v647: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 180 KiB/s wr, 11 op/s
Nov 28 10:12:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:12:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "format": "json"}]: dispatch
Nov 28 10:12:59 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:12:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:12:59.611 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:12:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:12:59 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:12:59 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v648: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 119 KiB/s wr, 7 op/s
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 10:13:00 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 10:13:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:13:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:13:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:13:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:01 np0005538515.localdomain systemd[1]: tmp-crun.0L4EHz.mount: Deactivated successfully.
Nov 28 10:13:01 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:13:01 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:13:01 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:13:01 np0005538515.localdomain podman[324013]: 2025-11-28 10:13:01.068857929 +0000 UTC m=+0.062113848 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: pgmap v648: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 119 KiB/s wr, 7 op/s
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice_bob", "format": "json"}]: dispatch
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/713742877' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3297457012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:01.136 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "format": "json"}]: dispatch
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:028817fc-1877-4dba-8e2f-8daf5bd38000, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:028817fc-1877-4dba-8e2f-8daf5bd38000, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:13:01 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:13:01.339+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '028817fc-1877-4dba-8e2f-8daf5bd38000' of type subvolume
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '028817fc-1877-4dba-8e2f-8daf5bd38000' of type subvolume
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:028817fc-1877-4dba-8e2f-8daf5bd38000, vol_name:cephfs) < ""
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/028817fc-1877-4dba-8e2f-8daf5bd38000'' moved to trashcan
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:028817fc-1877-4dba-8e2f-8daf5bd38000, vol_name:cephfs) < ""
Nov 28 10:13:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:01.564 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:01 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v649: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 184 KiB/s wr, 10 op/s
Nov 28 10:13:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "format": "json"}]: dispatch
Nov 28 10:13:02 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "028817fc-1877-4dba-8e2f-8daf5bd38000", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:02 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:13:02 np0005538515.localdomain podman[324034]: 2025-11-28 10:13:02.971736191 +0000 UTC m=+0.081611240 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:13:02 np0005538515.localdomain podman[324034]: 2025-11-28 10:13:02.987509598 +0000 UTC m=+0.097384637 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 28 10:13:03 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:13:03 np0005538515.localdomain ceph-mon[301134]: pgmap v649: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 184 KiB/s wr, 10 op/s
Nov 28 10:13:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:13:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:03 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:13:03 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:13:03.290 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:13:03Z, description=, device_id=c18f45d1-0864-4b21-9d36-6aae15733137, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce49b970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce49bbe0>], id=f47899ca-bb53-42d6-a1d1-6396c431c982, ip_allocation=immediate, mac_address=fa:16:3e:82:5d:ad, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3819, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:13:03Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:13:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:13:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:03 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 3 addresses
Nov 28 10:13:03 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:13:03 np0005538515.localdomain podman[324071]: 2025-11-28 10:13:03.547137293 +0000 UTC m=+0.058132355 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:13:03 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:13:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v650: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 135 KiB/s wr, 7 op/s
Nov 28 10:13:04 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:13:04.031 261346 INFO neutron.agent.dhcp.agent [None req-f226837e-eb5c-487b-9d6b-2744578ba75a - - - - - -] DHCP configuration for ports {'f47899ca-bb53-42d6-a1d1-6396c431c982'} is completed
Nov 28 10:13:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:04.048 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:04 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:04 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:04 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:04 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:04 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:04.378 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:05 np0005538515.localdomain ceph-mon[301134]: pgmap v650: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 135 KiB/s wr, 7 op/s
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:13:05
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['images', 'vms', '.mgr', 'backups', 'manila_metadata', 'manila_data', 'volumes']
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:13:05 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v651: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 135 KiB/s wr, 7 op/s
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.0905220547180346e-06 of space, bias 1.0, pg target 0.00021701388888888888 quantized to 32 (current 32)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002151327383445003 of space, bias 4.0, pg target 1.7124565972222223 quantized to 16 (current 16)
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:13:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:13:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:06.603 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: pgmap v651: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 135 KiB/s wr, 7 op/s
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:07 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:13:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v652: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 167 KiB/s wr, 10 op/s
Nov 28 10:13:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:09.083 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:09 np0005538515.localdomain ceph-mon[301134]: pgmap v652: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 167 KiB/s wr, 10 op/s
Nov 28 10:13:09 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v653: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 96 KiB/s wr, 5 op/s
Nov 28 10:13:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:13:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:11 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID alice bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:11 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: pgmap v653: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 96 KiB/s wr, 5 op/s
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:11.638 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:11 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v654: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 159 KiB/s wr, 8 op/s
Nov 28 10:13:12 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "r", "format": "json"}]: dispatch
Nov 28 10:13:13 np0005538515.localdomain ceph-mon[301134]: pgmap v654: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 159 KiB/s wr, 8 op/s
Nov 28 10:13:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/236213188' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:13:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/236213188' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:13:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v655: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 94 KiB/s wr, 6 op/s
Nov 28 10:13:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:14.118 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:15 np0005538515.localdomain ceph-mon[301134]: pgmap v655: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 94 KiB/s wr, 6 op/s
Nov 28 10:13:15 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v656: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 94 KiB/s wr, 6 op/s
Nov 28 10:13:16 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:16.676 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:13:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:13:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:13:16 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:13:16 np0005538515.localdomain podman[324093]: 2025-11-28 10:13:16.981115285 +0000 UTC m=+0.082782436 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:13:17 np0005538515.localdomain podman[324093]: 2025-11-28 10:13:17.040008184 +0000 UTC m=+0.141675315 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 10:13:17 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:13:17 np0005538515.localdomain podman[324092]: 2025-11-28 10:13:17.054685477 +0000 UTC m=+0.159848685 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:13:17 np0005538515.localdomain podman[324099]: 2025-11-28 10:13:17.063306203 +0000 UTC m=+0.155163441 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:13:17 np0005538515.localdomain podman[324099]: 2025-11-28 10:13:17.071531497 +0000 UTC m=+0.163388775 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:13:17 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:13:17 np0005538515.localdomain podman[324092]: 2025-11-28 10:13:17.086904841 +0000 UTC m=+0.192067989 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:13:17 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:13:17 np0005538515.localdomain podman[324094]: 2025-11-28 10:13:17.146096058 +0000 UTC m=+0.243807957 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 10:13:17 np0005538515.localdomain podman[324094]: 2025-11-28 10:13:17.150247287 +0000 UTC m=+0.247959166 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 10:13:17 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: pgmap v656: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 94 KiB/s wr, 6 op/s
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Nov 28 10:13:17 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Nov 28 10:13:17 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v657: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 96 KiB/s wr, 6 op/s
Nov 28 10:13:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "alice bob", "format": "json"}]: dispatch
Nov 28 10:13:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:18.752 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:19.160 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:19 np0005538515.localdomain ceph-mon[301134]: pgmap v657: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 96 KiB/s wr, 6 op/s
Nov 28 10:13:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:13:19 np0005538515.localdomain podman[324177]: 2025-11-28 10:13:19.971101286 +0000 UTC m=+0.076348208 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:13:19 np0005538515.localdomain podman[324177]: 2025-11-28 10:13:19.983393255 +0000 UTC m=+0.088640167 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:13:19 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v658: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:13:19 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:13:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Nov 28 10:13:20 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:20 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID bob with tenant 38de2f991c8946e4ad86ddc6b9c2ae73
Nov 28 10:13:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:13:20 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:20 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:21 np0005538515.localdomain ceph-mon[301134]: pgmap v658: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:13:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:13:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:13:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:21.711 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v659: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 128 KiB/s wr, 7 op/s
Nov 28 10:13:22 np0005538515.localdomain ceph-mon[301134]: pgmap v659: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 128 KiB/s wr, 7 op/s
Nov 28 10:13:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:23.344 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:23 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:13:23 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:13:23 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:13:23 np0005538515.localdomain podman[324219]: 2025-11-28 10:13:23.380558304 +0000 UTC m=+0.058361212 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:13:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:13:23 np0005538515.localdomain podman[324233]: 2025-11-28 10:13:23.500220899 +0000 UTC m=+0.088042350 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:13:23 np0005538515.localdomain podman[324233]: 2025-11-28 10:13:23.514580932 +0000 UTC m=+0.102402433 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 10:13:23 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:13:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v660: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 4 op/s
Nov 28 10:13:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:24.187 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:24 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/.meta.tmp'
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/.meta.tmp' to config b'/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/.meta'
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "format": "json"}]: dispatch
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:25 np0005538515.localdomain ceph-mon[301134]: pgmap v660: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 4 op/s
Nov 28 10:13:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:13:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v661: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 4 op/s
Nov 28 10:13:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "format": "json"}]: dispatch
Nov 28 10:13:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:26.562 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:26 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:13:26 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:13:26 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:13:26 np0005538515.localdomain podman[324273]: 2025-11-28 10:13:26.563854932 +0000 UTC m=+0.093613861 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:13:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:26.714 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:27 np0005538515.localdomain ceph-mon[301134]: pgmap v661: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 65 KiB/s wr, 4 op/s
Nov 28 10:13:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:13:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:13:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:13:27 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v662: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 79 KiB/s wr, 5 op/s
Nov 28 10:13:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Nov 28 10:13:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} v 0)
Nov 28 10:13:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch
Nov 28 10:13:28 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Nov 28 10:13:28 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:28 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, tenant_id:38de2f991c8946e4ad86ddc6b9c2ae73, vol_name:cephfs) < ""
Nov 28 10:13:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:13:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:13:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:13:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:13:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:13:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1"
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: pgmap v662: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 79 KiB/s wr, 5 op/s
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "tenant_id": "38de2f991c8946e4ad86ddc6b9c2ae73", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]}]': finished
Nov 28 10:13:29 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:29.213 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v663: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 77 KiB/s wr, 4 op/s
Nov 28 10:13:31 np0005538515.localdomain ceph-mon[301134]: pgmap v663: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 77 KiB/s wr, 4 op/s
Nov 28 10:13:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:31.749 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v664: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 112 KiB/s wr, 6 op/s
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} v 0)
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.682585) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812682692, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2594, "num_deletes": 259, "total_data_size": 3815636, "memory_usage": 3876696, "flush_reason": "Manual Compaction"}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812705459, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2498832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30476, "largest_seqno": 33065, "table_properties": {"data_size": 2488792, "index_size": 6097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25107, "raw_average_key_size": 22, "raw_value_size": 2467168, "raw_average_value_size": 2175, "num_data_blocks": 263, "num_entries": 1134, "num_filter_entries": 1134, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324696, "oldest_key_time": 1764324696, "file_creation_time": 1764324812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 22927 microseconds, and 8378 cpu microseconds.
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.705526) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2498832 bytes OK
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.705556) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.707885) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.707909) EVENT_LOG_v1 {"time_micros": 1764324812707902, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.707933) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 3803279, prev total WAL file size 3803279, number of live WAL files 2.
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.708990) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2440KB)], [48(17MB)]
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812709040, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 21214289, "oldest_snapshot_seqno": -1}
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=bob, client_metadata.root=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:13:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 14367 keys, 19606996 bytes, temperature: kUnknown
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812845176, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 19606996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19522778, "index_size": 47189, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 383945, "raw_average_key_size": 26, "raw_value_size": 19276939, "raw_average_value_size": 1341, "num_data_blocks": 1771, "num_entries": 14367, "num_filter_entries": 14367, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.845529) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 19606996 bytes
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.847682) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.7 rd, 143.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 17.8 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(16.3) write-amplify(7.8) OK, records in: 14908, records dropped: 541 output_compression: NoCompression
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.847715) EVENT_LOG_v1 {"time_micros": 1764324812847698, "job": 28, "event": "compaction_finished", "compaction_time_micros": 136271, "compaction_time_cpu_micros": 50852, "output_level": 6, "num_output_files": 1, "total_output_size": 19606996, "num_input_records": 14908, "num_output_records": 14367, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812848318, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812851209, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.708929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.851307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.851313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.851316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.851320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:13:32.851322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: pgmap v664: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 112 KiB/s wr, 6 op/s
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch
Nov 28 10:13:32 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]}]': finished
Nov 28 10:13:33 np0005538515.localdomain sudo[324294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:13:33 np0005538515.localdomain sudo[324294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:33 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:13:33 np0005538515.localdomain sudo[324294]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:33 np0005538515.localdomain sudo[324318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Nov 28 10:13:33 np0005538515.localdomain sudo[324318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:33 np0005538515.localdomain podman[324312]: 2025-11-28 10:13:33.83016436 +0000 UTC m=+0.082148286 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 28 10:13:33 np0005538515.localdomain podman[324312]: 2025-11-28 10:13:33.843541233 +0000 UTC m=+0.095525169 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter)
Nov 28 10:13:33 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:13:33 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v665: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:13:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:13:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:34.249 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:13:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:13:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:13:34 np0005538515.localdomain sudo[324318]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:13:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:13:34 np0005538515.localdomain sudo[324371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:13:34 np0005538515.localdomain sudo[324371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:34 np0005538515.localdomain sudo[324371]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:34 np0005538515.localdomain sudo[324389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:13:34 np0005538515.localdomain sudo[324389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:35 np0005538515.localdomain sudo[324389]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: pgmap v665: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 36881bde-4e60-4842-9021-b86e61a7a555 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 36881bde-4e60-4842-9021-b86e61a7a555 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 36881bde-4e60-4842-9021-b86e61a7a555 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:13:35 np0005538515.localdomain sudo[324439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:13:35 np0005538515.localdomain sudo[324439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:13:35 np0005538515.localdomain sudo[324439]: pam_unix(sudo:session): session closed for user root
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:13:35 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:13:35.835 261346 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:13:35Z, description=, device_id=b0257272-76a6-44da-9e3f-446bfab91a2f, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce456850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f19ce456fa0>], id=d0e1c2d7-bdcb-42a0-9bb5-3966853464ac, ip_allocation=immediate, mac_address=fa:16:3e:b0:0a:c9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T08:32:19Z, description=, dns_domain=, id=887157f9-a765-40c0-8be5-1fba3ddea8f8, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=9dda653c53224db086060962b0702694, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['5f7de60c-f82a-4f40-b803-51cb08cbf2e3'], tags=[], tenant_id=9dda653c53224db086060962b0702694, updated_at=2025-11-28T08:32:25Z, vlan_transparent=None, network_id=887157f9-a765-40c0-8be5-1fba3ddea8f8, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3862, status=DOWN, tags=[], tenant_id=, updated_at=2025-11-28T10:13:35Z on network 887157f9-a765-40c0-8be5-1fba3ddea8f8
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0)
Nov 28 10:13:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=bob, client_metadata.root=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v666: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:13:36 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 2 addresses
Nov 28 10:13:36 np0005538515.localdomain podman[324473]: 2025-11-28 10:13:36.024680934 +0000 UTC m=+0.050301844 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 10:13:36 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:13:36 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:13:36 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:13:36 np0005538515.localdomain neutron_dhcp_agent[261342]: 2025-11-28 10:13:36.294 261346 INFO neutron.agent.dhcp.agent [None req-e8d73316-c6da-412d-abde-99448150d6b0 - - - - - -] DHCP configuration for ports {'d0e1c2d7-bdcb-42a0-9bb5-3966853464ac'} is completed
Nov 28 10:13:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:36.555 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:36.752 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "auth_id": "bob", "format": "json"}]: dispatch
Nov 28 10:13:37 np0005538515.localdomain ceph-mon[301134]: pgmap v666: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:13:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:13:37.641 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:13:37 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:13:37.642 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:13:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:37.674 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v667: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 101 KiB/s wr, 5 op/s
Nov 28 10:13:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:39.279 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:39 np0005538515.localdomain ceph-mon[301134]: pgmap v667: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 101 KiB/s wr, 5 op/s
Nov 28 10:13:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v668: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 87 KiB/s wr, 4 op/s
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1f8e6c04-5771-4f46-846b-71f913803117", "format": "json"}]: dispatch
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1f8e6c04-5771-4f46-846b-71f913803117, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1f8e6c04-5771-4f46-846b-71f913803117, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:13:40 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:13:40.079+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1f8e6c04-5771-4f46-846b-71f913803117' of type subvolume
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1f8e6c04-5771-4f46-846b-71f913803117' of type subvolume
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117'' moved to trashcan
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:13:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1f8e6c04-5771-4f46-846b-71f913803117, vol_name:cephfs) < ""
Nov 28 10:13:41 np0005538515.localdomain ceph-mon[301134]: pgmap v668: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 87 KiB/s wr, 4 op/s
Nov 28 10:13:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1f8e6c04-5771-4f46-846b-71f913803117", "format": "json"}]: dispatch
Nov 28 10:13:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1f8e6c04-5771-4f46-846b-71f913803117", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:41.801 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v669: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 114 KiB/s wr, 5 op/s
Nov 28 10:13:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:42.510 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:42 np0005538515.localdomain ceph-mon[301134]: pgmap v669: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 114 KiB/s wr, 5 op/s
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "format": "json"}]: dispatch
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:13:43 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:13:43.252+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50' of type subvolume
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50' of type subvolume
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50'' moved to trashcan
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50, vol_name:cephfs) < ""
Nov 28 10:13:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "format": "json"}]: dispatch
Nov 28 10:13:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "force": true, "format": "json"}]: dispatch
Nov 28 10:13:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v670: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s wr, 3 op/s
Nov 28 10:13:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:44.282 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:44 np0005538515.localdomain ceph-mon[301134]: pgmap v670: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s wr, 3 op/s
Nov 28 10:13:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v671: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s wr, 3 op/s
Nov 28 10:13:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:46.833 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:47 np0005538515.localdomain ceph-mon[301134]: pgmap v671: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s wr, 3 op/s
Nov 28 10:13:47 np0005538515.localdomain dnsmasq[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/addn_hosts - 1 addresses
Nov 28 10:13:47 np0005538515.localdomain podman[324513]: 2025-11-28 10:13:47.116758784 +0000 UTC m=+0.054487764 container kill d6ba70e100412f39fbf43f2359746885ccd151dd93bddc3682d2651a49a36c62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-887157f9-a765-40c0-8be5-1fba3ddea8f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:13:47 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/host
Nov 28 10:13:47 np0005538515.localdomain dnsmasq-dhcp[310862]: read /var/lib/neutron/dhcp/887157f9-a765-40c0-8be5-1fba3ddea8f8/opts
Nov 28 10:13:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:47.125 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: tmp-crun.ErMjot.mount: Deactivated successfully.
Nov 28 10:13:47 np0005538515.localdomain podman[324529]: 2025-11-28 10:13:47.263485483 +0000 UTC m=+0.111724590 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:13:47 np0005538515.localdomain podman[324529]: 2025-11-28 10:13:47.294985955 +0000 UTC m=+0.143225072 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:13:47 np0005538515.localdomain podman[324530]: 2025-11-28 10:13:47.308211123 +0000 UTC m=+0.153180529 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:13:47 np0005538515.localdomain podman[324530]: 2025-11-28 10:13:47.31295542 +0000 UTC m=+0.157924816 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:13:47 np0005538515.localdomain podman[324528]: 2025-11-28 10:13:47.221415964 +0000 UTC m=+0.074813260 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:13:47 np0005538515.localdomain podman[324567]: 2025-11-28 10:13:47.377798952 +0000 UTC m=+0.137015951 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:13:47 np0005538515.localdomain podman[324528]: 2025-11-28 10:13:47.402314229 +0000 UTC m=+0.255711535 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:13:47 np0005538515.localdomain podman[324567]: 2025-11-28 10:13:47.456701597 +0000 UTC m=+0.215918606 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:13:47 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:13:47 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:13:47.644 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:13:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v672: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 112 KiB/s wr, 5 op/s
Nov 28 10:13:49 np0005538515.localdomain ceph-mon[301134]: pgmap v672: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 112 KiB/s wr, 5 op/s
Nov 28 10:13:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:49.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:49.286 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v673: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 2 op/s
Nov 28 10:13:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:50.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:13:50.856 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:13:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:13:50.857 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:13:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:13:50.857 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:13:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:13:50 np0005538515.localdomain systemd[1]: tmp-crun.l8XfvL.mount: Deactivated successfully.
Nov 28 10:13:50 np0005538515.localdomain podman[324614]: 2025-11-28 10:13:50.974110879 +0000 UTC m=+0.074573593 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:13:50 np0005538515.localdomain podman[324614]: 2025-11-28 10:13:50.983321663 +0000 UTC m=+0.083784357 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:13:50 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:13:51 np0005538515.localdomain ceph-mon[301134]: pgmap v673: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 2 op/s
Nov 28 10:13:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:51.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:51.866 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v674: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 3 op/s
Nov 28 10:13:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:52.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:52.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:13:52 np0005538515.localdomain ceph-mon[301134]: pgmap v674: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 60 KiB/s wr, 3 op/s
Nov 28 10:13:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:53.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:53.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:13:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:53.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:13:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:53.375 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:13:53 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:13:53 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3360059162' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:53 np0005538515.localdomain podman[324637]: 2025-11-28 10:13:53.971527018 +0000 UTC m=+0.082807558 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:13:53 np0005538515.localdomain podman[324637]: 2025-11-28 10:13:53.986584103 +0000 UTC m=+0.097864633 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 28 10:13:54 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:13:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v675: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 33 KiB/s wr, 2 op/s
Nov 28 10:13:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:54.290 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:54.370 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:54 np0005538515.localdomain ceph-mon[301134]: pgmap v675: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 33 KiB/s wr, 2 op/s
Nov 28 10:13:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2441468294' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:55.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:55.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/.meta.tmp'
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/.meta.tmp' to config b'/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/.meta'
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "format": "json"}]: dispatch
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:13:55 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:13:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "format": "json"}]: dispatch
Nov 28 10:13:55 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:13:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v676: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 33 KiB/s wr, 2 op/s
Nov 28 10:13:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:13:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:56.905 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:56 np0005538515.localdomain ceph-mon[301134]: pgmap v676: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 33 KiB/s wr, 2 op/s
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.269 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.270 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.270 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.270 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.271 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:13:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:13:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:13:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:13:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:13:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:13:57 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:13:57 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2582566830' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.673 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.890 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.892 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11454MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.892 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.893 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:13:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2582566830' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.997 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:13:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:57.998 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v677: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 46 KiB/s wr, 3 op/s
Nov 28 10:13:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:58.015 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:13:58 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:13:58 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1398638247' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:58.418 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.402s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:13:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:58.424 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:13:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:58.441 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:13:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:58.444 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:13:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:58.445 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/.meta.tmp'
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/.meta.tmp' to config b'/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/.meta'
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "format": "json"}]: dispatch
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:13:58 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:13:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:13:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:13:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:13:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:13:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:13:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19259 "" "Go-http-client/1.1"
Nov 28 10:13:58 np0005538515.localdomain ceph-mon[301134]: pgmap v677: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 46 KiB/s wr, 3 op/s
Nov 28 10:13:58 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1398638247' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:13:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:13:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:59.332 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:13:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:13:59.446 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:00 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "format": "json"}]: dispatch
Nov 28 10:14:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v678: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:14:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:14:01 np0005538515.localdomain ceph-mon[301134]: pgmap v678: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Nov 28 10:14:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:01.957 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v679: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s wr, 1 op/s
Nov 28 10:14:02 np0005538515.localdomain ceph-mon[301134]: pgmap v679: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s wr, 1 op/s
Nov 28 10:14:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1439035389' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, tenant_id:301971e834c14ea7aa009696c3f04782, vol_name:cephfs) < ""
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.Joe", "format": "json"} v 0)
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID Joe with tenant 301971e834c14ea7aa009696c3f04782
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, tenant_id:301971e834c14ea7aa009696c3f04782, vol_name:cephfs) < ""
Nov 28 10:14:03 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2102988728' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:03 np0005538515.localdomain systemd[1]: tmp-crun.VeoBdq.mount: Deactivated successfully.
Nov 28 10:14:03 np0005538515.localdomain podman[324702]: 2025-11-28 10:14:03.980386459 +0000 UTC m=+0.088075841 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 10:14:03 np0005538515.localdomain podman[324702]: 2025-11-28 10:14:03.992440451 +0000 UTC m=+0.100129863 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 10:14:04 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:14:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v680: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Nov 28 10:14:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:04.367 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:04 np0005538515.localdomain ceph-mon[301134]: pgmap v680: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:14:05
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['volumes', 'vms', 'manila_data', '.mgr', 'backups', 'manila_metadata', 'images']
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:14:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v681: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 5.452610273590173e-07 of space, bias 1.0, pg target 0.00010850694444444444 quantized to 32 (current 32)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002426956832774986 of space, bias 4.0, pg target 1.931857638888889 quantized to 16 (current 16)
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:14:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:07.017 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/.meta.tmp'
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/.meta.tmp' to config b'/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/.meta'
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "format": "json"}]: dispatch
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:07 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:07 np0005538515.localdomain ceph-mon[301134]: pgmap v681: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s wr, 1 op/s
Nov 28 10:14:07 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:14:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v682: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s wr, 2 op/s
Nov 28 10:14:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:08 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "format": "json"}]: dispatch
Nov 28 10:14:09 np0005538515.localdomain ceph-mon[301134]: pgmap v682: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s wr, 2 op/s
Nov 28 10:14:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:09.377 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v683: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:14:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, tenant_id:b90c445933704341b38d135548fb5388, vol_name:cephfs) < ""
Nov 28 10:14:10 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.Joe", "format": "json"} v 0)
Nov 28 10:14:10 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:10 np0005538515.localdomain ceph-mgr[286188]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: Joe is already in use
Nov 28 10:14:10 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, tenant_id:b90c445933704341b38d135548fb5388, vol_name:cephfs) < ""
Nov 28 10:14:10 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:10.410+0000 7fcc87448640 -1 mgr.server reply reply (1) Operation not permitted auth ID: Joe is already in use
Nov 28 10:14:10 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (1) Operation not permitted auth ID: Joe is already in use
Nov 28 10:14:11 np0005538515.localdomain ceph-mon[301134]: pgmap v683: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:14:11 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:11 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v684: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 75 KiB/s wr, 3 op/s
Nov 28 10:14:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:12.061 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:12 np0005538515.localdomain ceph-mon[301134]: pgmap v684: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 75 KiB/s wr, 3 op/s
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4282388622' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4282388622' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-241168775, format:json, prefix:fs subvolume authorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, tenant_id:b90c445933704341b38d135548fb5388, vol_name:cephfs) < ""
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} v 0)
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID tempest-cephx-id-241168775 with tenant b90c445933704341b38d135548fb5388
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-241168775, format:json, prefix:fs subvolume authorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, tenant_id:b90c445933704341b38d135548fb5388, vol_name:cephfs) < ""
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4282388622' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/4282388622' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:13 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v685: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:14:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:14.404 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:14 np0005538515.localdomain ceph-mon[301134]: pgmap v685: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v686: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:14:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes WARNING volumes.fs.operations.versions.subvolume_v1] deauthorized called for already-removed authID 'Joe' for subvolume 'd56bb3f2-efa0-4328-9320-c5298bccaeb7'
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=Joe, client_metadata.root=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:14:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:17.096 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:17 np0005538515.localdomain ceph-mon[301134]: pgmap v686: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:14:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:14:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:14:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:14:17 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:14:17 np0005538515.localdomain podman[324722]: 2025-11-28 10:14:17.992714175 +0000 UTC m=+0.095927492 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:14:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v687: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 94 KiB/s wr, 3 op/s
Nov 28 10:14:18 np0005538515.localdomain podman[324722]: 2025-11-28 10:14:18.026925532 +0000 UTC m=+0.130138839 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:14:18 np0005538515.localdomain systemd[1]: tmp-crun.K0wRry.mount: Deactivated successfully.
Nov 28 10:14:18 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:14:18 np0005538515.localdomain podman[324724]: 2025-11-28 10:14:18.050449628 +0000 UTC m=+0.144314106 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:14:18 np0005538515.localdomain podman[324724]: 2025-11-28 10:14:18.07935682 +0000 UTC m=+0.173221298 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:14:18 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:14:18 np0005538515.localdomain podman[324723]: 2025-11-28 10:14:18.098040127 +0000 UTC m=+0.194412602 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:14:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:18 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:18 np0005538515.localdomain podman[324730]: 2025-11-28 10:14:18.179828451 +0000 UTC m=+0.267182218 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:14:18 np0005538515.localdomain podman[324730]: 2025-11-28 10:14:18.18821623 +0000 UTC m=+0.275570037 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:14:18 np0005538515.localdomain podman[324723]: 2025-11-28 10:14:18.19725735 +0000 UTC m=+0.293629855 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:14:18 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:14:18 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:14:18 np0005538515.localdomain ovn_controller[152726]: 2025-11-28T10:14:18Z|00196|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 10:14:19 np0005538515.localdomain ceph-mon[301134]: pgmap v687: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 94 KiB/s wr, 3 op/s
Nov 28 10:14:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:19.439 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v688: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 55 KiB/s wr, 2 op/s
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "format": "json"}]: dispatch
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-241168775, format:json, prefix:fs subvolume deauthorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} v 0)
Nov 28 10:14:20 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch
Nov 28 10:14:20 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} v 0)
Nov 28 10:14:20 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-241168775, format:json, prefix:fs subvolume deauthorize, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "format": "json"}]: dispatch
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-241168775, format:json, prefix:fs subvolume evict, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-241168775, client_metadata.root=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:14:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-241168775, format:json, prefix:fs subvolume evict, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: pgmap v688: 177 pgs: 177 active+clean; 227 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 55 KiB/s wr, 2 op/s
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "format": "json"}]: dispatch
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"}]': finished
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "auth_id": "tempest-cephx-id-241168775", "format": "json"}]: dispatch
Nov 28 10:14:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:14:21 np0005538515.localdomain podman[324807]: 2025-11-28 10:14:21.979821577 +0000 UTC m=+0.085810160 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:14:21 np0005538515.localdomain podman[324807]: 2025-11-28 10:14:21.98478344 +0000 UTC m=+0.090772063 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:14:21 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:14:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v689: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s wr, 4 op/s
Nov 28 10:14:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:22.116 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:22 np0005538515.localdomain ceph-mon[301134]: pgmap v689: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s wr, 4 op/s
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.Joe", "format": "json"} v 0)
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0)
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=Joe, client_metadata.root=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:14:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Nov 28 10:14:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "auth_id": "Joe", "format": "json"}]: dispatch
Nov 28 10:14:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v690: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 3 op/s
Nov 28 10:14:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:24.470 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:24 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:14:24 np0005538515.localdomain ceph-mon[301134]: pgmap v690: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 3 op/s
Nov 28 10:14:24 np0005538515.localdomain systemd[1]: tmp-crun.ERSGU6.mount: Deactivated successfully.
Nov 28 10:14:24 np0005538515.localdomain podman[324831]: 2025-11-28 10:14:24.982646944 +0000 UTC m=+0.091533888 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 28 10:14:24 np0005538515.localdomain podman[324831]: 2025-11-28 10:14:24.998599645 +0000 UTC m=+0.107486589 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:14:25 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:14:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v691: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 3 op/s
Nov 28 10:14:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "admin", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:admin, format:json, prefix:fs subvolume authorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, tenant_id:301971e834c14ea7aa009696c3f04782, vol_name:cephfs) < ""
Nov 28 10:14:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin", "format": "json"} v 0)
Nov 28 10:14:26 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Nov 28 10:14:26 np0005538515.localdomain ceph-mgr[286188]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: admin exists and not created by mgr plugin. Not allowed to modify
Nov 28 10:14:26 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:admin, format:json, prefix:fs subvolume authorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, tenant_id:301971e834c14ea7aa009696c3f04782, vol_name:cephfs) < ""
Nov 28 10:14:26 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:26.878+0000 7fcc87448640 -1 mgr.server reply reply (1) Operation not permitted auth ID: admin exists and not created by mgr plugin. Not allowed to modify
Nov 28 10:14:26 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (1) Operation not permitted auth ID: admin exists and not created by mgr plugin. Not allowed to modify
Nov 28 10:14:27 np0005538515.localdomain ceph-mon[301134]: pgmap v691: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 64 KiB/s wr, 3 op/s
Nov 28 10:14:27 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Nov 28 10:14:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:27.157 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:14:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:14:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v692: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 100 KiB/s wr, 5 op/s
Nov 28 10:14:28 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "admin", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:14:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:14:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:14:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:14:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:14:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19251 "" "Go-http-client/1.1"
Nov 28 10:14:29 np0005538515.localdomain ceph-mon[301134]: pgmap v692: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 100 KiB/s wr, 5 op/s
Nov 28 10:14:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:29.501 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v693: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, tenant_id:301971e834c14ea7aa009696c3f04782, vol_name:cephfs) < ""
Nov 28 10:14:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.david", "format": "json"} v 0)
Nov 28 10:14:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:30 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: Creating meta for ID david with tenant 301971e834c14ea7aa009696c3f04782
Nov 28 10:14:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} v 0)
Nov 28 10:14:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:30 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, tenant_id:301971e834c14ea7aa009696c3f04782, vol_name:cephfs) < ""
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: pgmap v693: 177 pgs: 177 active+clean; 228 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "tenant_id": "301971e834c14ea7aa009696c3f04782", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"}]': finished
Nov 28 10:14:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v694: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 98 KiB/s wr, 5 op/s
Nov 28 10:14:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:32.185 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:32 np0005538515.localdomain ceph-mon[301134]: pgmap v694: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 98 KiB/s wr, 5 op/s
Nov 28 10:14:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:33 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v695: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/e1e8ed83-707d-47a8-914a-3aa1c73e18ce/.meta.tmp'
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/e1e8ed83-707d-47a8-914a-3aa1c73e18ce/.meta.tmp' to config b'/volumes/_nogroup/e1e8ed83-707d-47a8-914a-3aa1c73e18ce/.meta'
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "format": "json"}]: dispatch
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:34 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:34.507 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:34 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:14:34 np0005538515.localdomain podman[324850]: 2025-11-28 10:14:34.973270061 +0000 UTC m=+0.080935080 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 10:14:34 np0005538515.localdomain podman[324850]: 2025-11-28 10:14:34.985294622 +0000 UTC m=+0.092959671 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 10:14:34 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:14:35 np0005538515.localdomain ceph-mon[301134]: pgmap v695: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "format": "json"}]: dispatch
Nov 28 10:14:35 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:14:35 np0005538515.localdomain sudo[324870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:14:35 np0005538515.localdomain sudo[324870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:14:35 np0005538515.localdomain sudo[324870]: pam_unix(sudo:session): session closed for user root
Nov 28 10:14:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:14:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:14:35 np0005538515.localdomain sudo[324888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:14:35 np0005538515.localdomain sudo[324888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:14:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:14:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:14:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:14:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v696: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:36 np0005538515.localdomain sudo[324888]: pam_unix(sudo:session): session closed for user root
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 488fe40d-168e-47e8-9e59-4eefa7962b43 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 488fe40d-168e-47e8-9e59-4eefa7962b43 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 488fe40d-168e-47e8-9e59-4eefa7962b43 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, tenant_id:b90c445933704341b38d135548fb5388, vol_name:cephfs) < ""
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.david", "format": "json"} v 0)
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: david is already in use
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, tenant_id:b90c445933704341b38d135548fb5388, vol_name:cephfs) < ""
Nov 28 10:14:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:36.727+0000 7fcc87448640 -1 mgr.server reply reply (1) Operation not permitted auth ID: david is already in use
Nov 28 10:14:36 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (1) Operation not permitted auth ID: david is already in use
Nov 28 10:14:36 np0005538515.localdomain sudo[324938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:14:36 np0005538515.localdomain sudo[324938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:14:36 np0005538515.localdomain sudo[324938]: pam_unix(sudo:session): session closed for user root
Nov 28 10:14:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: pgmap v696: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 67 KiB/s wr, 3 op/s
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "tenant_id": "b90c445933704341b38d135548fb5388", "access_level": "rw", "format": "json"}]: dispatch
Nov 28 10:14:37 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:37.220 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v697: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 95 KiB/s wr, 5 op/s
Nov 28 10:14:39 np0005538515.localdomain ceph-mon[301134]: pgmap v697: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 95 KiB/s wr, 5 op/s
Nov 28 10:14:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:39.537 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v698: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 59 KiB/s wr, 2 op/s
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes WARNING volumes.fs.operations.versions.subvolume_v1] deauthorized called for already-removed authID 'david' for subvolume 'e1e8ed83-707d-47a8-914a-3aa1c73e18ce'
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=david, client_metadata.root=/volumes/_nogroup/e1e8ed83-707d-47a8-914a-3aa1c73e18ce/984304c3-0bfb-45a6-bfeb-644c624ba49e
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:14:40 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:41 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:14:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:14:41 np0005538515.localdomain ceph-mon[301134]: pgmap v698: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 59 KiB/s wr, 2 op/s
Nov 28 10:14:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:41 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:14:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v699: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 68 KiB/s wr, 3 op/s
Nov 28 10:14:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:42.260 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:42 np0005538515.localdomain ceph-mon[301134]: pgmap v699: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 68 KiB/s wr, 3 op/s
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.david", "format": "json"} v 0)
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0)
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=david, client_metadata.root=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Nov 28 10:14:43 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Nov 28 10:14:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "david", "format": "json"}]: dispatch
Nov 28 10:14:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v700: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s wr, 1 op/s
Nov 28 10:14:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:44.540 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:44 np0005538515.localdomain ceph-mon[301134]: pgmap v700: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s wr, 1 op/s
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v701: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s wr, 1 op/s
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "format": "json"}]: dispatch
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:46 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:46.834+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'e1e8ed83-707d-47a8-914a-3aa1c73e18ce' of type subvolume
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'e1e8ed83-707d-47a8-914a-3aa1c73e18ce' of type subvolume
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/e1e8ed83-707d-47a8-914a-3aa1c73e18ce'' moved to trashcan
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:14:46 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:e1e8ed83-707d-47a8-914a-3aa1c73e18ce, vol_name:cephfs) < ""
Nov 28 10:14:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:47 np0005538515.localdomain ceph-mon[301134]: pgmap v701: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s wr, 1 op/s
Nov 28 10:14:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:47.290 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v702: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 67 KiB/s wr, 3 op/s
Nov 28 10:14:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "format": "json"}]: dispatch
Nov 28 10:14:48 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e1e8ed83-707d-47a8-914a-3aa1c73e18ce", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:14:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:14:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:14:48 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:14:49 np0005538515.localdomain podman[324960]: 2025-11-28 10:14:49.002630883 +0000 UTC m=+0.097468701 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 10:14:49 np0005538515.localdomain podman[324960]: 2025-11-28 10:14:49.031834704 +0000 UTC m=+0.126672632 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:14:49 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:14:49 np0005538515.localdomain podman[324961]: 2025-11-28 10:14:49.046973882 +0000 UTC m=+0.136289439 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:14:49 np0005538515.localdomain podman[324961]: 2025-11-28 10:14:49.055632449 +0000 UTC m=+0.144948056 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:14:49 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:14:49 np0005538515.localdomain podman[324958]: 2025-11-28 10:14:49.104954991 +0000 UTC m=+0.200064517 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 28 10:14:49 np0005538515.localdomain podman[324958]: 2025-11-28 10:14:49.115685453 +0000 UTC m=+0.210794989 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Nov 28 10:14:49 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:14:49 np0005538515.localdomain ceph-mon[301134]: pgmap v702: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 67 KiB/s wr, 3 op/s
Nov 28 10:14:49 np0005538515.localdomain podman[324959]: 2025-11-28 10:14:49.209760297 +0000 UTC m=+0.304855853 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 10:14:49 np0005538515.localdomain podman[324959]: 2025-11-28 10:14:49.266458517 +0000 UTC m=+0.361554043 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 10:14:49 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:14:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:49.565 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v703: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s wr, 2 op/s
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "format": "json"}]: dispatch
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:50 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:50.054+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd56bb3f2-efa0-4328-9320-c5298bccaeb7' of type subvolume
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd56bb3f2-efa0-4328-9320-c5298bccaeb7' of type subvolume
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7'' moved to trashcan
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:14:50 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d56bb3f2-efa0-4328-9320-c5298bccaeb7, vol_name:cephfs) < ""
Nov 28 10:14:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:14:50.857 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:14:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:14:50.858 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:14:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:14:50.858 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:14:51 np0005538515.localdomain ceph-mon[301134]: pgmap v703: 177 pgs: 177 active+clean; 229 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s wr, 2 op/s
Nov 28 10:14:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "format": "json"}]: dispatch
Nov 28 10:14:51 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d56bb3f2-efa0-4328-9320-c5298bccaeb7", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:51.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v704: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 73 KiB/s wr, 3 op/s
Nov 28 10:14:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:52.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:52.340 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:14:52 np0005538515.localdomain ceph-mon[301134]: pgmap v704: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 73 KiB/s wr, 3 op/s
Nov 28 10:14:52 np0005538515.localdomain podman[325041]: 2025-11-28 10:14:52.979487108 +0000 UTC m=+0.084213172 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:14:52 np0005538515.localdomain podman[325041]: 2025-11-28 10:14:52.992180819 +0000 UTC m=+0.096906913 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:14:53 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "format": "json"}]: dispatch
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:8d719993-3b66-454f-a026-687de7e6b3e4, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:8d719993-3b66-454f-a026-687de7e6b3e4, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:53 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:53.275+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '8d719993-3b66-454f-a026-687de7e6b3e4' of type subvolume
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '8d719993-3b66-454f-a026-687de7e6b3e4' of type subvolume
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4'' moved to trashcan
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:14:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:8d719993-3b66-454f-a026-687de7e6b3e4, vol_name:cephfs) < ""
Nov 28 10:14:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "format": "json"}]: dispatch
Nov 28 10:14:53 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8d719993-3b66-454f-a026-687de7e6b3e4", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v705: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:14:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:54.235 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:54.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:54.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:14:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:54.604 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:54 np0005538515.localdomain ceph-mon[301134]: pgmap v705: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:14:54 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2621744934' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:55.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:55.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:14:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:55.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:14:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:55.263 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:14:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:55.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:55 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:14:55 np0005538515.localdomain systemd[1]: tmp-crun.CdmIMV.mount: Deactivated successfully.
Nov 28 10:14:55 np0005538515.localdomain podman[325065]: 2025-11-28 10:14:55.978526467 +0000 UTC m=+0.081938611 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 28 10:14:55 np0005538515.localdomain podman[325065]: 2025-11-28 10:14:55.991399935 +0000 UTC m=+0.094812079 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 10:14:56 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v706: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:14:56 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3985923543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:56.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "admin", "format": "json"}]: dispatch
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:admin, format:json, prefix:fs subvolume deauthorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: admin doesn't exist
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:admin, format:json, prefix:fs subvolume deauthorize, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:56 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:56.630+0000 7fcc87448640 -1 mgr.server reply reply (2) No such file or directory auth ID: admin doesn't exist
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (2) No such file or directory auth ID: admin doesn't exist
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "format": "json"}]: dispatch
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4c25470d-c14c-4093-b430-b79c735aaf06, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4c25470d-c14c-4093-b430-b79c735aaf06, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:14:56 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:14:56.731+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4c25470d-c14c-4093-b430-b79c735aaf06' of type subvolume
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4c25470d-c14c-4093-b430-b79c735aaf06' of type subvolume
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06'' moved to trashcan
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:14:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4c25470d-c14c-4093-b430-b79c735aaf06, vol_name:cephfs) < ""
Nov 28 10:14:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:14:57 np0005538515.localdomain ceph-mon[301134]: pgmap v706: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 65 KiB/s wr, 3 op/s
Nov 28 10:14:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "auth_id": "admin", "format": "json"}]: dispatch
Nov 28 10:14:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "format": "json"}]: dispatch
Nov 28 10:14:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:57.339 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:14:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:14:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:14:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v707: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 106 KiB/s wr, 5 op/s
Nov 28 10:14:58 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4c25470d-c14c-4093-b430-b79c735aaf06", "force": true, "format": "json"}]: dispatch
Nov 28 10:14:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:14:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:14:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:14:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:14:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:14:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19255 "" "Go-http-client/1.1"
Nov 28 10:14:59 np0005538515.localdomain ceph-mon[301134]: pgmap v707: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 106 KiB/s wr, 5 op/s
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.240 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.267 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.268 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.268 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.268 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.269 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.630 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:14:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:14:59 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/149668244' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.741 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.902 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.904 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11444MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.904 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.904 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.984 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:14:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:14:59.985 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.016 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:15:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v708: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 75 KiB/s wr, 3 op/s
Nov 28 10:15:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/149668244' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:15:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1184699795' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.469 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.477 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.492 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.494 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.494 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:15:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:15:00.760 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:15:00 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:15:00.761 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:15:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:00.781 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:01 np0005538515.localdomain ceph-mon[301134]: pgmap v708: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 75 KiB/s wr, 3 op/s
Nov 28 10:15:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1184699795' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v709: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 77 KiB/s wr, 5 op/s
Nov 28 10:15:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:02.386 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:02 np0005538515.localdomain ceph-mon[301134]: pgmap v709: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 77 KiB/s wr, 5 op/s
Nov 28 10:15:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1417585220' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:03 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2960680827' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v710: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 44 KiB/s wr, 3 op/s
Nov 28 10:15:04 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:04.671 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:04 np0005538515.localdomain ceph-mon[301134]: pgmap v710: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 44 KiB/s wr, 3 op/s
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:15:05
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['vms', 'volumes', '.mgr', 'manila_metadata', 'manila_data', 'images', 'backups']
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:15:05 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:15:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:15:05 np0005538515.localdomain podman[325128]: 2025-11-28 10:15:05.979156389 +0000 UTC m=+0.086457664 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350)
Nov 28 10:15:05 np0005538515.localdomain podman[325128]: 2025-11-28 10:15:05.992402707 +0000 UTC m=+0.099704032 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Nov 28 10:15:06 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v711: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 44 KiB/s wr, 3 op/s
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0026946799972082636 of space, bias 4.0, pg target 2.144965277777778 quantized to 16 (current 16)
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:15:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:07 np0005538515.localdomain ceph-mon[301134]: pgmap v711: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 44 KiB/s wr, 3 op/s
Nov 28 10:15:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:07.415 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v712: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 60 KiB/s wr, 3 op/s
Nov 28 10:15:09 np0005538515.localdomain ceph-mon[301134]: pgmap v712: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 60 KiB/s wr, 3 op/s
Nov 28 10:15:09 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:09.709 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v713: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 19 KiB/s wr, 1 op/s
Nov 28 10:15:10 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:15:10.763 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:15:11 np0005538515.localdomain ceph-mon[301134]: pgmap v713: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 19 KiB/s wr, 1 op/s
Nov 28 10:15:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v714: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 19 KiB/s wr, 1 op/s
Nov 28 10:15:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:12.453 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:12 np0005538515.localdomain ceph-mon[301134]: pgmap v714: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 19 KiB/s wr, 1 op/s
Nov 28 10:15:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 10:15:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1275385607' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:15:13 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 10:15:13 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1275385607' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:15:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1275385607' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:15:13 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/1275385607' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:15:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v715: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:14 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:14.750 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:15 np0005538515.localdomain ceph-mon[301134]: pgmap v715: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v716: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:17 np0005538515.localdomain ceph-mon[301134]: pgmap v716: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:17.487 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v717: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:e4f2bd9a-6730-4442-8982-1535b4534b94, vol_name:cephfs) < ""
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/e4f2bd9a-6730-4442-8982-1535b4534b94/.meta.tmp'
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/e4f2bd9a-6730-4442-8982-1535b4534b94/.meta.tmp' to config b'/volumes/_nogroup/e4f2bd9a-6730-4442-8982-1535b4534b94/.meta'
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:e4f2bd9a-6730-4442-8982-1535b4534b94, vol_name:cephfs) < ""
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "format": "json"}]: dispatch
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:e4f2bd9a-6730-4442-8982-1535b4534b94, vol_name:cephfs) < ""
Nov 28 10:15:18 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:e4f2bd9a-6730-4442-8982-1535b4534b94, vol_name:cephfs) < ""
Nov 28 10:15:19 np0005538515.localdomain ceph-mon[301134]: pgmap v717: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Nov 28 10:15:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "format": "json"}]: dispatch
Nov 28 10:15:19 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:19 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:19.796 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:15:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:15:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:15:19 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:15:19 np0005538515.localdomain podman[325156]: 2025-11-28 10:15:19.996169435 +0000 UTC m=+0.087660782 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:15:20 np0005538515.localdomain podman[325150]: 2025-11-28 10:15:19.96972866 +0000 UTC m=+0.069723039 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:15:20 np0005538515.localdomain podman[325149]: 2025-11-28 10:15:20.0291632 +0000 UTC m=+0.129391096 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 10:15:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v718: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:15:20 np0005538515.localdomain podman[325150]: 2025-11-28 10:15:20.049387694 +0000 UTC m=+0.149382063 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 10:15:20 np0005538515.localdomain podman[325148]: 2025-11-28 10:15:20.089794538 +0000 UTC m=+0.194244213 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 10:15:20 np0005538515.localdomain podman[325149]: 2025-11-28 10:15:20.098003581 +0000 UTC m=+0.198231487 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Nov 28 10:15:20 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:15:20 np0005538515.localdomain podman[325148]: 2025-11-28 10:15:20.150781206 +0000 UTC m=+0.255230901 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 10:15:20 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:15:20 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:15:20 np0005538515.localdomain podman[325156]: 2025-11-28 10:15:20.256337737 +0000 UTC m=+0.347829104 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:15:20 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:15:20 np0005538515.localdomain systemd[1]: tmp-crun.giB8y1.mount: Deactivated successfully.
Nov 28 10:15:21 np0005538515.localdomain ceph-mon[301134]: pgmap v718: 177 pgs: 177 active+clean; 230 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, vol_name:cephfs) < ""
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/98a016eb-e15e-4a92-95d1-45b6c6f58025/.meta.tmp'
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/98a016eb-e15e-4a92-95d1-45b6c6f58025/.meta.tmp' to config b'/volumes/_nogroup/98a016eb-e15e-4a92-95d1-45b6c6f58025/.meta'
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, vol_name:cephfs) < ""
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "format": "json"}]: dispatch
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, vol_name:cephfs) < ""
Nov 28 10:15:21 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, vol_name:cephfs) < ""
Nov 28 10:15:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v719: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:22 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:22.520 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:23 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "format": "json"}]: dispatch
Nov 28 10:15:23 np0005538515.localdomain ceph-mon[301134]: pgmap v719: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:15:23 np0005538515.localdomain podman[325229]: 2025-11-28 10:15:23.96725606 +0000 UTC m=+0.075909849 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:15:23 np0005538515.localdomain podman[325229]: 2025-11-28 10:15:23.982458729 +0000 UTC m=+0.091112568 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:15:23 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:15:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v720: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:24 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:24.799 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:25 np0005538515.localdomain ceph-mon[301134]: pgmap v720: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, vol_name:cephfs) < ""
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/183f7a7a-5ae4-4e98-be38-4edc7d9e437a/.meta.tmp'
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/183f7a7a-5ae4-4e98-be38-4edc7d9e437a/.meta.tmp' to config b'/volumes/_nogroup/183f7a7a-5ae4-4e98-be38-4edc7d9e437a/.meta'
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, vol_name:cephfs) < ""
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "format": "json"}]: dispatch
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, vol_name:cephfs) < ""
Nov 28 10:15:25 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, vol_name:cephfs) < ""
Nov 28 10:15:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v721: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "format": "json"}]: dispatch
Nov 28 10:15:26 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:26 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:15:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:26 np0005538515.localdomain podman[325252]: 2025-11-28 10:15:26.971906042 +0000 UTC m=+0.081768930 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 10:15:26 np0005538515.localdomain podman[325252]: 2025-11-28 10:15:26.984571931 +0000 UTC m=+0.094434819 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 10:15:26 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:15:27 np0005538515.localdomain ceph-mon[301134]: pgmap v721: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 12 KiB/s wr, 0 op/s
Nov 28 10:15:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:27.522 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:15:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:15:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v722: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:15:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:15:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:15:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:15:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:15:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:15:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19260 "" "Go-http-client/1.1"
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, vol_name:cephfs) < ""
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/8b4e8834-f80e-4f71-9b6d-4708a36d1242/.meta.tmp'
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/8b4e8834-f80e-4f71-9b6d-4708a36d1242/.meta.tmp' to config b'/volumes/_nogroup/8b4e8834-f80e-4f71-9b6d-4708a36d1242/.meta'
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, vol_name:cephfs) < ""
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "format": "json"}]: dispatch
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, vol_name:cephfs) < ""
Nov 28 10:15:29 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, vol_name:cephfs) < ""
Nov 28 10:15:29 np0005538515.localdomain ceph-mon[301134]: pgmap v722: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:15:29 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:29 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:29.832 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v723: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:15:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:30 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "format": "json"}]: dispatch
Nov 28 10:15:31 np0005538515.localdomain ceph-mon[301134]: pgmap v723: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:15:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v724: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s wr, 3 op/s
Nov 28 10:15:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:32.547 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "format": "json"}]: dispatch
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:32 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:15:32.924+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '8b4e8834-f80e-4f71-9b6d-4708a36d1242' of type subvolume
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '8b4e8834-f80e-4f71-9b6d-4708a36d1242' of type subvolume
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, vol_name:cephfs) < ""
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/8b4e8834-f80e-4f71-9b6d-4708a36d1242'' moved to trashcan
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:15:32 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:8b4e8834-f80e-4f71-9b6d-4708a36d1242, vol_name:cephfs) < ""
Nov 28 10:15:32 np0005538515.localdomain ceph-mon[301134]: pgmap v724: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s wr, 3 op/s
Nov 28 10:15:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "format": "json"}]: dispatch
Nov 28 10:15:34 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8b4e8834-f80e-4f71-9b6d-4708a36d1242", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v725: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:15:34 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:34.857 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:35 np0005538515.localdomain ceph-mon[301134]: pgmap v725: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:15:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:15:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:15:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:15:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:15:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:15:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v726: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "format": "json"}]: dispatch
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:15:36.139+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '183f7a7a-5ae4-4e98-be38-4edc7d9e437a' of type subvolume
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '183f7a7a-5ae4-4e98-be38-4edc7d9e437a' of type subvolume
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, vol_name:cephfs) < ""
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/183f7a7a-5ae4-4e98-be38-4edc7d9e437a'' moved to trashcan
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:15:36 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:183f7a7a-5ae4-4e98-be38-4edc7d9e437a, vol_name:cephfs) < ""
Nov 28 10:15:36 np0005538515.localdomain sudo[325271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:15:36 np0005538515.localdomain sudo[325271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:15:36 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:15:36 np0005538515.localdomain sudo[325271]: pam_unix(sudo:session): session closed for user root
Nov 28 10:15:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:36 np0005538515.localdomain sudo[325295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:15:36 np0005538515.localdomain sudo[325295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:15:36 np0005538515.localdomain podman[325288]: 2025-11-28 10:15:36.978363261 +0000 UTC m=+0.088207438 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, release=1755695350, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 10:15:36 np0005538515.localdomain podman[325288]: 2025-11-28 10:15:36.990097752 +0000 UTC m=+0.099941969 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git)
Nov 28 10:15:37 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: pgmap v726: 177 pgs: 177 active+clean; 231 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s wr, 2 op/s
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "format": "json"}]: dispatch
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "183f7a7a-5ae4-4e98-be38-4edc7d9e437a", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:37 np0005538515.localdomain sudo[325295]: pam_unix(sudo:session): session closed for user root
Nov 28 10:15:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:37.592 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:15:37 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev e93e93e0-f3e6-4b8e-b81d-53d2f3b70421 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:15:37 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev e93e93e0-f3e6-4b8e-b81d-53d2f3b70421 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:15:37 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event e93e93e0-f3e6-4b8e-b81d-53d2f3b70421 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:15:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:15:38 np0005538515.localdomain sudo[325359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:15:38 np0005538515.localdomain sudo[325359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:15:38 np0005538515.localdomain sudo[325359]: pam_unix(sudo:session): session closed for user root
Nov 28 10:15:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v727: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 107 KiB/s wr, 4 op/s
Nov 28 10:15:38 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:15:38 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:15:38 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:15:38 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "format": "json"}]: dispatch
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:39 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:15:39.341+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '98a016eb-e15e-4a92-95d1-45b6c6f58025' of type subvolume
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '98a016eb-e15e-4a92-95d1-45b6c6f58025' of type subvolume
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, vol_name:cephfs) < ""
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/98a016eb-e15e-4a92-95d1-45b6c6f58025'' moved to trashcan
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:15:39 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:98a016eb-e15e-4a92-95d1-45b6c6f58025, vol_name:cephfs) < ""
Nov 28 10:15:39 np0005538515.localdomain ceph-mon[301134]: pgmap v727: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 107 KiB/s wr, 4 op/s
Nov 28 10:15:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "format": "json"}]: dispatch
Nov 28 10:15:39 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "98a016eb-e15e-4a92-95d1-45b6c6f58025", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:39 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:39.881 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v728: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 66 KiB/s wr, 2 op/s
Nov 28 10:15:40 np0005538515.localdomain ceph-mon[301134]: pgmap v728: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 66 KiB/s wr, 2 op/s
Nov 28 10:15:41 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:15:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:15:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v729: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 90 KiB/s wr, 4 op/s
Nov 28 10:15:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "format": "json"}]: dispatch
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:e4f2bd9a-6730-4442-8982-1535b4534b94, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:e4f2bd9a-6730-4442-8982-1535b4534b94, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:15:42 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:15:42.533+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'e4f2bd9a-6730-4442-8982-1535b4534b94' of type subvolume
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'e4f2bd9a-6730-4442-8982-1535b4534b94' of type subvolume
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:e4f2bd9a-6730-4442-8982-1535b4534b94, vol_name:cephfs) < ""
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/e4f2bd9a-6730-4442-8982-1535b4534b94'' moved to trashcan
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:15:42 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:e4f2bd9a-6730-4442-8982-1535b4534b94, vol_name:cephfs) < ""
Nov 28 10:15:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:42.638 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:43 np0005538515.localdomain ceph-mon[301134]: pgmap v729: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 90 KiB/s wr, 4 op/s
Nov 28 10:15:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "format": "json"}]: dispatch
Nov 28 10:15:43 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e4f2bd9a-6730-4442-8982-1535b4534b94", "force": true, "format": "json"}]: dispatch
Nov 28 10:15:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v730: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 70 KiB/s wr, 3 op/s
Nov 28 10:15:44 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:44.912 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:45 np0005538515.localdomain ceph-mon[301134]: pgmap v730: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 70 KiB/s wr, 3 op/s
Nov 28 10:15:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v731: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 70 KiB/s wr, 3 op/s
Nov 28 10:15:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:47 np0005538515.localdomain ceph-mon[301134]: pgmap v731: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 70 KiB/s wr, 3 op/s
Nov 28 10:15:47 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:47.671 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v732: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 95 KiB/s wr, 4 op/s
Nov 28 10:15:49 np0005538515.localdomain ceph-mon[301134]: pgmap v732: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 95 KiB/s wr, 4 op/s
Nov 28 10:15:49 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:49.956 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v733: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 49 KiB/s wr, 3 op/s
Nov 28 10:15:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:50.489 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:15:50.859 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:15:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:15:50.859 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:15:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:15:50.859 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:15:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:15:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:15:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:15:50 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:15:51 np0005538515.localdomain podman[325379]: 2025-11-28 10:15:51.007018372 +0000 UTC m=+0.100496637 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:15:51 np0005538515.localdomain podman[325379]: 2025-11-28 10:15:51.039472472 +0000 UTC m=+0.132950747 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 10:15:51 np0005538515.localdomain systemd[1]: tmp-crun.sgH9DI.mount: Deactivated successfully.
Nov 28 10:15:51 np0005538515.localdomain podman[325378]: 2025-11-28 10:15:51.054511874 +0000 UTC m=+0.151326641 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 10:15:51 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:15:51 np0005538515.localdomain podman[325380]: 2025-11-28 10:15:51.146281641 +0000 UTC m=+0.233813252 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:15:51 np0005538515.localdomain podman[325380]: 2025-11-28 10:15:51.152449561 +0000 UTC m=+0.239981182 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:15:51 np0005538515.localdomain podman[325378]: 2025-11-28 10:15:51.1615082 +0000 UTC m=+0.258323007 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:15:51 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:15:51 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:15:51 np0005538515.localdomain podman[325377]: 2025-11-28 10:15:51.201354648 +0000 UTC m=+0.299098173 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:15:51 np0005538515.localdomain podman[325377]: 2025-11-28 10:15:51.219536997 +0000 UTC m=+0.317280492 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:15:51 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:15:51 np0005538515.localdomain ceph-mon[301134]: pgmap v733: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 49 KiB/s wr, 3 op/s
Nov 28 10:15:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v734: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 56 KiB/s wr, 3 op/s
Nov 28 10:15:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:52.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:52.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:52 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:52.700 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:53 np0005538515.localdomain ceph-mon[301134]: pgmap v734: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 56 KiB/s wr, 3 op/s
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta.tmp'
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta.tmp' to config b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta'
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "format": "json"}]: dispatch
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:15:53 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:15:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:15:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "format": "json"}]: dispatch
Nov 28 10:15:54 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:15:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v735: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 1 op/s
Nov 28 10:15:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:15:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:54.998 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:55 np0005538515.localdomain podman[325461]: 2025-11-28 10:15:55.006794141 +0000 UTC m=+0.112674021 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:15:55 np0005538515.localdomain ceph-mon[301134]: pgmap v735: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 1 op/s
Nov 28 10:15:55 np0005538515.localdomain podman[325461]: 2025-11-28 10:15:55.044399779 +0000 UTC m=+0.150279639 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:15:55 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:15:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:55.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:55.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:15:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:55.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v736: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 1 op/s
Nov 28 10:15:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f", "format": "json"}]: dispatch
Nov 28 10:15:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:489a50ca-e11a-4c64-8d66-724b9734ba9f, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:15:56 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:489a50ca-e11a-4c64-8d66-724b9734ba9f, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:15:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:56.249 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:56.249 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:56.250 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:15:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:56.250 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:15:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:56.266 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:15:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:56.266 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:15:57 np0005538515.localdomain ceph-mon[301134]: pgmap v736: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 32 KiB/s wr, 1 op/s
Nov 28 10:15:57 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f", "format": "json"}]: dispatch
Nov 28 10:15:57 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2526196820' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:57.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:15:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:15:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:15:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:15:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:15:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:57.702 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:15:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:15:57 np0005538515.localdomain podman[325485]: 2025-11-28 10:15:57.966341873 +0000 UTC m=+0.074845167 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 10:15:57 np0005538515.localdomain podman[325485]: 2025-11-28 10:15:57.977199956 +0000 UTC m=+0.085703240 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 10:15:57 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:15:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v737: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 50 KiB/s wr, 2 op/s
Nov 28 10:15:58 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1329289336' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:15:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:15:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:15:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:15:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:15:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19253 "" "Go-http-client/1.1"
Nov 28 10:15:59 np0005538515.localdomain ceph-mon[301134]: pgmap v737: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 50 KiB/s wr, 2 op/s
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.267 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.267 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.268 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.268 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.269 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:15:59 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:15:59 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1387439949' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.728 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.931 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.933 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11440MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.933 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.934 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.994 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:15:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:15:59.995 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.040 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v738: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s wr, 1 op/s
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f_7b47d2ce-aead-4b15-bdda-8d040cf088ac", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:489a50ca-e11a-4c64-8d66-724b9734ba9f_7b47d2ce-aead-4b15-bdda-8d040cf088ac, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta.tmp'
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta.tmp' to config b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta'
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:489a50ca-e11a-4c64-8d66-724b9734ba9f_7b47d2ce-aead-4b15-bdda-8d040cf088ac, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:489a50ca-e11a-4c64-8d66-724b9734ba9f, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta.tmp'
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta.tmp' to config b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb/.meta'
Nov 28 10:16:00 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:489a50ca-e11a-4c64-8d66-724b9734ba9f, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.177 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:16:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1387439949' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.634 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.634 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.634 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.634 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.635 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.635 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:16:00.635 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:16:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:16:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3856985557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.658 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.665 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.686 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.689 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.689 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.690 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.690 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 10:16:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:00.704 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 10:16:01 np0005538515.localdomain ceph-mon[301134]: pgmap v738: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s wr, 1 op/s
Nov 28 10:16:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f_7b47d2ce-aead-4b15-bdda-8d040cf088ac", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:01 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "snap_name": "489a50ca-e11a-4c64-8d66-724b9734ba9f", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3856985557' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v739: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 51 KiB/s wr, 2 op/s
Nov 28 10:16:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:02.731 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:03 np0005538515.localdomain ceph-mon[301134]: pgmap v739: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 51 KiB/s wr, 2 op/s
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "format": "json"}]: dispatch
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:16:03 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:16:03.267+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4650ef04-7360-41ee-b6b9-a66770a7edbb' of type subvolume
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4650ef04-7360-41ee-b6b9-a66770a7edbb' of type subvolume
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4650ef04-7360-41ee-b6b9-a66770a7edbb'' moved to trashcan
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:16:03 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4650ef04-7360-41ee-b6b9-a66770a7edbb, vol_name:cephfs) < ""
Nov 28 10:16:04 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "format": "json"}]: dispatch
Nov 28 10:16:04 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4650ef04-7360-41ee-b6b9-a66770a7edbb", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v740: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s wr, 2 op/s
Nov 28 10:16:05 np0005538515.localdomain ceph-mon[301134]: pgmap v740: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s wr, 2 op/s
Nov 28 10:16:05 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2845142208' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:05.092 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:16:05
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_data', 'backups', 'volumes', '.mgr', 'images', 'vms', 'manila_metadata']
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:16:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v741: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s wr, 2 op/s
Nov 28 10:16:06 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/160190658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 8.17891541038526e-07 of space, bias 1.0, pg target 0.00016276041666666666 quantized to 32 (current 32)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002881977160106086 of space, bias 4.0, pg target 2.294053819444444 quantized to 16 (current 16)
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:16:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e291 e291: 6 total, 6 up, 6 in
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:16:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:07 np0005538515.localdomain ceph-mon[301134]: pgmap v741: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s wr, 2 op/s
Nov 28 10:16:07 np0005538515.localdomain ceph-mon[301134]: osdmap e291: 6 total, 6 up, 6 in
Nov 28 10:16:07 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:16:07.451 158530 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 10:16:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:07.451 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:07 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:16:07.452 158530 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 10:16:07 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:07.776 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:07 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:16:07 np0005538515.localdomain podman[325548]: 2025-11-28 10:16:07.978524539 +0000 UTC m=+0.081988597 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Nov 28 10:16:07 np0005538515.localdomain podman[325548]: 2025-11-28 10:16:07.99544189 +0000 UTC m=+0.098905968 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 28 10:16:08 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:16:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v743: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 63 KiB/s wr, 3 op/s
Nov 28 10:16:09 np0005538515.localdomain ceph-mon[301134]: pgmap v743: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 63 KiB/s wr, 3 op/s
Nov 28 10:16:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v744: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 63 KiB/s wr, 3 op/s
Nov 28 10:16:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:10.142 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:11 np0005538515.localdomain ceph-mon[301134]: pgmap v744: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 63 KiB/s wr, 3 op/s
Nov 28 10:16:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v745: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:12 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:12.809 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:13 np0005538515.localdomain ceph-mon[301134]: pgmap v745: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta.tmp'
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta.tmp' to config b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta'
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "format": "json"}]: dispatch
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:13 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Nov 28 10:16:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3621361723' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:16:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3621361723' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:16:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "format": "json"}]: dispatch
Nov 28 10:16:14 np0005538515.localdomain ceph-mon[301134]: from='client.15651 172.18.0.34:0/597982567' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 10:16:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v746: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:15 np0005538515.localdomain ceph-mon[301134]: pgmap v746: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:15.177 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:15.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:15.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 10:16:15 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:16:15.455 158530 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=62c03cad-89c1-4fd7-973b-8f2a608c71f1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 10:16:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v747: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.165795) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976165872, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2496, "num_deletes": 251, "total_data_size": 3083912, "memory_usage": 3136896, "flush_reason": "Manual Compaction"}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976178596, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1989878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33070, "largest_seqno": 35561, "table_properties": {"data_size": 1980900, "index_size": 5423, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 21281, "raw_average_key_size": 21, "raw_value_size": 1961837, "raw_average_value_size": 1971, "num_data_blocks": 233, "num_entries": 995, "num_filter_entries": 995, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324812, "oldest_key_time": 1764324812, "file_creation_time": 1764324976, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 12860 microseconds, and 5386 cpu microseconds.
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.178651) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1989878 bytes OK
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.178678) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.183087) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.183119) EVENT_LOG_v1 {"time_micros": 1764324976183102, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.183137) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 3072621, prev total WAL file size 3072621, number of live WAL files 2.
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.183999) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(1943KB)], [51(18MB)]
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976184112, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 21596874, "oldest_snapshot_seqno": -1}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 14829 keys, 20052149 bytes, temperature: kUnknown
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976315648, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 20052149, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19964381, "index_size": 49587, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37125, "raw_key_size": 394594, "raw_average_key_size": 26, "raw_value_size": 19710102, "raw_average_value_size": 1329, "num_data_blocks": 1867, "num_entries": 14829, "num_filter_entries": 14829, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764324976, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.315956) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 20052149 bytes
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.317755) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.1 rd, 152.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 18.7 +0.0 blob) out(19.1 +0.0 blob), read-write-amplify(20.9) write-amplify(10.1) OK, records in: 15362, records dropped: 533 output_compression: NoCompression
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.317784) EVENT_LOG_v1 {"time_micros": 1764324976317772, "job": 30, "event": "compaction_finished", "compaction_time_micros": 131624, "compaction_time_cpu_micros": 52825, "output_level": 6, "num_output_files": 1, "total_output_size": 20052149, "num_input_records": 15362, "num_output_records": 14829, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976318250, "job": 30, "event": "table_file_deletion", "file_number": 53}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976321200, "job": 30, "event": "table_file_deletion", "file_number": 51}
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.183921) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.321274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.321280) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.321284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.321287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:16:16.321291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:16:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76", "format": "json"}]: dispatch
Nov 28 10:16:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:f728da4c-905d-477e-bad7-75aaf837cd76, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:16 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:f728da4c-905d-477e-bad7-75aaf837cd76, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:17 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e292 e292: 6 total, 6 up, 6 in
Nov 28 10:16:17 np0005538515.localdomain ceph-mon[301134]: pgmap v747: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 49 KiB/s wr, 2 op/s
Nov 28 10:16:17 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76", "format": "json"}]: dispatch
Nov 28 10:16:17 np0005538515.localdomain ceph-mon[301134]: osdmap e292: 6 total, 6 up, 6 in
Nov 28 10:16:17 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:17.844 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v749: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s wr, 1 op/s
Nov 28 10:16:19 np0005538515.localdomain ceph-mon[301134]: pgmap v749: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s wr, 1 op/s
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v750: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s wr, 1 op/s
Nov 28 10:16:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:20.213 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76_f56b26cb-b1ec-4894-975d-4a0ccc289bb1", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f728da4c-905d-477e-bad7-75aaf837cd76_f56b26cb-b1ec-4894-975d-4a0ccc289bb1, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta.tmp'
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta.tmp' to config b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta'
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f728da4c-905d-477e-bad7-75aaf837cd76_f56b26cb-b1ec-4894-975d-4a0ccc289bb1, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f728da4c-905d-477e-bad7-75aaf837cd76, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta.tmp'
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta.tmp' to config b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95/.meta'
Nov 28 10:16:20 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:f728da4c-905d-477e-bad7-75aaf837cd76, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:20.801 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:21 np0005538515.localdomain ceph-mon[301134]: pgmap v750: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s wr, 1 op/s
Nov 28 10:16:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76_f56b26cb-b1ec-4894-975d-4a0ccc289bb1", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:21 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "snap_name": "f728da4c-905d-477e-bad7-75aaf837cd76", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:16:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:16:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:16:21 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:16:22 np0005538515.localdomain podman[325575]: 2025-11-28 10:16:21.962316727 +0000 UTC m=+0.055734788 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 10:16:22 np0005538515.localdomain podman[325568]: 2025-11-28 10:16:22.043988812 +0000 UTC m=+0.147645858 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 10:16:22 np0005538515.localdomain podman[325569]: 2025-11-28 10:16:21.997276284 +0000 UTC m=+0.091351085 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:16:22 np0005538515.localdomain podman[325576]: 2025-11-28 10:16:22.054668361 +0000 UTC m=+0.140627152 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:16:22 np0005538515.localdomain podman[325568]: 2025-11-28 10:16:22.055390884 +0000 UTC m=+0.159047940 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 10:16:22 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:16:22 np0005538515.localdomain podman[325569]: 2025-11-28 10:16:22.079540127 +0000 UTC m=+0.173614928 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 10:16:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v751: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:22 np0005538515.localdomain podman[325575]: 2025-11-28 10:16:22.09521784 +0000 UTC m=+0.188635981 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:16:22 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:16:22 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:16:22 np0005538515.localdomain podman[325576]: 2025-11-28 10:16:22.136598485 +0000 UTC m=+0.222557246 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:16:22 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:16:22 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:22.880 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:22 np0005538515.localdomain systemd[1]: tmp-crun.xqbxYO.mount: Deactivated successfully.
Nov 28 10:16:23 np0005538515.localdomain ceph-mon[301134]: pgmap v751: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "format": "json"}]: dispatch
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Nov 28 10:16:23 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:16:23.794+0000 7fcc87448640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '2507d4ac-39eb-44f4-bc88-d8388bf61f95' of type subvolume
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '2507d4ac-39eb-44f4-bc88-d8388bf61f95' of type subvolume
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/2507d4ac-39eb-44f4-bc88-d8388bf61f95'' moved to trashcan
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 10:16:23 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:2507d4ac-39eb-44f4-bc88-d8388bf61f95, vol_name:cephfs) < ""
Nov 28 10:16:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v752: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "format": "json"}]: dispatch
Nov 28 10:16:25 np0005538515.localdomain ceph-mon[301134]: from='client.15651 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2507d4ac-39eb-44f4-bc88-d8388bf61f95", "force": true, "format": "json"}]: dispatch
Nov 28 10:16:25 np0005538515.localdomain ceph-mon[301134]: pgmap v752: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:25.250 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:16:25 np0005538515.localdomain podman[325650]: 2025-11-28 10:16:25.974388215 +0000 UTC m=+0.080340365 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:16:25 np0005538515.localdomain podman[325650]: 2025-11-28 10:16:25.987420867 +0000 UTC m=+0.093373067 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:16:26 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:16:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v753: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e293 e293: 6 total, 6 up, 6 in
Nov 28 10:16:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:16:26 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 4735 writes, 35K keys, 4735 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
                                                           Cumulative WAL: 4735 writes, 4735 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2458 writes, 13K keys, 2458 commit groups, 1.0 writes per commit group, ingest: 17.89 MB, 0.03 MB/s
                                                           Interval WAL: 2458 writes, 2458 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    117.0      0.32              0.10        15    0.022       0      0       0.0       0.0
                                                             L6      1/0   19.12 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   6.5    149.5    138.8      1.77              0.66        14    0.127    188K   7204       0.0       0.0
                                                            Sum      1/0   19.12 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   7.5    126.3    135.4      2.10              0.76        29    0.072    188K   7204       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0  12.1    134.8    137.7      1.14              0.42        16    0.071    113K   4315       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    149.5    138.8      1.77              0.66        14    0.127    188K   7204       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    117.8      0.32              0.10        14    0.023       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.037, interval 0.013
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.28 GB write, 0.24 MB/s write, 0.26 GB read, 0.22 MB/s read, 2.1 seconds
                                                           Interval compaction: 0.15 GB write, 0.26 MB/s write, 0.15 GB read, 0.26 MB/s read, 1.1 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x561ae0707350#2 capacity: 304.00 MB usage: 21.32 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000251 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1297,20.07 MB,6.6025%) FilterBlock(29,556.11 KB,0.178643%) IndexBlock(29,726.11 KB,0.233254%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Nov 28 10:16:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:27 np0005538515.localdomain ceph-mon[301134]: pgmap v753: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s wr, 2 op/s
Nov 28 10:16:27 np0005538515.localdomain ceph-mon[301134]: osdmap e293: 6 total, 6 up, 6 in
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:16:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:16:27 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:27.882 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v755: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 61 KiB/s wr, 3 op/s
Nov 28 10:16:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:16:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:16:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:16:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:16:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:16:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:16:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19252 "" "Go-http-client/1.1"
Nov 28 10:16:29 np0005538515.localdomain systemd[1]: tmp-crun.7dJ4bM.mount: Deactivated successfully.
Nov 28 10:16:29 np0005538515.localdomain podman[325673]: 2025-11-28 10:16:29.047305058 +0000 UTC m=+0.151133376 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 10:16:29 np0005538515.localdomain podman[325673]: 2025-11-28 10:16:29.060513765 +0000 UTC m=+0.164342063 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:16:29 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:16:29 np0005538515.localdomain ceph-mon[301134]: pgmap v755: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 61 KiB/s wr, 3 op/s
Nov 28 10:16:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v756: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 61 KiB/s wr, 3 op/s
Nov 28 10:16:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:30.255 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:31 np0005538515.localdomain ceph-mon[301134]: pgmap v756: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 61 KiB/s wr, 3 op/s
Nov 28 10:16:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v757: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:32 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:32.914 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:33 np0005538515.localdomain ceph-mon[301134]: pgmap v757: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v758: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:35 np0005538515.localdomain ceph-mon[301134]: pgmap v758: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:35.277 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:16:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:16:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:16:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:16:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:16:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:16:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v759: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 e294: 6 total, 6 up, 6 in
Nov 28 10:16:37 np0005538515.localdomain ceph-mon[301134]: pgmap v759: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 57 KiB/s wr, 3 op/s
Nov 28 10:16:37 np0005538515.localdomain ceph-mon[301134]: osdmap e294: 6 total, 6 up, 6 in
Nov 28 10:16:37 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:37.950 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v761: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 1 op/s
Nov 28 10:16:38 np0005538515.localdomain sudo[325692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:16:38 np0005538515.localdomain sudo[325692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:16:38 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:16:38 np0005538515.localdomain sudo[325692]: pam_unix(sudo:session): session closed for user root
Nov 28 10:16:38 np0005538515.localdomain sudo[325711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:16:38 np0005538515.localdomain sudo[325711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:16:38 np0005538515.localdomain systemd[1]: tmp-crun.Flyulf.mount: Deactivated successfully.
Nov 28 10:16:38 np0005538515.localdomain podman[325710]: 2025-11-28 10:16:38.209931729 +0000 UTC m=+0.083577315 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Nov 28 10:16:38 np0005538515.localdomain podman[325710]: 2025-11-28 10:16:38.225764827 +0000 UTC m=+0.099410433 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 28 10:16:38 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:16:38 np0005538515.localdomain sudo[325711]: pam_unix(sudo:session): session closed for user root
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:16:38 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 263baec8-83d9-46b6-bc6c-1803c8c8b24e (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:16:38 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 263baec8-83d9-46b6-bc6c-1803c8c8b24e (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:16:38 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 263baec8-83d9-46b6-bc6c-1803c8c8b24e (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:16:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:16:39 np0005538515.localdomain sudo[325778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:16:39 np0005538515.localdomain sudo[325778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:16:39 np0005538515.localdomain sudo[325778]: pam_unix(sudo:session): session closed for user root
Nov 28 10:16:39 np0005538515.localdomain ceph-mon[301134]: pgmap v761: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 1 op/s
Nov 28 10:16:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:16:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:16:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:16:39 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:16:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v762: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 1 op/s
Nov 28 10:16:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:40.321 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:41 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:16:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:16:41 np0005538515.localdomain ceph-mon[301134]: pgmap v762: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s wr, 1 op/s
Nov 28 10:16:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:16:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v763: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:42 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:42.994 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:43 np0005538515.localdomain ceph-mon[301134]: pgmap v763: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v764: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:45 np0005538515.localdomain ceph-mon[301134]: pgmap v764: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:45.357 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v765: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:47 np0005538515.localdomain ceph-mon[301134]: pgmap v765: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:48.029 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v766: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:49 np0005538515.localdomain ceph-mon[301134]: pgmap v766: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v767: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:50 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:50.399 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:16:50.860 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:16:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:16:50.861 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:16:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:16:50.861 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:16:51 np0005538515.localdomain ceph-mon[301134]: pgmap v767: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:51.771 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v768: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:16:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:16:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:16:52 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:16:53 np0005538515.localdomain ceph-mon[301134]: pgmap v768: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:53.054 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:53 np0005538515.localdomain podman[325796]: 2025-11-28 10:16:53.059255828 +0000 UTC m=+0.161272009 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 10:16:53 np0005538515.localdomain podman[325798]: 2025-11-28 10:16:53.064440568 +0000 UTC m=+0.154759519 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 10:16:53 np0005538515.localdomain podman[325797]: 2025-11-28 10:16:52.97296731 +0000 UTC m=+0.075302861 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 10:16:53 np0005538515.localdomain podman[325796]: 2025-11-28 10:16:53.075359054 +0000 UTC m=+0.177375165 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:16:53 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:16:53 np0005538515.localdomain podman[325798]: 2025-11-28 10:16:53.100462007 +0000 UTC m=+0.190780958 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:16:53 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:16:53 np0005538515.localdomain podman[325797]: 2025-11-28 10:16:53.156593556 +0000 UTC m=+0.258929177 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 10:16:53 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:16:53 np0005538515.localdomain podman[325809]: 2025-11-28 10:16:53.220190644 +0000 UTC m=+0.307205952 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:16:53 np0005538515.localdomain podman[325809]: 2025-11-28 10:16:53.231685738 +0000 UTC m=+0.318701066 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:16:53 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:16:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:53.263 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v769: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:54 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:54.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:55 np0005538515.localdomain ceph-mon[301134]: pgmap v769: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:55.441 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v770: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:56.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:56 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:16:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:16:56 np0005538515.localdomain podman[325879]: 2025-11-28 10:16:56.981625853 +0000 UTC m=+0.084479793 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:16:56 np0005538515.localdomain podman[325879]: 2025-11-28 10:16:56.995556672 +0000 UTC m=+0.098410622 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 10:16:57 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:16:57 np0005538515.localdomain ceph-mon[301134]: pgmap v770: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:57.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:57.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:57.239 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:16:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:16:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:16:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:16:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:16:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:16:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:58.064 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:16:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v771: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:58.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:58.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:16:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:58.240 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:16:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:58.258 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:16:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:16:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:16:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:16:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:16:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:16:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19254 "" "Go-http-client/1.1"
Nov 28 10:16:59 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:16:59.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:16:59 np0005538515.localdomain ceph-mon[301134]: pgmap v771: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:16:59 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/775532288' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:59 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1769094031' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:16:59 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:16:59 np0005538515.localdomain podman[325903]: 2025-11-28 10:16:59.97221492 +0000 UTC m=+0.072568445 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 28 10:16:59 np0005538515.localdomain podman[325903]: 2025-11-28 10:16:59.984497869 +0000 UTC m=+0.084851444 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 28 10:16:59 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:17:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v772: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:00 np0005538515.localdomain sshd[325922]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:00 np0005538515.localdomain sshd[325922]: Accepted publickey for zuul from 38.102.83.114 port 59492 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:00 np0005538515.localdomain systemd-logind[763]: New session 75 of user zuul.
Nov 28 10:17:00 np0005538515.localdomain systemd[1]: Started Session 75 of User zuul.
Nov 28 10:17:00 np0005538515.localdomain sshd[325922]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.276 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.276 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.276 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.277 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.277 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:17:00 np0005538515.localdomain sudo[325943]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egryihwmghhfnjjmtqilnwgasgxfcone ; /usr/bin/python3
Nov 28 10:17:00 np0005538515.localdomain sudo[325943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.483 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:00 np0005538515.localdomain python3[325946]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=fa163ef9-e89a-49a1-b30e-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:17:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:17:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/847218026' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.705 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.910 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.911 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11392MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.912 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.912 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:17:00 np0005538515.localdomain sudo[325943]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.991 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:17:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:00.992 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.147 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing inventories for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.219 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating ProviderTree inventory for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.219 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Updating inventory in ProviderTree for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.235 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing aggregate associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.259 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Refreshing trait associations for resource provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0, traits: COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AESNI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 10:17:01 np0005538515.localdomain ceph-mon[301134]: pgmap v772: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/847218026' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.291 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:17:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:17:01 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3631755571' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.776 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.783 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.799 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.802 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:17:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:01.802 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:17:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v773: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3631755571' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:03.109 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:03 np0005538515.localdomain ceph-mon[301134]: pgmap v773: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v774: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:05 np0005538515.localdomain ceph-mon[301134]: pgmap v774: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:05 np0005538515.localdomain sshd[325922]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:05 np0005538515.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Nov 28 10:17:05 np0005538515.localdomain systemd-logind[763]: Session 75 logged out. Waiting for processes to exit.
Nov 28 10:17:05 np0005538515.localdomain systemd-logind[763]: Removed session 75.
Nov 28 10:17:05 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:05.535 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:17:05
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['manila_data', 'backups', 'volumes', 'vms', '.mgr', 'images', 'manila_metadata']
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:17:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v775: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002986939907872697 of space, bias 4.0, pg target 2.3776041666666665 quantized to 16 (current 16)
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:17:06 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/489346964' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:07 np0005538515.localdomain ceph-mon[301134]: pgmap v775: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:07 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3700869485' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:17:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v776: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:08.147 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:08 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:17:08 np0005538515.localdomain podman[325991]: 2025-11-28 10:17:08.977856306 +0000 UTC m=+0.084082541 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Nov 28 10:17:08 np0005538515.localdomain podman[325991]: 2025-11-28 10:17:08.994502749 +0000 UTC m=+0.100728994 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 28 10:17:09 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:17:09 np0005538515.localdomain ceph-mon[301134]: pgmap v776: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v777: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:10 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:10.569 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:11 np0005538515.localdomain ceph-mon[301134]: pgmap v777: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v778: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:13 np0005538515.localdomain ceph-mon[301134]: pgmap v778: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:13.188 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/218519506' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:17:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/218519506' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:17:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v779: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:15 np0005538515.localdomain ceph-mon[301134]: pgmap v779: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:15 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:15.611 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v780: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:17 np0005538515.localdomain ceph-mon[301134]: pgmap v780: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v781: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:18.224 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:19 np0005538515.localdomain ceph-mon[301134]: pgmap v781: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v782: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:20 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:20.656 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:21 np0005538515.localdomain ceph-mon[301134]: pgmap v782: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v783: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:23 np0005538515.localdomain ceph-mon[301134]: pgmap v783: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:23.269 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:17:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:17:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:17:23 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:17:24 np0005538515.localdomain systemd[1]: tmp-crun.FeDlrK.mount: Deactivated successfully.
Nov 28 10:17:24 np0005538515.localdomain podman[326013]: 2025-11-28 10:17:24.003348968 +0000 UTC m=+0.093440260 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:17:24 np0005538515.localdomain podman[326012]: 2025-11-28 10:17:23.971017271 +0000 UTC m=+0.067460778 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 10:17:24 np0005538515.localdomain podman[326013]: 2025-11-28 10:17:24.03885624 +0000 UTC m=+0.128947572 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:17:24 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:17:24 np0005538515.localdomain podman[326012]: 2025-11-28 10:17:24.053557094 +0000 UTC m=+0.150000591 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:17:24 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:17:24 np0005538515.localdomain podman[326011]: 2025-11-28 10:17:24.039952105 +0000 UTC m=+0.136794425 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 10:17:24 np0005538515.localdomain podman[326010]: 2025-11-28 10:17:24.104462832 +0000 UTC m=+0.200666312 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm)
Nov 28 10:17:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v784: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:24 np0005538515.localdomain podman[326011]: 2025-11-28 10:17:24.127614434 +0000 UTC m=+0.224456714 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 10:17:24 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:17:24 np0005538515.localdomain podman[326010]: 2025-11-28 10:17:24.144450403 +0000 UTC m=+0.240653893 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 10:17:24 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:17:24 np0005538515.localdomain sshd[326093]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:24 np0005538515.localdomain sshd[326093]: Accepted publickey for zuul from 38.102.83.114 port 51620 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:24 np0005538515.localdomain systemd-logind[763]: New session 76 of user zuul.
Nov 28 10:17:24 np0005538515.localdomain systemd[1]: Started Session 76 of User zuul.
Nov 28 10:17:24 np0005538515.localdomain sshd[326093]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:24 np0005538515.localdomain sudo[326097]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Nov 28 10:17:24 np0005538515.localdomain sudo[326097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:25 np0005538515.localdomain ceph-mon[301134]: pgmap v784: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:25 np0005538515.localdomain sudo[326097]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:25 np0005538515.localdomain sshd[326096]: Received disconnect from 38.102.83.114 port 51620:11: disconnected by user
Nov 28 10:17:25 np0005538515.localdomain sshd[326096]: Disconnected from user zuul 38.102.83.114 port 51620
Nov 28 10:17:25 np0005538515.localdomain sshd[326093]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:25 np0005538515.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Nov 28 10:17:25 np0005538515.localdomain systemd-logind[763]: Session 76 logged out. Waiting for processes to exit.
Nov 28 10:17:25 np0005538515.localdomain systemd-logind[763]: Removed session 76.
Nov 28 10:17:25 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:25.700 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:25 np0005538515.localdomain sshd[326115]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:26 np0005538515.localdomain sshd[326115]: Accepted publickey for zuul from 38.102.83.114 port 46734 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:26 np0005538515.localdomain systemd-logind[763]: New session 77 of user zuul.
Nov 28 10:17:26 np0005538515.localdomain systemd[1]: Started Session 77 of User zuul.
Nov 28 10:17:26 np0005538515.localdomain sshd[326115]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v785: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:26 np0005538515.localdomain sudo[326119]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Nov 28 10:17:26 np0005538515.localdomain sudo[326119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:26 np0005538515.localdomain sudo[326119]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:26 np0005538515.localdomain sshd[326118]: Received disconnect from 38.102.83.114 port 46734:11: disconnected by user
Nov 28 10:17:26 np0005538515.localdomain sshd[326118]: Disconnected from user zuul 38.102.83.114 port 46734
Nov 28 10:17:26 np0005538515.localdomain sshd[326115]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:26 np0005538515.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Nov 28 10:17:26 np0005538515.localdomain systemd-logind[763]: Session 77 logged out. Waiting for processes to exit.
Nov 28 10:17:26 np0005538515.localdomain systemd-logind[763]: Removed session 77.
Nov 28 10:17:26 np0005538515.localdomain sshd[326137]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:26 np0005538515.localdomain sshd[326137]: Accepted publickey for zuul from 38.102.83.114 port 46744 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:26 np0005538515.localdomain systemd-logind[763]: New session 78 of user zuul.
Nov 28 10:17:26 np0005538515.localdomain systemd[1]: Started Session 78 of User zuul.
Nov 28 10:17:26 np0005538515.localdomain sshd[326137]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:26 np0005538515.localdomain sudo[326141]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Nov 28 10:17:26 np0005538515.localdomain sudo[326141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:26 np0005538515.localdomain sudo[326141]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:26 np0005538515.localdomain sshd[326140]: Received disconnect from 38.102.83.114 port 46744:11: disconnected by user
Nov 28 10:17:26 np0005538515.localdomain sshd[326140]: Disconnected from user zuul 38.102.83.114 port 46744
Nov 28 10:17:26 np0005538515.localdomain sshd[326137]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:26 np0005538515.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Nov 28 10:17:26 np0005538515.localdomain systemd-logind[763]: Session 78 logged out. Waiting for processes to exit.
Nov 28 10:17:26 np0005538515.localdomain systemd-logind[763]: Removed session 78.
Nov 28 10:17:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.064137) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047064190, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1074, "num_deletes": 257, "total_data_size": 1187505, "memory_usage": 1214128, "flush_reason": "Manual Compaction"}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047074779, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 774824, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35566, "largest_seqno": 36635, "table_properties": {"data_size": 770485, "index_size": 2002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10035, "raw_average_key_size": 19, "raw_value_size": 761389, "raw_average_value_size": 1487, "num_data_blocks": 89, "num_entries": 512, "num_filter_entries": 512, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324976, "oldest_key_time": 1764324976, "file_creation_time": 1764325047, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 10697 microseconds, and 3903 cpu microseconds.
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.074837) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 774824 bytes OK
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.074865) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.076891) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.076912) EVENT_LOG_v1 {"time_micros": 1764325047076906, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.076932) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 1182166, prev total WAL file size 1182490, number of live WAL files 2.
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.077582) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353238' seq:72057594037927935, type:22 .. '6C6F676D0034373739' seq:0, type:0; will stop at (end)
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(756KB)], [54(19MB)]
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047077656, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20826973, "oldest_snapshot_seqno": -1}
Nov 28 10:17:27 np0005538515.localdomain sshd[326159]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:27 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:17:27 np0005538515.localdomain sshd[326159]: Accepted publickey for zuul from 38.102.83.114 port 46756 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:27 np0005538515.localdomain systemd-logind[763]: New session 79 of user zuul.
Nov 28 10:17:27 np0005538515.localdomain systemd[1]: Started Session 79 of User zuul.
Nov 28 10:17:27 np0005538515.localdomain sshd[326159]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 14807 keys, 20694593 bytes, temperature: kUnknown
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047241377, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 20694593, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20605477, "index_size": 50999, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37061, "raw_key_size": 395186, "raw_average_key_size": 26, "raw_value_size": 20350004, "raw_average_value_size": 1374, "num_data_blocks": 1924, "num_entries": 14807, "num_filter_entries": 14807, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764325047, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.242120) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 20694593 bytes
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.246186) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.9 rd, 126.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 19.1 +0.0 blob) out(19.7 +0.0 blob), read-write-amplify(53.6) write-amplify(26.7) OK, records in: 15341, records dropped: 534 output_compression: NoCompression
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.246223) EVENT_LOG_v1 {"time_micros": 1764325047246207, "job": 32, "event": "compaction_finished", "compaction_time_micros": 164154, "compaction_time_cpu_micros": 55967, "output_level": 6, "num_output_files": 1, "total_output_size": 20694593, "num_input_records": 15341, "num_output_records": 14807, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047246619, "job": 32, "event": "table_file_deletion", "file_number": 56}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325047249528, "job": 32, "event": "table_file_deletion", "file_number": 54}
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: pgmap v785: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.077498) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.249808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.249816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.249819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.249822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:17:27.249826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:17:27 np0005538515.localdomain podman[326161]: 2025-11-28 10:17:27.282274705 +0000 UTC m=+0.103786957 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 10:17:27 np0005538515.localdomain podman[326161]: 2025-11-28 10:17:27.294378809 +0000 UTC m=+0.115891011 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:17:27 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:17:27 np0005538515.localdomain sudo[326186]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Nov 28 10:17:27 np0005538515.localdomain sudo[326186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:27 np0005538515.localdomain sudo[326186]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:27 np0005538515.localdomain sshd[326175]: Received disconnect from 38.102.83.114 port 46756:11: disconnected by user
Nov 28 10:17:27 np0005538515.localdomain sshd[326175]: Disconnected from user zuul 38.102.83.114 port 46756
Nov 28 10:17:27 np0005538515.localdomain sshd[326159]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:27 np0005538515.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Nov 28 10:17:27 np0005538515.localdomain systemd-logind[763]: Session 79 logged out. Waiting for processes to exit.
Nov 28 10:17:27 np0005538515.localdomain systemd-logind[763]: Removed session 79.
Nov 28 10:17:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:17:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:17:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:17:27 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:17:27 np0005538515.localdomain sshd[326204]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:27 np0005538515.localdomain sshd[326204]: Accepted publickey for zuul from 38.102.83.114 port 46758 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:27 np0005538515.localdomain systemd-logind[763]: New session 80 of user zuul.
Nov 28 10:17:27 np0005538515.localdomain systemd[1]: Started Session 80 of User zuul.
Nov 28 10:17:27 np0005538515.localdomain sshd[326204]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:27 np0005538515.localdomain sudo[326208]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Nov 28 10:17:27 np0005538515.localdomain sudo[326208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:27 np0005538515.localdomain sudo[326208]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:27 np0005538515.localdomain sshd[326207]: Received disconnect from 38.102.83.114 port 46758:11: disconnected by user
Nov 28 10:17:27 np0005538515.localdomain sshd[326207]: Disconnected from user zuul 38.102.83.114 port 46758
Nov 28 10:17:27 np0005538515.localdomain sshd[326204]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:27 np0005538515.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Nov 28 10:17:27 np0005538515.localdomain systemd-logind[763]: Session 80 logged out. Waiting for processes to exit.
Nov 28 10:17:27 np0005538515.localdomain systemd-logind[763]: Removed session 80.
Nov 28 10:17:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v786: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:28 np0005538515.localdomain sshd[326226]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:28.312 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:28 np0005538515.localdomain sshd[326226]: Accepted publickey for zuul from 38.102.83.114 port 46766 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:28 np0005538515.localdomain systemd-logind[763]: New session 81 of user zuul.
Nov 28 10:17:28 np0005538515.localdomain systemd[1]: Started Session 81 of User zuul.
Nov 28 10:17:28 np0005538515.localdomain sshd[326226]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:28 np0005538515.localdomain sudo[326230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Nov 28 10:17:28 np0005538515.localdomain sudo[326230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:28 np0005538515.localdomain sudo[326230]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:28 np0005538515.localdomain sshd[326229]: Received disconnect from 38.102.83.114 port 46766:11: disconnected by user
Nov 28 10:17:28 np0005538515.localdomain sshd[326229]: Disconnected from user zuul 38.102.83.114 port 46766
Nov 28 10:17:28 np0005538515.localdomain sshd[326226]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:28 np0005538515.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Nov 28 10:17:28 np0005538515.localdomain systemd-logind[763]: Session 81 logged out. Waiting for processes to exit.
Nov 28 10:17:28 np0005538515.localdomain systemd-logind[763]: Removed session 81.
Nov 28 10:17:28 np0005538515.localdomain sshd[326248]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:17:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:17:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:17:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:17:28 np0005538515.localdomain sshd[326248]: Accepted publickey for zuul from 38.102.83.114 port 46780 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:17:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19248 "" "Go-http-client/1.1"
Nov 28 10:17:28 np0005538515.localdomain systemd-logind[763]: New session 82 of user zuul.
Nov 28 10:17:28 np0005538515.localdomain systemd[1]: Started Session 82 of User zuul.
Nov 28 10:17:29 np0005538515.localdomain sshd[326248]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:29 np0005538515.localdomain sudo[326252]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Nov 28 10:17:29 np0005538515.localdomain sudo[326252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:29 np0005538515.localdomain sudo[326252]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:29 np0005538515.localdomain sshd[326251]: Received disconnect from 38.102.83.114 port 46780:11: disconnected by user
Nov 28 10:17:29 np0005538515.localdomain sshd[326251]: Disconnected from user zuul 38.102.83.114 port 46780
Nov 28 10:17:29 np0005538515.localdomain sshd[326248]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:29 np0005538515.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Nov 28 10:17:29 np0005538515.localdomain systemd-logind[763]: Session 82 logged out. Waiting for processes to exit.
Nov 28 10:17:29 np0005538515.localdomain systemd-logind[763]: Removed session 82.
Nov 28 10:17:29 np0005538515.localdomain ceph-mon[301134]: pgmap v786: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:29 np0005538515.localdomain sshd[326270]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:29 np0005538515.localdomain sshd[326270]: Accepted publickey for zuul from 38.102.83.114 port 46788 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:29 np0005538515.localdomain systemd-logind[763]: New session 83 of user zuul.
Nov 28 10:17:29 np0005538515.localdomain systemd[1]: Started Session 83 of User zuul.
Nov 28 10:17:29 np0005538515.localdomain sshd[326270]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:29 np0005538515.localdomain sudo[326274]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Nov 28 10:17:29 np0005538515.localdomain sudo[326274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:29 np0005538515.localdomain sudo[326274]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:29 np0005538515.localdomain sshd[326273]: Received disconnect from 38.102.83.114 port 46788:11: disconnected by user
Nov 28 10:17:29 np0005538515.localdomain sshd[326273]: Disconnected from user zuul 38.102.83.114 port 46788
Nov 28 10:17:29 np0005538515.localdomain sshd[326270]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:29 np0005538515.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Nov 28 10:17:29 np0005538515.localdomain systemd-logind[763]: Session 83 logged out. Waiting for processes to exit.
Nov 28 10:17:29 np0005538515.localdomain systemd-logind[763]: Removed session 83.
Nov 28 10:17:30 np0005538515.localdomain sshd[326292]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:17:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v787: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:30 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:17:30 np0005538515.localdomain sshd[326292]: Accepted publickey for zuul from 38.102.83.114 port 46792 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:17:30 np0005538515.localdomain systemd-logind[763]: New session 84 of user zuul.
Nov 28 10:17:30 np0005538515.localdomain systemd[1]: Started Session 84 of User zuul.
Nov 28 10:17:30 np0005538515.localdomain sshd[326292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:17:30 np0005538515.localdomain podman[326294]: 2025-11-28 10:17:30.272732159 +0000 UTC m=+0.092429188 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 10:17:30 np0005538515.localdomain podman[326294]: 2025-11-28 10:17:30.316593679 +0000 UTC m=+0.136290698 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 10:17:30 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:17:30 np0005538515.localdomain sudo[326315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Nov 28 10:17:30 np0005538515.localdomain sudo[326315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:17:30 np0005538515.localdomain sudo[326315]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:30 np0005538515.localdomain sshd[326306]: Received disconnect from 38.102.83.114 port 46792:11: disconnected by user
Nov 28 10:17:30 np0005538515.localdomain sshd[326306]: Disconnected from user zuul 38.102.83.114 port 46792
Nov 28 10:17:30 np0005538515.localdomain sshd[326292]: pam_unix(sshd:session): session closed for user zuul
Nov 28 10:17:30 np0005538515.localdomain systemd[1]: session-84.scope: Deactivated successfully.
Nov 28 10:17:30 np0005538515.localdomain systemd-logind[763]: Session 84 logged out. Waiting for processes to exit.
Nov 28 10:17:30 np0005538515.localdomain systemd-logind[763]: Removed session 84.
Nov 28 10:17:30 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:30.734 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:31 np0005538515.localdomain ceph-mon[301134]: pgmap v787: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v788: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:33 np0005538515.localdomain ceph-mon[301134]: pgmap v788: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:33.354 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v789: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:35 np0005538515.localdomain ceph-mon[301134]: pgmap v789: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fccb1e1a3a0>)]
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:17:35 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:35.768 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc6baed490>), ('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7fcc6ba63d00>)]
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:17:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:17:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v790: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:37 np0005538515.localdomain ceph-mon[301134]: pgmap v790: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v791: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:38 np0005538515.localdomain ceph-mon[301134]: mgrmap e53: np0005538515.yfkzhl(active, since 19m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 10:17:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:38.398 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:39 np0005538515.localdomain ceph-mon[301134]: pgmap v791: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:39 np0005538515.localdomain sudo[326334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:17:39 np0005538515.localdomain sudo[326334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:17:39 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:17:39 np0005538515.localdomain sudo[326334]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:39 np0005538515.localdomain sudo[326353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:17:39 np0005538515.localdomain sudo[326353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:17:39 np0005538515.localdomain podman[326351]: 2025-11-28 10:17:39.412926868 +0000 UTC m=+0.095083659 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:17:39 np0005538515.localdomain podman[326351]: 2025-11-28 10:17:39.453419555 +0000 UTC m=+0.135576336 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 28 10:17:39 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:17:39 np0005538515.localdomain sudo[326353]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v792: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:17:40 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev e9037134-85b6-48aa-a63c-09ebf5c1aa05 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:17:40 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev e9037134-85b6-48aa-a63c-09ebf5c1aa05 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:17:40 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event e9037134-85b6-48aa-a63c-09ebf5c1aa05 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:17:40 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:17:40 np0005538515.localdomain sudo[326421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:17:40 np0005538515.localdomain sudo[326421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:17:40 np0005538515.localdomain sudo[326421]: pam_unix(sudo:session): session closed for user root
Nov 28 10:17:40 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:40.791 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:41 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:17:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:17:41 np0005538515.localdomain ceph-mon[301134]: pgmap v792: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:41 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:17:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v793: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:43 np0005538515.localdomain ceph-mon[301134]: pgmap v793: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:43.431 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v794: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:45 np0005538515.localdomain ceph-mon[301134]: pgmap v794: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:45 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:45.917 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v795: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:47 np0005538515.localdomain ceph-mon[301134]: pgmap v795: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:48 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v796: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:48 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:48.470 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:49 np0005538515.localdomain ceph-mon[301134]: pgmap v796: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Nov 28 10:17:50 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v797: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 28 10:17:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:17:50.861 158530 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:17:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:17:50.861 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:17:50 np0005538515.localdomain ovn_metadata_agent[158525]: 2025-11-28 10:17:50.862 158530 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:17:51 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:51.004 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:51 np0005538515.localdomain ceph-mon[301134]: pgmap v797: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 28 10:17:51 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:52 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v798: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 28 10:17:53 np0005538515.localdomain ceph-mon[301134]: pgmap v798: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Nov 28 10:17:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:53.503 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:53 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:53.799 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:54 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v799: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:17:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:17:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:17:54 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:17:54 np0005538515.localdomain podman[326439]: 2025-11-28 10:17:54.983280423 +0000 UTC m=+0.087443845 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 10:17:54 np0005538515.localdomain podman[326440]: 2025-11-28 10:17:54.998078479 +0000 UTC m=+0.096894536 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 10:17:55 np0005538515.localdomain podman[326441]: 2025-11-28 10:17:55.03512816 +0000 UTC m=+0.131496371 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 10:17:55 np0005538515.localdomain podman[326439]: 2025-11-28 10:17:55.051553476 +0000 UTC m=+0.155716888 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 10:17:55 np0005538515.localdomain podman[326440]: 2025-11-28 10:17:55.064641229 +0000 UTC m=+0.163457316 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 10:17:55 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:17:55 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:17:55 np0005538515.localdomain podman[326441]: 2025-11-28 10:17:55.118606431 +0000 UTC m=+0.214974582 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 10:17:55 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:17:55 np0005538515.localdomain podman[326447]: 2025-11-28 10:17:55.069471677 +0000 UTC m=+0.160021729 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 10:17:55 np0005538515.localdomain ceph-mon[301134]: pgmap v799: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:55 np0005538515.localdomain podman[326447]: 2025-11-28 10:17:55.205020422 +0000 UTC m=+0.295570454 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 10:17:55 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:17:55 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:55.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:55 np0005538515.localdomain systemd[1]: tmp-crun.tdiesy.mount: Deactivated successfully.
Nov 28 10:17:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:56.052 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:56 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v800: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:56 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:56.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:56 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:17:57 np0005538515.localdomain ceph-mon[301134]: pgmap v800: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:57 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:57.234 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:57 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:57 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:57 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:17:57 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:17:57 np0005538515.localdomain openstack_network_exporter[240973]: 
Nov 28 10:17:57 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:17:57 np0005538515.localdomain podman[326525]: 2025-11-28 10:17:57.973607602 +0000 UTC m=+0.081500371 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 10:17:57 np0005538515.localdomain podman[326525]: 2025-11-28 10:17:57.98651283 +0000 UTC m=+0.094405579 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 10:17:57 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:17:58 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v801: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.237 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.238 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.260 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.260 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.261 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.261 280172 DEBUG nova.compute.manager [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 10:17:58 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:17:58.532 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:17:58 np0005538515.localdomain podman[239012]: time="2025-11-28T10:17:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:17:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:17:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:17:58 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:17:58 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19248 "" "Go-http-client/1.1"
Nov 28 10:17:59 np0005538515.localdomain ceph-mon[301134]: pgmap v801: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:00 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v802: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.238 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.239 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4023789649' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:00 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2940993655' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.258 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.259 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.259 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Auditing locally available compute resources for np0005538515.localdomain (node: np0005538515.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.260 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.629 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain ceilometer_agent_compute[236400]: 2025-11-28 10:18:00.633 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 10:18:00 np0005538515.localdomain sshd[326568]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:18:00 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:18:00 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3704514543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.731 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:18:00 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.938 280172 WARNING nova.virt.libvirt.driver [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.941 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Hypervisor/Node resource view: name=np0005538515.localdomain free_ram=11438MB free_disk=41.83686447143555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.941 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 10:18:00 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:00.942 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 10:18:00 np0005538515.localdomain podman[326572]: 2025-11-28 10:18:00.967216855 +0000 UTC m=+0.070489333 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.000 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.000 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Final resource view: name=np0005538515.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 10:18:01 np0005538515.localdomain podman[326572]: 2025-11-28 10:18:01.009390713 +0000 UTC m=+0.112663221 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 28 10:18:01 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.021 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.097 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:01 np0005538515.localdomain ceph-mon[301134]: pgmap v802: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:01 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3704514543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:01 np0005538515.localdomain sshd[326568]: Received disconnect from 80.94.93.119 port 52378:11:  [preauth]
Nov 28 10:18:01 np0005538515.localdomain sshd[326568]: Disconnected from authenticating user root 80.94.93.119 port 52378 [preauth]
Nov 28 10:18:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 10:18:01 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4099343195' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.463 280172 DEBUG oslo_concurrency.processutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.470 280172 DEBUG nova.compute.provider_tree [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed in ProviderTree for provider: 72fba1ca-0d86-48af-8a3d-510284dfd0e0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.484 280172 DEBUG nova.scheduler.client.report [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Inventory has not changed for provider 72fba1ca-0d86-48af-8a3d-510284dfd0e0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.487 280172 DEBUG nova.compute.resource_tracker [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Compute_service record updated for np0005538515.localdomain:np0005538515.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 10:18:01 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:01.488 280172 DEBUG oslo_concurrency.lockutils [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 10:18:01 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:02 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v803: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:02 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/4099343195' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:02 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:02.488 280172 DEBUG oslo_service.periodic_task [None req-def660ec-de0f-46dd-a9a8-d8f091e83f66 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 10:18:03 np0005538515.localdomain ceph-mon[301134]: pgmap v803: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:03 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:03.571 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:04 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v804: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:05 np0005538515.localdomain ceph-mon[301134]: pgmap v804: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Optimize plan auto_2025-11-28_10:18:05
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] do_upmap
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] pools ['backups', 'manila_metadata', 'manila_data', 'volumes', 'images', 'vms', '.mgr']
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [balancer INFO root] prepared 0/10 changes
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:18:05 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v805: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:06 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:06.143 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 1.0, pg target 0.6656001360971524 quantized to 32 (current 32)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002986939907872697 of space, bias 4.0, pg target 2.3776041666666665 quantized to 16 (current 16)
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mgr[286188]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 10:18:06 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1023311526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:06 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:07 np0005538515.localdomain ceph-mon[301134]: pgmap v805: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:07 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1297328519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 10:18:07 np0005538515.localdomain ceph-mgr[286188]: [devicehealth INFO root] Check health
Nov 28 10:18:08 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v806: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:08 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:08.607 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:09 np0005538515.localdomain ceph-mon[301134]: pgmap v806: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:09 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:18:09 np0005538515.localdomain systemd[1]: tmp-crun.TP5BrE.mount: Deactivated successfully.
Nov 28 10:18:09 np0005538515.localdomain podman[326613]: 2025-11-28 10:18:09.981721184 +0000 UTC m=+0.087356992 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9)
Nov 28 10:18:09 np0005538515.localdomain podman[326613]: 2025-11-28 10:18:09.996602703 +0000 UTC m=+0.102238491 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Nov 28 10:18:10 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:18:10 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v807: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:11 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:11.181 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:11 np0005538515.localdomain ceph-mon[301134]: pgmap v807: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:11 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.111129) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092111186, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 789, "num_deletes": 250, "total_data_size": 1151189, "memory_usage": 1167352, "flush_reason": "Manual Compaction"}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092118329, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 621617, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36640, "largest_seqno": 37424, "table_properties": {"data_size": 618670, "index_size": 866, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8404, "raw_average_key_size": 20, "raw_value_size": 612179, "raw_average_value_size": 1507, "num_data_blocks": 40, "num_entries": 406, "num_filter_entries": 406, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764325047, "oldest_key_time": 1764325047, "file_creation_time": 1764325092, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 7251 microseconds, and 2806 cpu microseconds.
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.118379) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 621617 bytes OK
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.118400) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.121046) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.121071) EVENT_LOG_v1 {"time_micros": 1764325092121058, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.121110) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1147036, prev total WAL file size 1147360, number of live WAL files 2.
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.121781) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323631' seq:72057594037927935, type:22 .. '6D6772737461740034353132' seq:0, type:0; will stop at (end)
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(607KB)], [57(19MB)]
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092121883, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 21316210, "oldest_snapshot_seqno": -1}
Nov 28 10:18:12 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v808: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 14709 keys, 19334760 bytes, temperature: kUnknown
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092278777, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 19334760, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19250580, "index_size": 46319, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36805, "raw_key_size": 393358, "raw_average_key_size": 26, "raw_value_size": 19000923, "raw_average_value_size": 1291, "num_data_blocks": 1729, "num_entries": 14709, "num_filter_entries": 14709, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323786, "oldest_key_time": 0, "file_creation_time": 1764325092, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.279194) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 19334760 bytes
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.281816) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.8 rd, 123.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 19.7 +0.0 blob) out(18.4 +0.0 blob), read-write-amplify(65.4) write-amplify(31.1) OK, records in: 15213, records dropped: 504 output_compression: NoCompression
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.281849) EVENT_LOG_v1 {"time_micros": 1764325092281835, "job": 34, "event": "compaction_finished", "compaction_time_micros": 156998, "compaction_time_cpu_micros": 60066, "output_level": 6, "num_output_files": 1, "total_output_size": 19334760, "num_input_records": 15213, "num_output_records": 14709, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092282139, "job": 34, "event": "table_file_deletion", "file_number": 59}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325092284857, "job": 34, "event": "table_file_deletion", "file_number": 57}
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.121656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.284923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.284929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.284932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.284935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:12 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:12.284939) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 10:18:13 np0005538515.localdomain ceph-mon[301134]: pgmap v808: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:13 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:13.637 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:14 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v809: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3885796551' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 10:18:14 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.32:0/3885796551' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 10:18:15 np0005538515.localdomain ceph-mon[301134]: pgmap v809: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:16 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v810: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:16 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:16.215 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:16 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:17 np0005538515.localdomain ceph-mon[301134]: pgmap v810: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:18 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v811: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:18 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:18.641 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:19 np0005538515.localdomain ceph-mon[301134]: pgmap v811: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:20 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v812: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:20 np0005538515.localdomain sshd[326635]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 10:18:20 np0005538515.localdomain sshd[326635]: Accepted publickey for zuul from 192.168.122.10 port 44912 ssh2: RSA SHA256:3gOhaEk5Hp1Sm2LwNst6cGDJ5O01KvSo8lCo9SBO2II
Nov 28 10:18:20 np0005538515.localdomain systemd-logind[763]: New session 85 of user zuul.
Nov 28 10:18:20 np0005538515.localdomain systemd[1]: Started Session 85 of User zuul.
Nov 28 10:18:20 np0005538515.localdomain sshd[326635]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Nov 28 10:18:20 np0005538515.localdomain sudo[326639]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Nov 28 10:18:20 np0005538515.localdomain sudo[326639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Nov 28 10:18:21 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:21.251 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:21 np0005538515.localdomain ceph-mon[301134]: pgmap v812: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:21 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:22 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v813: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49284 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:23 np0005538515.localdomain ceph-mon[301134]: pgmap v813: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69335 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59317 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49290 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:23 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:23.679 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69341 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:23 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59332 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v814: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: from='client.49284 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: from='client.69335 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: from='client.59317 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: from='client.49290 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: from='client.69341 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1185886869' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "status"} v 0)
Nov 28 10:18:24 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2941785349' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:25 np0005538515.localdomain ceph-mon[301134]: from='client.59332 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:25 np0005538515.localdomain ceph-mon[301134]: pgmap v814: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:25 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2941785349' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:25 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1170736032' entity='client.admin' cmd={"prefix": "status"} : dispatch
Nov 28 10:18:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.
Nov 28 10:18:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.
Nov 28 10:18:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.
Nov 28 10:18:25 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.
Nov 28 10:18:25 np0005538515.localdomain podman[326858]: 2025-11-28 10:18:25.935126873 +0000 UTC m=+0.145235634 container health_status b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 10:18:25 np0005538515.localdomain podman[326856]: 2025-11-28 10:18:25.890287062 +0000 UTC m=+0.108133791 container health_status 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:18:25 np0005538515.localdomain podman[326858]: 2025-11-28 10:18:25.972202295 +0000 UTC m=+0.182310986 container exec_died b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 10:18:25 np0005538515.localdomain systemd[1]: b95b8b369ca0d04466faa10822d7769e9b6e55c24a5c9f479419e65851e1616c.service: Deactivated successfully.
Nov 28 10:18:25 np0005538515.localdomain podman[326862]: 2025-11-28 10:18:25.985736252 +0000 UTC m=+0.194536092 container health_status d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 10:18:26 np0005538515.localdomain podman[326857]: 2025-11-28 10:18:26.053494929 +0000 UTC m=+0.266355315 container health_status 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller)
Nov 28 10:18:26 np0005538515.localdomain podman[326862]: 2025-11-28 10:18:26.06748977 +0000 UTC m=+0.276289570 container exec_died d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 10:18:26 np0005538515.localdomain podman[326856]: 2025-11-28 10:18:26.076358884 +0000 UTC m=+0.294205613 container exec_died 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 10:18:26 np0005538515.localdomain systemd[1]: 783f7bdda3f71afff28110159f06f1a6805dbf850fe6e134f611291204eec4c3.service: Deactivated successfully.
Nov 28 10:18:26 np0005538515.localdomain systemd[1]: d28c0e8fd438b00dd8ab36649f1b3f02744bd33191d227364de3217fdcd8c3d7.service: Deactivated successfully.
Nov 28 10:18:26 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v815: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:26 np0005538515.localdomain podman[326857]: 2025-11-28 10:18:26.163661442 +0000 UTC m=+0.376521848 container exec_died 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 10:18:26 np0005538515.localdomain systemd[1]: 98976ce53429eeceb8d29e2e3bd0404e86bed3e45091f63795bb750b24b381a9.service: Deactivated successfully.
Nov 28 10:18:26 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:26.253 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:26 np0005538515.localdomain ovs-vsctl[326967]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 10:18:26 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:27 np0005538515.localdomain ceph-mon[301134]: pgmap v815: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:18:27 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 10:18:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:18:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:18:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:18:27 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 10:18:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:18:27 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 10:18:27 np0005538515.localdomain openstack_network_exporter[240973]: ERROR   10:18:27 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 10:18:27 np0005538515.localdomain virtqemud[227736]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 10:18:27 np0005538515.localdomain virtqemud[227736]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 10:18:27 np0005538515.localdomain virtqemud[227736]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 10:18:27 np0005538515.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 327119 (lsinitrd)
Nov 28 10:18:27 np0005538515.localdomain systemd[1]: Mounting EFI System Partition Automount...
Nov 28 10:18:27 np0005538515.localdomain systemd[1]: Mounted EFI System Partition Automount.
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: cache status {prefix=cache status} (starting...)
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v816: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59344 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: client ls {prefix=client ls} (starting...)
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:28 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.
Nov 28 10:18:28 np0005538515.localdomain lvm[327208]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 10:18:28 np0005538515.localdomain lvm[327208]: VG ceph_vg1 finished
Nov 28 10:18:28 np0005538515.localdomain lvm[327226]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 10:18:28 np0005538515.localdomain podman[327201]: 2025-11-28 10:18:28.428020772 +0000 UTC m=+0.105345915 container health_status 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69356 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:28 np0005538515.localdomain lvm[327226]: VG ceph_vg0 finished
Nov 28 10:18:28 np0005538515.localdomain podman[327201]: 2025-11-28 10:18:28.467421816 +0000 UTC m=+0.144746939 container exec_died 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 10:18:28 np0005538515.localdomain systemd[1]: 56bbd0ca835b5b4ab7ba25672a11a9498e53aecd6d4bd842394893fa3da92686.service: Deactivated successfully.
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59353 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49311 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:28 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:28.719 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69362 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: damage ls {prefix=damage ls} (starting...)
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:28 np0005538515.localdomain podman[239012]: time="2025-11-28T10:18:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 10:18:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:18:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156330 "" "Go-http-client/1.1"
Nov 28 10:18:28 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59362 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:28 np0005538515.localdomain podman[239012]: @ - - [28/Nov/2025:10:18:28 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19256 "" "Go-http-client/1.1"
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: dump loads {prefix=dump loads} (starting...)
Nov 28 10:18:28 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "report"} v 0)
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/493760634' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: pgmap v816: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.59344 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.69356 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.59353 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.49311 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2707254784' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/493760634' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49329 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:29.402+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:29 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69389 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:29.539+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:29 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4066833274' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 28 10:18:29 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:29 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59392 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:29.776+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:29 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config log"} v 0)
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/461455018' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Nov 28 10:18:29 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1543263582' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: ops {prefix=ops} (starting...)
Nov 28 10:18:30 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v817: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.69362 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.59362 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.49329 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/6787673' entity='client.admin' cmd={"prefix": "report"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4057740893' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.69389 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/4066833274' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.59392 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1807372338' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1177517886' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/461455018' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1994531849' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1543263582' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/357711483' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config-key dump"} v 0)
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1304102523' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: session ls {prefix=session ls} (starting...)
Nov 28 10:18:30 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb Can't run that command on an inactive MDS!
Nov 28 10:18:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69452 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49377 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:30 np0005538515.localdomain ceph-mds[282859]: mds.mds.np0005538515.anvatb asok_command: status {prefix=status} (starting...)
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 28 10:18:30 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2434183970' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59455 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49389 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59473 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1834711607' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:31.285 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: pgmap v817: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/783008498' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3391289643' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/207961467' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/357711483' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/4264039640' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1304102523' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.69452 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4251160342' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2610313962' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.49377 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2434183970' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2654470196' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/998697886' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3437019487' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1834711607' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "features"} v 0)
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1906133235' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59479 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2752663634' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Nov 28 10:18:31 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3594311866' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:31 np0005538515.localdomain podman[327735]: 2025-11-28 10:18:31.968212067 +0000 UTC m=+0.078432396 container health_status cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 10:18:31 np0005538515.localdomain podman[327735]: 2025-11-28 10:18:31.984467768 +0000 UTC m=+0.094688127 container exec_died cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.vendor=CentOS)
Nov 28 10:18:31 np0005538515.localdomain systemd[1]: cdf4e3e42cf3903eaffd27921fa3139462d8f511d1951a0793f21197fa17820f.service: Deactivated successfully.
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v818: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr stat"} v 0)
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2663181736' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69524 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Nov 28 10:18:32 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:32.287+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.59455 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.49389 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.59473 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1906133235' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3436769846' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4077617134' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.59479 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2752663634' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3827185081' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3594311866' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2492373822' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/4060251766' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2621700254' entity='client.admin' cmd={"prefix": "features"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2252583354' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2663181736' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49425 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:32.407+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4097305108' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Nov 28 10:18:32 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2044064874' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49437 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59545 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:32 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:32.965+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Nov 28 10:18:32 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69557 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49446 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2747186648' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: pgmap v818: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.69524 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1900692722' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.49425 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2369810755' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/984640696' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/4097305108' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2044064874' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.49437 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/931850460' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1681934238' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2747186648' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69575 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59575 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49458 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:33.760 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1322949749' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69587 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59590 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59602 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/258418876' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 28 10:18:33 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2618867657' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 22K writes, 83K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.01 MB/s
                                                          Cumulative WAL: 22K writes, 7722 syncs, 2.86 writes per sync, written: 0.09 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8846 writes, 31K keys, 8846 commit groups, 1.0 writes per commit group, ingest: 44.37 MB, 0.07 MB/s
                                                          Interval WAL: 8846 writes, 3528 syncs, 2.51 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v819: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59620 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69617 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49464 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1946602031' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:10.364656+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92971008 unmapped: 2392064 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:11.364984+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 81.827331543s of 81.836059570s, submitted: 1
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 35
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now 
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/4278362185
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc reconnect No active mgr available yet
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 ms_handle_reset con 0x562bfc584c00 session 0x562bfd5edc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b9511000/0x0/0x1bfc00000, data 0x24fbc01/0x257b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9ff400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93126656 unmapped: 2236416 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:12.365474+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 36
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: get_auth_request con 0x562bfdf05400 auth_method 0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93036544 unmapped: 2326528 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:13.365604+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 37
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93036544 unmapped: 2326528 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:14.365918+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93036544 unmapped: 2326528 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:15.366105+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 38
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93036544 unmapped: 2326528 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:16.366257+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93036544 unmapped: 2326528 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:17.366401+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 39
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:18.366622+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:19.366846+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:20.367124+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:21.367380+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:22.367583+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:23.367819+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:24.368259+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:25.368676+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:26.369158+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:27.369406+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:28.369666+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:29.369893+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:30.370090+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:31.370305+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:32.370535+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:33.370736+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:34.371008+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 92962816 unmapped: 2400256 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:35.371251+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:36.371524+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:37.371714+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:38.371933+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:39.372126+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:40.372349+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:41.372509+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:42.372640+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:43.372765+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:44.372903+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:45.373168+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:46.373357+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:47.373556+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:48.373710+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:49.374106+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:50.374306+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:51.374422+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:52.374571+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:53.374730+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:54.374851+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:55.374974+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:56.375152+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:57.375280+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:58.375460+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:59.375660+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:00.375836+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:01.376010+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:02.376155+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:03.376284+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:04.376441+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:05.376558+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:06.376729+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:07.376920+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:08.377054+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:09.380121+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:10.380280+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:11.381336+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:12.381613+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:13.382340+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:14.382591+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:15.383261+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:16.384371+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:17.384763+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:18.385799+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:19.386299+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:20.386994+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:21.387582+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:22.388123+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:23.388502+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:24.388825+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:25.389119+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:26.389455+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:27.389677+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:28.389821+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:29.390149+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:30.390630+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:31.390894+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:32.391281+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:33.391720+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:34.391922+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 908792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93110272 unmapped: 2252800 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:35.392154+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 41
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now 
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2760684413
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc reconnect No active mgr available yet
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 ms_handle_reset con 0x562bfc9ff400 session 0x562bfd5ed4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 84.241355896s of 84.250129700s, submitted: 1
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b950d000/0x0/0x1bfc00000, data 0x24fde6d/0x257f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:36.392515+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 42
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: get_auth_request con 0x562bfd5eb800 auth_method 0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:37.392685+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:38.392878+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 43
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:39.393020+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:40.393193+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 44
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:41.393341+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:42.393582+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:43.393773+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:44.393994+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:45.394159+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:46.394400+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:47.394600+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:48.394825+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:49.394974+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:50.395191+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:51.395422+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:52.395637+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:53.395849+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:54.396107+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:55.396338+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:56.396585+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:57.396886+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:58.397046+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:59.397268+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93265920 unmapped: 2097152 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:00.397574+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:01.397810+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:02.398061+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:03.398341+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:04.398550+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:05.398828+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:06.399139+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:07.399354+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:08.399696+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.2 total, 600.0 interval
                                                          Cumulative writes: 5888 writes, 25K keys, 5888 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5888 writes, 780 syncs, 7.55 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 34 writes, 127 keys, 34 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
                                                          Interval WAL: 34 writes, 17 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:09.399921+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:10.400115+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:11.400254+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:12.400503+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:13.400754+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:14.400939+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 301989888 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:15.401126+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:16.401884+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:17.407598+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:18.411105+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:19.411274+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:20.411420+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:21.411571+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:22.412915+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:23.413111+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:24.413276+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:25.413436+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:26.413624+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:27.413764+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:28.413931+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:29.414150+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:30.414572+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:31.414847+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:32.415014+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:33.415359+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:34.415599+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:35.415821+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:36.416129+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:37.416404+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:38.416649+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:39.416858+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:40.417114+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:41.417260+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:42.417426+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:43.417596+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:44.417813+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:45.418162+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:46.418413+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:47.418609+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:48.419261+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:49.419606+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:50.419850+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:51.420341+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:52.420505+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:53.421326+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:54.422113+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:55.422407+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:56.422760+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:57.423099+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:58.423284+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:59.423615+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:00.423860+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:01.424190+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:02.424539+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:03.424740+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:04.425060+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:05.425365+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:06.425604+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 45
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:07.425751+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:08.426214+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:09.426645+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:10.427012+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:11.427239+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:12.427397+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:13.427501+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93200384 unmapped: 2162688 heap: 95363072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:14.427643+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b9509000/0x0/0x1bfc00000, data 0x2500231/0x2583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 911792 data_alloc: 285212672 data_used: 6553600
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 99.245407104s of 99.248748779s, submitted: 1
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 heartbeat osd_stat(store_statfs(0x1b950a000/0x0/0x1bfc00000, data 0x2500241/0x2584000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102236160 unmapped: 7815168 heap: 110051328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:15.427785+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 93 ms_handle_reset con 0x562bfc2f1000 session 0x562bfd23e000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 93929472 unmapped: 16121856 heap: 110051328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:16.428012+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaaf800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 14974976 heap: 110051328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:17.428152+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 93 heartbeat osd_stat(store_statfs(0x1b8093000/0x0/0x1bfc00000, data 0x3972655/0x39fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfcaaf800 session 0x562bfd5ec780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95141888 unmapped: 19570688 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:18.428328+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95141888 unmapped: 19570688 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:19.428483+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95141888 unmapped: 19570688 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:20.429809+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:21.430126+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:22.430934+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:23.431373+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:24.432730+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:25.433926+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:26.434975+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:27.435600+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:28.436138+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:29.436483+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:30.436874+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:31.437146+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:32.437539+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:33.437894+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:34.438206+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:35.438370+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95068160 unmapped: 19644416 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:36.438957+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:37.439303+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:38.439551+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:39.439937+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:40.440140+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:41.440286+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:42.440504+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:43.440729+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:44.440910+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:45.441123+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:46.441379+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:47.441524+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:48.441673+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:49.441854+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:50.442187+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:51.442441+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:52.442645+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:53.442872+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:54.443155+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:55.443402+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:56.443747+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:57.443922+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:58.444130+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:59.444307+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:00.444493+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:01.444666+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:02.445202+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:03.445387+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:04.445560+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:05.445759+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:06.445968+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:07.446165+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:08.446396+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:09.446639+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1154938 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:10.446865+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95076352 unmapped: 19636224 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb11000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfdb11000 session 0x562bfe1a6d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc2f0800 session 0x562bfe1a6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc2f0800 session 0x562bfe1a70e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:11.447031+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95084544 unmapped: 19628032 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc585400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:12.447178+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc585400 session 0x562bfe1a72c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95084544 unmapped: 19628032 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b741d000/0x0/0x1bfc00000, data 0x45e4a36/0x4670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc9fe800 session 0x562bfe1a74a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:13.447339+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc988400 session 0x562bfe1a7680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95084544 unmapped: 19628032 heap: 114712576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 58.431877136s of 58.723098755s, submitted: 40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfca28c00 session 0x562bfe1a7a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc2f0800 session 0x562bfe1a70e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc585400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc585400 session 0x562bfd5ed0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc988400 session 0x562bfd5ec780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 ms_handle_reset con 0x562bfc9fe800 session 0x562bfb2b3860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:14.447520+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95739904 unmapped: 22650880 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1285307 data_alloc: 285212672 data_used: 6586368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:15.447725+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95739904 unmapped: 22650880 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:16.447897+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95739904 unmapped: 22650880 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b6436000/0x0/0x1bfc00000, data 0x55caaa8/0x5658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:17.448061+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95780864 unmapped: 22609920 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:18.448294+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95780864 unmapped: 22609920 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:19.448453+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 95887360 unmapped: 22503424 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1290122 data_alloc: 285212672 data_used: 6623232
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:20.448643+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 96657408 unmapped: 21733376 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:21.448832+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98451456 unmapped: 19939328 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:22.449005+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98451456 unmapped: 19939328 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 heartbeat osd_stat(store_statfs(0x1b6411000/0x0/0x1bfc00000, data 0x55eeab8/0x567d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:23.449281+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98451456 unmapped: 19939328 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:24.449513+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98451456 unmapped: 19939328 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaaf800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.951934814s of 11.195569992s, submitted: 51
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1317577 data_alloc: 285212672 data_used: 9768960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:25.450197+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98459648 unmapped: 19931136 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 95 ms_handle_reset con 0x562bfcaaf800 session 0x562bfb1fc960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:26.451109+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98467840 unmapped: 19922944 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b6408000/0x0/0x1bfc00000, data 0x55f190c/0x5685000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:27.451664+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98467840 unmapped: 19922944 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebe000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 95 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 heartbeat osd_stat(store_statfs(0x1b6403000/0x0/0x1bfc00000, data 0x55f3cfb/0x5688000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:28.452044+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 ms_handle_reset con 0x562bfdebe000 session 0x562bfd5ecd20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98590720 unmapped: 19800064 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 ms_handle_reset con 0x562bfcdac400 session 0x562bfaf35c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:29.452709+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98582528 unmapped: 19808256 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1325360 data_alloc: 285212672 data_used: 9781248
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:30.454616+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98582528 unmapped: 19808256 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:31.454871+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 98615296 unmapped: 19775488 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 heartbeat osd_stat(store_statfs(0x1b628f000/0x0/0x1bfc00000, data 0x576d288/0x57ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [0,0,2,1,6])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:32.455108+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 heartbeat osd_stat(store_statfs(0x1b628f000/0x0/0x1bfc00000, data 0x576d288/0x57ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102367232 unmapped: 16023552 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf08400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 ms_handle_reset con 0x562bfdf08400 session 0x562bfdbd14a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebf400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 ms_handle_reset con 0x562bfdebf400 session 0x562bfcd41a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf04c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 ms_handle_reset con 0x562bfdf04c00 session 0x562bfcd41c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:33.455556+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102367232 unmapped: 16023552 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:34.455721+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 ms_handle_reset con 0x562bfcdac400 session 0x562bfcd414a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebe000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102973440 unmapped: 15417344 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.130534172s of 10.000466347s, submitted: 195
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdf05000 session 0x562bfdbd10e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b582a000/0x0/0x1bfc00000, data 0x61ca288/0x625c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdebe000 session 0x562bfc693680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebf400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1430832 data_alloc: 285212672 data_used: 9785344
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:35.455846+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100048896 unmapped: 18341888 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf08400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdf08400 session 0x562bfe1a6960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdad800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:36.456032+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdebf400 session 0x562bfd5ec780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfcdad800 session 0x562bfb2b2960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100442112 unmapped: 17948672 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfcdac400 session 0x562bfc57b4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:37.456314+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebe000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdebe000 session 0x562bfaeda1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100474880 unmapped: 17915904 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:38.456602+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdf05000 session 0x562bfbf54960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100712448 unmapped: 17678336 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:39.456877+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100712448 unmapped: 17678336 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf08400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdf08400 session 0x562bfc6e4d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfcdac400 session 0x562bfc265e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1289415 data_alloc: 285212672 data_used: 6610944
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b67c6000/0x0/0x1bfc00000, data 0x523453f/0x52c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:40.457102+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdad800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100048896 unmapped: 18341888 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:41.457290+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100048896 unmapped: 18341888 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b67c6000/0x0/0x1bfc00000, data 0x523453f/0x52c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:42.457550+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:43.457784+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:44.458019+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1289415 data_alloc: 285212672 data_used: 6610944
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:45.458251+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:46.458575+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:47.458759+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b67c6000/0x0/0x1bfc00000, data 0x523453f/0x52c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:48.458954+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:49.459184+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1289415 data_alloc: 285212672 data_used: 6610944
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:50.459393+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 100057088 unmapped: 18333696 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc585400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfc585400 session 0x562bfc42b860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9ffc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b67c6000/0x0/0x1bfc00000, data 0x523453f/0x52c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:51.459552+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 15.780247688s of 16.863403320s, submitted: 129
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 101629952 unmapped: 16760832 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfc9ffc00 session 0x562bfc6bc960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:52.459707+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaafc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 109182976 unmapped: 9207808 heap: 118390784 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfcaafc00 session 0x562bfc6bc3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfdf05000 session 0x562bfa4292c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:53.459845+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102629376 unmapped: 17874944 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:54.494743+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102629376 unmapped: 17874944 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1493359 data_alloc: 285212672 data_used: 6688768
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc585400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:55.494880+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b4f7b000/0x0/0x1bfc00000, data 0x6a745b0/0x6b0a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfc585400 session 0x562bfa429e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102678528 unmapped: 17825792 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:56.495054+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102752256 unmapped: 17752064 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:57.495222+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102752256 unmapped: 17752064 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:58.495406+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102752256 unmapped: 17752064 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:59.495558+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102752256 unmapped: 17752064 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b4f79000/0x0/0x1bfc00000, data 0x6a745e3/0x6b0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492051 data_alloc: 285212672 data_used: 6696960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:00.495713+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102400000 unmapped: 18104320 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b4f61000/0x0/0x1bfc00000, data 0x6a955e3/0x6b2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:01.495879+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102400000 unmapped: 18104320 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:02.496005+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102400000 unmapped: 18104320 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.511833191s of 11.295376778s, submitted: 195
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:03.496151+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102432768 unmapped: 18071552 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b4f53000/0x0/0x1bfc00000, data 0x6aa35e3/0x6b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:04.496265+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102432768 unmapped: 18071552 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b4f53000/0x0/0x1bfc00000, data 0x6aa35e3/0x6b3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfc2f0400 session 0x562bfc435e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 ms_handle_reset con 0x562bfcdad800 session 0x562bfc53be00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1497547 data_alloc: 285212672 data_used: 6696960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:05.496421+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102531072 unmapped: 17973248 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:06.496595+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 102539264 unmapped: 17965056 heap: 120504320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 98 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfc1a65a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9ffc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 98 ms_handle_reset con 0x562bfdf05800 session 0x562bfdbd1e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 98 ms_handle_reset con 0x562bfc9ffc00 session 0x562bfc434d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:07.496749+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118087680 unmapped: 13664256 heap: 131751936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 98 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfa49ab40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:08.496895+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 98 heartbeat osd_stat(store_statfs(0x1b33c7000/0x0/0x1bfc00000, data 0x862c9b1/0x86c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 107716608 unmapped: 28016640 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 99 ms_handle_reset con 0x562bfc2f0400 session 0x562bfa49be00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc585400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 99 heartbeat osd_stat(store_statfs(0x1b33c7000/0x0/0x1bfc00000, data 0x862c9b1/0x86c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:09.497053+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 107724800 unmapped: 28008448 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 ms_handle_reset con 0x562bfc585400 session 0x562bfbf54960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1731592 data_alloc: 285212672 data_used: 10485760
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:10.497246+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 106766336 unmapped: 28966912 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdad800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 ms_handle_reset con 0x562bfcdad800 session 0x562bfe1a72c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfdbd14a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 ms_handle_reset con 0x562bfc2f0400 session 0x562bfa428d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:11.497388+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 106782720 unmapped: 28950528 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:12.497479+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 112099328 unmapped: 23633920 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:13.497608+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 112099328 unmapped: 23633920 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 heartbeat osd_stat(store_statfs(0x1b26f5000/0x0/0x1bfc00000, data 0x92fb19d/0x9399000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.224626541s of 11.057232857s, submitted: 188
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:14.497700+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfc22ab40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 109223936 unmapped: 26509312 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1647491 data_alloc: 285212672 data_used: 6656000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:15.497867+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 108879872 unmapped: 26853376 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:16.498051+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 heartbeat osd_stat(store_statfs(0x1b3e8f000/0x0/0x1bfc00000, data 0x7b613ac/0x7bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 108879872 unmapped: 26853376 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:17.500802+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 108879872 unmapped: 26853376 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:18.500959+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 heartbeat osd_stat(store_statfs(0x1b3e8f000/0x0/0x1bfc00000, data 0x7b613ac/0x7bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 108879872 unmapped: 26853376 heap: 135733248 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfcdacc00 session 0x562bfbebd680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfaeb2800 session 0x562bfaf34780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfaf345a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfc4c4960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:19.501097+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfcdacc00 session 0x562bfb2b3c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfc2f0400 session 0x562bfbec0000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf08000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfdf08000 session 0x562bff1f9a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfcd40000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfbebc960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfc2f0400 session 0x562bfa38a000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 126566400 unmapped: 12353536 heap: 138919936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1800496 data_alloc: 301989888 data_used: 15044608
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:20.501245+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfcdacc00 session 0x562bfa38a5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 heartbeat osd_stat(store_statfs(0x1b2ecc000/0x0/0x1bfc00000, data 0x8b253ac/0x8bc2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124035072 unmapped: 21372928 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfdf05c00 session 0x562bff1ed4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:21.501430+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf09800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfdf09800 session 0x562bff1ed0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124035072 unmapped: 21372928 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 heartbeat osd_stat(store_statfs(0x1b2c09000/0x0/0x1bfc00000, data 0x8de83ac/0x8e85000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfaeb3000 session 0x562bfc9f2d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaafc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfcaafc00 session 0x562bfaed7a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd5eb000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfd5eb000 session 0x562bfc1f6960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:22.501547+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd37dc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 ms_handle_reset con 0x562bfcce3c00 session 0x562bfc9f30e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124657664 unmapped: 20750336 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 heartbeat osd_stat(store_statfs(0x1b2c08000/0x0/0x1bfc00000, data 0x8de83bc/0x8e86000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaafc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:23.501654+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 102 ms_handle_reset con 0x562bfaeb3000 session 0x562bfa38b0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118939648 unmapped: 26468352 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:24.501771+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119111680 unmapped: 26296320 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 102 heartbeat osd_stat(store_statfs(0x1b5aca000/0x0/0x1bfc00000, data 0x589e71a/0x5939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1460253 data_alloc: 285212672 data_used: 10338304
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:25.501899+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119128064 unmapped: 26279936 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:26.502112+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119128064 unmapped: 26279936 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:27.502270+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119128064 unmapped: 26279936 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 102 heartbeat osd_stat(store_statfs(0x1b5aca000/0x0/0x1bfc00000, data 0x589e71a/0x5939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:28.502402+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119209984 unmapped: 26198016 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:29.502574+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119209984 unmapped: 26198016 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 15.225378990s of 16.205333710s, submitted: 263
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1444535 data_alloc: 285212672 data_used: 10338304
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:30.502729+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 111976448 unmapped: 33431552 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b5aca000/0x0/0x1bfc00000, data 0x589e71a/0x5939000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:31.502833+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 111976448 unmapped: 33431552 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b6150000/0x0/0x1bfc00000, data 0x58a09be/0x593d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:32.502957+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 111976448 unmapped: 33431552 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b6150000/0x0/0x1bfc00000, data 0x58a09be/0x593d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:33.503103+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 111976448 unmapped: 33431552 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:34.503235+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 112812032 unmapped: 32595968 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b6150000/0x0/0x1bfc00000, data 0x58a09be/0x593d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1465155 data_alloc: 301989888 data_used: 12124160
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:35.503361+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 112795648 unmapped: 32612352 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:36.503538+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113016832 unmapped: 32391168 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:37.503699+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfbebd680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b614e000/0x0/0x1bfc00000, data 0x58a29e7/0x5940000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd5eb400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113008640 unmapped: 32399360 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 ms_handle_reset con 0x562bfd5eb400 session 0x562bfc4c4960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:38.503865+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 ms_handle_reset con 0x562bfcaafc00 session 0x562bff1ed2c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 ms_handle_reset con 0x562bfd37dc00 session 0x562bfc1b30e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113025024 unmapped: 32382976 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:39.504025+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113188864 unmapped: 32219136 heap: 145408000 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 104 ms_handle_reset con 0x562bfa3cbc00 session 0x562bff1ec3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.577100754s of 10.020499229s, submitted: 122
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:40.504139+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1747694 data_alloc: 301989888 data_used: 18046976
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 104 ms_handle_reset con 0x562bfaeb3000 session 0x562bff1ecb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaafc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 105 ms_handle_reset con 0x562bfcaafc00 session 0x562bfaf34960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd5eb400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124502016 unmapped: 25108480 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:41.504313+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 ms_handle_reset con 0x562bfd5eb400 session 0x562bfc434d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124493824 unmapped: 25116672 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 heartbeat osd_stat(store_statfs(0x1b3dfc000/0x0/0x1bfc00000, data 0x7bef252/0x7c91000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 ms_handle_reset con 0x562bfcdac000 session 0x562bfa38a780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 ms_handle_reset con 0x562bfa3cbc00 session 0x562bff1f92c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:42.504460+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 ms_handle_reset con 0x562bfaeb3000 session 0x562bff1ed0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaafc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd5eb400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124510208 unmapped: 25100288 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:43.504595+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 ms_handle_reset con 0x562bfd5eb400 session 0x562bfcd410e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119783424 unmapped: 29827072 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 heartbeat osd_stat(store_statfs(0x1b5074000/0x0/0x1bfc00000, data 0x694968e/0x69ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:44.504738+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113876992 unmapped: 35733504 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:45.504868+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1618047 data_alloc: 301989888 data_used: 11149312
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113876992 unmapped: 35733504 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 heartbeat osd_stat(store_statfs(0x1b4c9b000/0x0/0x1bfc00000, data 0x694b932/0x69f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfa5c9800 session 0x562bfc9f32c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc585400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfc585400 session 0x562bfc9f25a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfc6bd680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:46.505035+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfa5c9800 session 0x562bfc6bc5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfc988400 session 0x562bfbf54d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfc988800 session 0x562bf8a6f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfc2f0800 session 0x562bfa49b4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfbef92c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 113844224 unmapped: 35766272 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfa5c9800 session 0x562bfc265e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfc988400 session 0x562bff1ec000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 heartbeat osd_stat(store_statfs(0x1b4c9b000/0x0/0x1bfc00000, data 0x694b932/0x69f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:47.505227+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 heartbeat osd_stat(store_statfs(0x1b459f000/0x0/0x1bfc00000, data 0x7047994/0x70ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [1,0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfc988800 session 0x562bfbec0000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122667008 unmapped: 26943488 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:48.505414+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 ms_handle_reset con 0x562bfca28800 session 0x562bfd5ec960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122732544 unmapped: 26877952 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 ms_handle_reset con 0x562bfca28c00 session 0x562bfd5ec5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 ms_handle_reset con 0x562bfca28800 session 0x562bfbfcf680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cbc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:49.505547+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 ms_handle_reset con 0x562bfa5c9800 session 0x562bfcd40000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 ms_handle_reset con 0x562bfc988400 session 0x562bfcd41860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122937344 unmapped: 26673152 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 109 ms_handle_reset con 0x562bfa3cbc00 session 0x562bfc42b860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.040178299s of 10.084425926s, submitted: 214
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:50.505699+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1809329 data_alloc: 301989888 data_used: 18599936
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 109 heartbeat osd_stat(store_statfs(0x1b3b62000/0x0/0x1bfc00000, data 0x7a79155/0x7b29000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124174336 unmapped: 25436160 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 109 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 110 ms_handle_reset con 0x562bfc988400 session 0x562bfd5ec780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:51.505829+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 110 ms_handle_reset con 0x562bfa5c9800 session 0x562bfdbd1c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 115245056 unmapped: 34365440 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 111 ms_handle_reset con 0x562bfca28c00 session 0x562bfaed7680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb10000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:52.505970+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 111 heartbeat osd_stat(store_statfs(0x1b54da000/0x0/0x1bfc00000, data 0x5d80e66/0x5e37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116981760 unmapped: 32628736 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 112 ms_handle_reset con 0x562bfdb10000 session 0x562bf8a6de00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:53.506145+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 112 ms_handle_reset con 0x562bfca28800 session 0x562bfa429860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 112 ms_handle_reset con 0x562bfc988800 session 0x562bfaf34960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117211136 unmapped: 32399360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 113 ms_handle_reset con 0x562bfa5c9800 session 0x562bfc240780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:54.506274+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 113 ms_handle_reset con 0x562bfc988400 session 0x562bfc22e1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 123117568 unmapped: 26492928 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:55.506458+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1561715 data_alloc: 301989888 data_used: 10493952
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121372672 unmapped: 28237824 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:56.506649+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 115 ms_handle_reset con 0x562bfca28c00 session 0x562bfc49b680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120954880 unmapped: 28655616 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:57.506784+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 116 ms_handle_reset con 0x562bfc2f0c00 session 0x562bfd5edc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121028608 unmapped: 28581888 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:58.506916+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 117 ms_handle_reset con 0x562bfa5c9800 session 0x562bfc9f2780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 117 heartbeat osd_stat(store_statfs(0x1b5e4a000/0x0/0x1bfc00000, data 0x577c45f/0x583a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121110528 unmapped: 28499968 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 117 heartbeat osd_stat(store_statfs(0x1b5e4a000/0x0/0x1bfc00000, data 0x577c45f/0x583a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:59.507059+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 118 ms_handle_reset con 0x562bfc988400 session 0x562bfa429c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 118 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120840192 unmapped: 28770304 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:00.507275+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1566345 data_alloc: 301989888 data_used: 10600448
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.605745316s of 10.616219521s, submitted: 662
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 120 heartbeat osd_stat(store_statfs(0x1b5e50000/0x0/0x1bfc00000, data 0x5780559/0x583b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120045568 unmapped: 29564928 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 120 ms_handle_reset con 0x562bfc988800 session 0x562bfc6bd680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:01.507393+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120217600 unmapped: 29392896 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:02.507612+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 121 ms_handle_reset con 0x562bfca28c00 session 0x562bfbf1c960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120258560 unmapped: 29351936 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:03.508125+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 121 heartbeat osd_stat(store_statfs(0x1b5e2a000/0x0/0x1bfc00000, data 0x57a4d7a/0x5862000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120258560 unmapped: 29351936 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:04.508336+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120324096 unmapped: 29286400 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:05.508513+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1574959 data_alloc: 301989888 data_used: 10600448
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120324096 unmapped: 29286400 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:06.508968+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 122 heartbeat osd_stat(store_statfs(0x1b5e25000/0x0/0x1bfc00000, data 0x57a705a/0x5866000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120324096 unmapped: 29286400 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:07.509139+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120406016 unmapped: 29204480 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:08.509326+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 122 heartbeat osd_stat(store_statfs(0x1b5e1e000/0x0/0x1bfc00000, data 0x57b105a/0x5870000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120430592 unmapped: 29179904 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:09.510035+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120430592 unmapped: 29179904 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:10.510312+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1577545 data_alloc: 301989888 data_used: 10612736
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 123 heartbeat osd_stat(store_statfs(0x1b5e1e000/0x0/0x1bfc00000, data 0x57b105a/0x5870000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120438784 unmapped: 29171712 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:11.510461+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.563442230s of 10.909771919s, submitted: 135
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 123 heartbeat osd_stat(store_statfs(0x1b5e19000/0x0/0x1bfc00000, data 0x57b32fe/0x5874000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120471552 unmapped: 29138944 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 123 heartbeat osd_stat(store_statfs(0x1b5e19000/0x0/0x1bfc00000, data 0x57b32fe/0x5874000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:12.510644+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 124 ms_handle_reset con 0x562bfc2f1800 session 0x562bfaed7a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120504320 unmapped: 29106176 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:13.510904+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120504320 unmapped: 29106176 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:14.511043+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120504320 unmapped: 29106176 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 124 heartbeat osd_stat(store_statfs(0x1b5e0f000/0x0/0x1bfc00000, data 0x57b8b11/0x587f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:15.511174+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1588727 data_alloc: 301989888 data_used: 10625024
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120504320 unmapped: 29106176 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:16.511385+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120512512 unmapped: 29097984 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:17.511512+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca29000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120545280 unmapped: 29065216 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:18.511677+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 ms_handle_reset con 0x562bfca29000 session 0x562bff2e6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 ms_handle_reset con 0x562bfcaafc00 session 0x562bff1ec960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120594432 unmapped: 29016064 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:19.511838+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 ms_handle_reset con 0x562bfcdacc00 session 0x562bfefeef00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:20.512000+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 heartbeat osd_stat(store_statfs(0x1b6f9a000/0x0/0x1bfc00000, data 0x4629a5c/0x46ec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1416381 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:21.512145+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:22.512365+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:23.512537+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 heartbeat osd_stat(store_statfs(0x1b6f9a000/0x0/0x1bfc00000, data 0x4629a5c/0x46ec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:24.512719+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 12.448627472s of 12.996591568s, submitted: 140
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:25.512863+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419015 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116195328 unmapped: 33415168 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:26.513122+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:27.513323+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:28.513508+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:29.513694+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:30.513833+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419015 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:31.514031+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:32.514196+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:33.514333+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:34.514719+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:35.514938+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419015 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:36.515237+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:37.515379+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:38.515625+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:39.515824+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:40.516029+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419015 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:41.516234+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:42.516382+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:43.516522+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:44.516685+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:45.516873+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419015 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:46.517102+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:47.517240+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:48.517384+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:49.517523+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:50.517741+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419015 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:51.517938+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:52.518144+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x462bd00/0x46f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:53.518303+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 116187136 unmapped: 33423360 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:54.518453+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 29.948873520s of 30.001106262s, submitted: 22
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117243904 unmapped: 32366592 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:55.518670+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1421872 data_alloc: 285212672 data_used: 4870144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 heartbeat osd_stat(store_statfs(0x1b6f9c000/0x0/0x1bfc00000, data 0x462c123/0x46f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 ms_handle_reset con 0x562bfc2f0400 session 0x562bf8a6f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117243904 unmapped: 32366592 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:56.518893+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bf9849800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 ms_handle_reset con 0x562bf9849800 session 0x562bff2e6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 ms_handle_reset con 0x562bfb19bc00 session 0x562bff2e65a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1b400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117284864 unmapped: 32325632 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:57.519035+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 heartbeat osd_stat(store_statfs(0x1b6f94000/0x0/0x1bfc00000, data 0x462e937/0x46fa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 128 ms_handle_reset con 0x562bfcd1b400 session 0x562bff2e63c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117293056 unmapped: 32317440 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:58.519190+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 128 ms_handle_reset con 0x562bfc988400 session 0x562bff2e72c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfceb4400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 128 ms_handle_reset con 0x562bfceb4400 session 0x562bff2e61e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117293056 unmapped: 32317440 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:59.519347+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117325824 unmapped: 32284672 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 129 heartbeat osd_stat(store_statfs(0x1b6f8b000/0x0/0x1bfc00000, data 0x4633107/0x4702000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 129 ms_handle_reset con 0x562bfcaae800 session 0x562bfefeef00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:00.519496+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1437545 data_alloc: 285212672 data_used: 4898816
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117325824 unmapped: 32284672 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:01.519864+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 129 heartbeat osd_stat(store_statfs(0x1b6f8d000/0x0/0x1bfc00000, data 0x4632cd4/0x46ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117366784 unmapped: 32243712 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:02.520030+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 130 ms_handle_reset con 0x562bfb19bc00 session 0x562bfefee000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117358592 unmapped: 32251904 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:03.520166+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117358592 unmapped: 32251904 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:04.520354+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.471812248s of 10.043688774s, submitted: 146
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117383168 unmapped: 32227328 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:05.520516+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1440679 data_alloc: 285212672 data_used: 4911104
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117391360 unmapped: 32219136 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:06.520701+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 133 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117415936 unmapped: 32194560 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:07.520842+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 133 heartbeat osd_stat(store_statfs(0x1b6f84000/0x0/0x1bfc00000, data 0x4639342/0x4709000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 133 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 134 ms_handle_reset con 0x562bfcdacc00 session 0x562bfefef2c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117317632 unmapped: 32292864 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 134 ms_handle_reset con 0x562bfc9a1400 session 0x562bff5654a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:08.521003+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117317632 unmapped: 32292864 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets getting new tickets!
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:09.521270+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _finish_auth 0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:09.522906+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1a000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 134 ms_handle_reset con 0x562bfcd1a000 session 0x562bfa38af00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 134 ms_handle_reset con 0x562bfaeb3000 session 0x562bfa3234a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 117334016 unmapped: 32276480 heap: 149610496 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:10.521434+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1478653 data_alloc: 285212672 data_used: 4935680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118423552 unmapped: 39583744 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 ms_handle_reset con 0x562bfb19bc00 session 0x562bfefef4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:11.521561+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19b800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 ms_handle_reset con 0x562bfb19b800 session 0x562bfefef680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118480896 unmapped: 39526400 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:12.521718+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 heartbeat osd_stat(store_statfs(0x1b4772000/0x0/0x1bfc00000, data 0x6e404c3/0x6f1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:13.521849+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 ms_handle_reset con 0x562bfa5c9800 session 0x562bfefefc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 126541824 unmapped: 31465472 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:14.522017+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.142364502s of 10.016036987s, submitted: 185
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 135 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 136 ms_handle_reset con 0x562bfcaae400 session 0x562bfefef2c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 136 ms_handle_reset con 0x562bfaeb3000 session 0x562bfa323680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118226944 unmapped: 39780352 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:15.522240+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1865901 data_alloc: 285212672 data_used: 4947968
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118243328 unmapped: 39763968 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:16.522447+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce3400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 136 ms_handle_reset con 0x562bfcce3400 session 0x562bfefef4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118382592 unmapped: 39624704 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:17.522580+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 137 ms_handle_reset con 0x562bfa5c9800 session 0x562bfa323c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 137 heartbeat osd_stat(store_statfs(0x1b3369000/0x0/0x1bfc00000, data 0x7e44dc9/0x7f22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118497280 unmapped: 39510016 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:18.522722+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 138 ms_handle_reset con 0x562bfc988000 session 0x562bfc9f3a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118497280 unmapped: 39510016 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:19.522882+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118333440 unmapped: 39673856 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:20.523047+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1496274 data_alloc: 285212672 data_used: 4960256
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118333440 unmapped: 39673856 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:21.523200+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118333440 unmapped: 39673856 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:22.523331+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118333440 unmapped: 39673856 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:23.523481+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 139 heartbeat osd_stat(store_statfs(0x1b6b67000/0x0/0x1bfc00000, data 0x46490ae/0x4727000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:24.523651+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118333440 unmapped: 39673856 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.315750122s of 10.004851341s, submitted: 238
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:25.523827+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118341632 unmapped: 39665664 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1500164 data_alloc: 285212672 data_used: 4972544
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:26.524004+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118382592 unmapped: 39624704 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 141 heartbeat osd_stat(store_statfs(0x1b6b5e000/0x0/0x1bfc00000, data 0x464d710/0x472f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:27.524156+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118382592 unmapped: 39624704 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 142 ms_handle_reset con 0x562bfc2f1800 session 0x562bfc1a70e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:28.524278+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118398976 unmapped: 39608320 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:29.524461+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118407168 unmapped: 39600128 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 142 ms_handle_reset con 0x562bfc2f0400 session 0x562bfe34b2c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 143 ms_handle_reset con 0x562bfa5c9800 session 0x562bfe34a960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:30.524615+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118439936 unmapped: 39567360 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1510527 data_alloc: 285212672 data_used: 4984832
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 ms_handle_reset con 0x562bfc2f1800 session 0x562bff2e6b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:31.524846+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118456320 unmapped: 39550976 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 heartbeat osd_stat(store_statfs(0x1b6b51000/0x0/0x1bfc00000, data 0x46542f2/0x473b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebfc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:32.525018+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118480896 unmapped: 39526400 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 ms_handle_reset con 0x562bfdebfc00 session 0x562bfbec0000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 heartbeat osd_stat(store_statfs(0x1b6b51000/0x0/0x1bfc00000, data 0x46542f2/0x473b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 ms_handle_reset con 0x562bfcaae800 session 0x562bff565680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:33.525173+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118497280 unmapped: 39510016 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:34.525329+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118497280 unmapped: 39510016 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:35.525492+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118497280 unmapped: 39510016 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.515130043s of 10.936584473s, submitted: 121
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1515714 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:36.525646+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118497280 unmapped: 39510016 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:37.525800+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118521856 unmapped: 39485440 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 145 heartbeat osd_stat(store_statfs(0x1b6b50000/0x0/0x1bfc00000, data 0x4656523/0x473d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 145 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 145 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 145 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:38.525939+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 146 ms_handle_reset con 0x562bfaeb3800 session 0x562bfa38ad20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118480896 unmapped: 39526400 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:39.526081+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118489088 unmapped: 39518208 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 147 ms_handle_reset con 0x562bfa5c9800 session 0x562bfbef92c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:40.526536+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118489088 unmapped: 39518208 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1521718 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:41.526882+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118489088 unmapped: 39518208 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 147 heartbeat osd_stat(store_statfs(0x1b6b48000/0x0/0x1bfc00000, data 0x465acf3/0x4745000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:42.527059+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118489088 unmapped: 39518208 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:43.527303+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118489088 unmapped: 39518208 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:44.527472+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118489088 unmapped: 39518208 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 147 heartbeat osd_stat(store_statfs(0x1b6b48000/0x0/0x1bfc00000, data 0x465acf3/0x4745000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b48000/0x0/0x1bfc00000, data 0x465acf3/0x4745000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:45.527628+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1524720 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:46.527837+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b44000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:47.528125+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b44000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:48.528429+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:49.528672+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:50.528878+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1524720 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b44000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:51.529012+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:52.529153+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 39501824 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b44000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b44000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:53.529295+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b44000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:54.529511+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:55.529633+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1524720 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:56.529850+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:57.530011+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 21.494176865s of 21.650053024s, submitted: 65
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:58.530152+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 39493632 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 ms_handle_reset con 0x562bfc988000 session 0x562bfc2303c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b43000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:59.530341+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118538240 unmapped: 39469056 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b43000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:00.530501+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1525937 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:01.530712+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:02.530907+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:03.531030+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b43000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:04.531149+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:05.531347+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b43000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1525937 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:06.531559+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:07.531716+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118546432 unmapped: 39460864 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:08.531910+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118554624 unmapped: 39452672 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:09.532102+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118554624 unmapped: 39452672 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b43000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:10.532250+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118554624 unmapped: 39452672 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1525937 data_alloc: 285212672 data_used: 5021696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:11.532394+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118554624 unmapped: 39452672 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 heartbeat osd_stat(store_statfs(0x1b6b43000/0x0/0x1bfc00000, data 0x465cf97/0x4749000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:12.532548+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118554624 unmapped: 39452672 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:13.532642+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 15.926970482s of 15.990081787s, submitted: 15
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118571008 unmapped: 39436288 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 149 ms_handle_reset con 0x562bfc2f1800 session 0x562bfa49a780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:14.532754+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118587392 unmapped: 39419904 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 heartbeat osd_stat(store_statfs(0x1b6b3e000/0x0/0x1bfc00000, data 0x465fb66/0x474f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:15.532870+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfc4c4b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1533796 data_alloc: 285212672 data_used: 5033984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:16.533034+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:17.533126+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 heartbeat osd_stat(store_statfs(0x1b6b3c000/0x0/0x1bfc00000, data 0x4661767/0x4751000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 heartbeat osd_stat(store_statfs(0x1b6b3c000/0x0/0x1bfc00000, data 0x4661767/0x4751000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:18.533250+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:19.533425+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 heartbeat osd_stat(store_statfs(0x1b6b3c000/0x0/0x1bfc00000, data 0x4661767/0x4751000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:20.533591+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118620160 unmapped: 39387136 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1536798 data_alloc: 285212672 data_used: 5033984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:21.533735+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:22.533880+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:23.534052+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118611968 unmapped: 39395328 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:24.534239+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118628352 unmapped: 39378944 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:25.534404+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118628352 unmapped: 39378944 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1536798 data_alloc: 285212672 data_used: 5033984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:26.534617+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:27.534778+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:28.534930+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:29.535131+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:30.535265+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1536798 data_alloc: 285212672 data_used: 5033984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:31.535359+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:32.535592+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:33.535768+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:34.535885+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:35.536041+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 39370752 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1537438 data_alloc: 285212672 data_used: 5050368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:36.536231+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118652928 unmapped: 39354368 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:37.536369+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118652928 unmapped: 39354368 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:38.536511+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118652928 unmapped: 39354368 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:39.536653+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118652928 unmapped: 39354368 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:40.536826+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1537438 data_alloc: 285212672 data_used: 5050368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:41.536982+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:42.537138+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:43.537296+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:44.538245+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:45.538371+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1537438 data_alloc: 285212672 data_used: 5050368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:46.538714+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:47.538948+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 39346176 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:48.539433+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118669312 unmapped: 39337984 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:49.540053+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 35.952636719s of 36.220905304s, submitted: 86
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118669312 unmapped: 39337984 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:50.540241+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118669312 unmapped: 39337984 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1538278 data_alloc: 285212672 data_used: 5050368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:51.540632+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfaeb2000 session 0x562bfc22f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118693888 unmapped: 39313408 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b39000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:52.540869+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce3400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfcce3400 session 0x562bfb2b25a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118693888 unmapped: 39313408 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:53.541137+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118693888 unmapped: 39313408 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca29000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfca29000 session 0x562bfc1a74a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b39000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:54.541343+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118710272 unmapped: 39297024 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfc484400 session 0x562bfc4ab2c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:55.541520+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118710272 unmapped: 39297024 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1543465 data_alloc: 285212672 data_used: 5050368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:56.541789+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfc485800 session 0x562bfbf55860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118734848 unmapped: 39272448 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:57.541924+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118734848 unmapped: 39272448 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:58.542125+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118734848 unmapped: 39272448 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b38000/0x0/0x1bfc00000, data 0x4663a6d/0x4756000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1b800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfcd1b800 session 0x562bfa38b4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:59.542410+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118767616 unmapped: 39239680 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.923303604s of 10.392802238s, submitted: 114
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfc484400 session 0x562bfc230000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfc485800 session 0x562bfbf8b0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:00.542553+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118767616 unmapped: 39239680 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1550364 data_alloc: 285212672 data_used: 5050368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:01.542726+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118767616 unmapped: 39239680 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1a800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b36000/0x0/0x1bfc00000, data 0x4663adf/0x4758000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfcd1a800 session 0x562bfc4c4960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:02.549259+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebfc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfdebfc00 session 0x562bfc9f25a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce3400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 ms_handle_reset con 0x562bfcce3400 session 0x562bff1f9a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118824960 unmapped: 39182336 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:03.549470+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118824960 unmapped: 39182336 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:04.549757+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118833152 unmapped: 39174144 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 heartbeat osd_stat(store_statfs(0x1b6b36000/0x0/0x1bfc00000, data 0x4663a0b/0x4755000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:05.549912+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 118833152 unmapped: 39174144 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 ms_handle_reset con 0x562bfc484400 session 0x562bfa323a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1553459 data_alloc: 285212672 data_used: 5062656
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:06.550089+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119898112 unmapped: 38109184 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b6b31000/0x0/0x1bfc00000, data 0x4665e9d/0x475c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:07.550241+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119898112 unmapped: 38109184 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 ms_handle_reset con 0x562bfc9a1800 session 0x562bfc57a1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 ms_handle_reset con 0x562bfc988400 session 0x562bfc24f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:08.550358+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119914496 unmapped: 38092800 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:09.550540+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b6b31000/0x0/0x1bfc00000, data 0x4665e2b/0x475a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119914496 unmapped: 38092800 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:10.552737+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119914496 unmapped: 38092800 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1551946 data_alloc: 285212672 data_used: 5062656
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:11.553028+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119914496 unmapped: 38092800 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:12.553215+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119922688 unmapped: 38084608 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:13.553430+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b6b31000/0x0/0x1bfc00000, data 0x4665e2b/0x475a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119922688 unmapped: 38084608 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:14.553614+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119922688 unmapped: 38084608 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b6b31000/0x0/0x1bfc00000, data 0x4665e2b/0x475a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:15.553792+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 15.147739410s of 15.489278793s, submitted: 81
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 ms_handle_reset con 0x562bfcdacc00 session 0x562bfc4aaf00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119930880 unmapped: 38076416 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1552049 data_alloc: 285212672 data_used: 5062656
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b6b31000/0x0/0x1bfc00000, data 0x4665e2b/0x475a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:16.553969+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119939072 unmapped: 38068224 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:17.554138+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119947264 unmapped: 38060032 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd5ea400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 153 ms_handle_reset con 0x562bfd5ea400 session 0x562bfb1b1c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:18.554277+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119947264 unmapped: 38060032 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:19.554354+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119963648 unmapped: 38043648 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:20.554474+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 heartbeat osd_stat(store_statfs(0x1b6b2a000/0x0/0x1bfc00000, data 0x466a65d/0x4763000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 ms_handle_reset con 0x562bfc484400 session 0x562bfc9f3860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 ms_handle_reset con 0x562bfc9a1800 session 0x562bff2e72c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119996416 unmapped: 38010880 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 ms_handle_reset con 0x562bfc988400 session 0x562bfc22ba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559057 data_alloc: 285212672 data_used: 5087232
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:21.554657+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119996416 unmapped: 38010880 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:22.554871+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119996416 unmapped: 38010880 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 heartbeat osd_stat(store_statfs(0x1b6b2b000/0x0/0x1bfc00000, data 0x466a5fb/0x4762000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:23.555008+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119996416 unmapped: 38010880 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:24.555184+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 119996416 unmapped: 38010880 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 154 handle_osd_map epochs [154,155], i have 154, src has [1,155]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfcce2c00 session 0x562bfc42a780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:25.555328+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120004608 unmapped: 38002688 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1567143 data_alloc: 285212672 data_used: 5087232
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:26.555487+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1ac00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.738153458s of 11.061581612s, submitted: 103
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfcd1ac00 session 0x562bfbebd0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120012800 unmapped: 37994496 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfc484400 session 0x562bfaed43c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfc988400 session 0x562bfc4c41e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:27.555605+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120758272 unmapped: 37249024 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfc9a1800 session 0x562bfc2103c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfcaae400 session 0x562bf8a6f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfcce2c00 session 0x562bfc53a1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:28.555760+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120815616 unmapped: 37191680 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 heartbeat osd_stat(store_statfs(0x1b5d5c000/0x0/0x1bfc00000, data 0x5435922/0x5532000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfc484400 session 0x562bfc9f3c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:29.555957+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 ms_handle_reset con 0x562bfc988400 session 0x562bfc9f3680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120832000 unmapped: 37175296 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:30.556054+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120832000 unmapped: 37175296 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1578800 data_alloc: 285212672 data_used: 5103616
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:31.556206+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 157 ms_handle_reset con 0x562bfcaae400 session 0x562bfc435e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120848384 unmapped: 37158912 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:32.556379+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 158 ms_handle_reset con 0x562bfc484c00 session 0x562bfaf34b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 37101568 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:33.556542+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 ms_handle_reset con 0x562bfc484000 session 0x562bfb1681e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 ms_handle_reset con 0x562bfc9a1800 session 0x562bfc16c5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120922112 unmapped: 37085184 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:34.556702+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 heartbeat osd_stat(store_statfs(0x1b6b10000/0x0/0x1bfc00000, data 0x4675e83/0x477d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 ms_handle_reset con 0x562bfc484400 session 0x562bfbfce1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120930304 unmapped: 37076992 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:35.556801+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121978880 unmapped: 36028416 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1601347 data_alloc: 285212672 data_used: 5107712
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:36.557004+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.659642220s of 10.290184975s, submitted: 195
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120930304 unmapped: 37076992 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:37.557200+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120930304 unmapped: 37076992 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cac00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:38.557352+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 160 ms_handle_reset con 0x562bfa3cac00 session 0x562bfc6e5860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfceb4400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 160 ms_handle_reset con 0x562bfc988c00 session 0x562bfc240b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaafc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 160 ms_handle_reset con 0x562bfcaafc00 session 0x562bfc693c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120946688 unmapped: 37060608 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa3cac00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 161 ms_handle_reset con 0x562bfa3cac00 session 0x562bfc6e45a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 161 ms_handle_reset con 0x562bfceb4400 session 0x562bfc6e4f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:39.557506+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 161 ms_handle_reset con 0x562bfc484400 session 0x562bfc16cb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120963072 unmapped: 37044224 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:40.557648+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 161 heartbeat osd_stat(store_statfs(0x1b6b06000/0x0/0x1bfc00000, data 0x467a506/0x4787000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 161 ms_handle_reset con 0x562bfc988c00 session 0x562bfc265680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 161 handle_osd_map epochs [161,162], i have 161, src has [1,162]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 120987648 unmapped: 37019648 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:41.557811+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1612630 data_alloc: 285212672 data_used: 5120000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 162 ms_handle_reset con 0x562bfc2f1800 session 0x562bff1f9e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121004032 unmapped: 37003264 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:42.557933+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 163 ms_handle_reset con 0x562bfc9a1800 session 0x562bfb2b2780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121044992 unmapped: 36962304 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:43.558056+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b6b04000/0x0/0x1bfc00000, data 0x467e74b/0x4789000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 121044992 unmapped: 36962304 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 164 ms_handle_reset con 0x562bfcaae000 session 0x562bfc231860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca29000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:44.558210+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 164 ms_handle_reset con 0x562bfca29000 session 0x562bfc4350e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122159104 unmapped: 35848192 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:45.558339+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122159104 unmapped: 35848192 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:46.558534+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1613440 data_alloc: 285212672 data_used: 5111808
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122159104 unmapped: 35848192 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:47.558680+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.009366989s of 10.771879196s, submitted: 221
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 165 ms_handle_reset con 0x562bfc9fe400 session 0x562bfc6e5e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122159104 unmapped: 35848192 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:48.558867+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdacc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 166 ms_handle_reset con 0x562bfcdacc00 session 0x562bfc1f7c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122167296 unmapped: 35840000 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 166 heartbeat osd_stat(store_statfs(0x1b5959000/0x0/0x1bfc00000, data 0x46851e9/0x4794000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:49.559013+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 167 heartbeat osd_stat(store_statfs(0x1b5952000/0x0/0x1bfc00000, data 0x4687b48/0x479a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122167296 unmapped: 35840000 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:50.559182+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122183680 unmapped: 35823616 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 ms_handle_reset con 0x562bfc9a1800 session 0x562bfaed6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:51.559344+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1628420 data_alloc: 285212672 data_used: 5115904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 ms_handle_reset con 0x562bfc2f1800 session 0x562bfc1b3680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9ff800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122200064 unmapped: 35807232 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 ms_handle_reset con 0x562bfc9ff800 session 0x562bfc24de00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:52.559488+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 ms_handle_reset con 0x562bfc9a1400 session 0x562bfaed6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122200064 unmapped: 35807232 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 heartbeat osd_stat(store_statfs(0x1b594e000/0x0/0x1bfc00000, data 0x468bcbc/0x479f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:53.559663+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 170 ms_handle_reset con 0x562bfb19bc00 session 0x562bfaf35a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122200064 unmapped: 35807232 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 170 ms_handle_reset con 0x562bfb19bc00 session 0x562bfb2b3860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:54.559814+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 170 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 171 ms_handle_reset con 0x562bfaeb2c00 session 0x562bfc24e5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122249216 unmapped: 35758080 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:55.560111+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 172 handle_osd_map epochs [171,172], i have 172, src has [1,172]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 172 ms_handle_reset con 0x562bfc9a1400 session 0x562bff2e72c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122265600 unmapped: 35741696 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 ms_handle_reset con 0x562bfc2f1800 session 0x562bfe34eb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 handle_osd_map epochs [172,173], i have 173, src has [1,173]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9ff800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:56.560290+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1767077 data_alloc: 285212672 data_used: 5132288
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 ms_handle_reset con 0x562bfc9ff800 session 0x562bfb2b25a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 ms_handle_reset con 0x562bfc9a1800 session 0x562bff2e7860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 122748928 unmapped: 35258368 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:57.560428+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 ms_handle_reset con 0x562bfb19bc00 session 0x562bfc9f3860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 ms_handle_reset con 0x562bfc2f1800 session 0x562bfc9f23c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.511152267s of 10.203665733s, submitted: 233
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 123895808 unmapped: 34111488 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 174 ms_handle_reset con 0x562bfc9a1400 session 0x562bfc9f25a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:58.560574+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 ms_handle_reset con 0x562bfc9a1c00 session 0x562bfc9f30e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 132325376 unmapped: 25681920 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 heartbeat osd_stat(store_statfs(0x1b4053000/0x0/0x1bfc00000, data 0x5b78063/0x5c9a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 ms_handle_reset con 0x562bfb19bc00 session 0x562bfc4c4b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:59.560669+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 ms_handle_reset con 0x562bfc2f1800 session 0x562bfc4c4960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 ms_handle_reset con 0x562bfc9a1400 session 0x562bfc4343c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 34086912 heap: 158007296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:00.560838+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 177 ms_handle_reset con 0x562bfa5c9800 session 0x562bfa38ab40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 41426944 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 177 ms_handle_reset con 0x562bfc988000 session 0x562bfd30c000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:01.560996+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2202642 data_alloc: 285212672 data_used: 5140480
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 178 ms_handle_reset con 0x562bfc9a1800 session 0x562bfc230b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 133423104 unmapped: 32980992 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 178 ms_handle_reset con 0x562bfc988c00 session 0x562bfe34f4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:02.561136+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 125263872 unmapped: 41140224 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 ms_handle_reset con 0x562bfc485800 session 0x562bfd30c1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:03.561301+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca29000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 133734400 unmapped: 32669696 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 ms_handle_reset con 0x562bfca29000 session 0x562bfe34f680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:04.561435+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 heartbeat osd_stat(store_statfs(0x1aee48000/0x0/0x1bfc00000, data 0xad715cb/0xaea6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30fc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 ms_handle_reset con 0x562bfd30fc00 session 0x562bfe34fa40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 ms_handle_reset con 0x562bfc2f1800 session 0x562bfc53a3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 125607936 unmapped: 40796160 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:05.561539+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 ms_handle_reset con 0x562bfc988c00 session 0x562bfd30c5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 ms_handle_reset con 0x562bfc485800 session 0x562bfe34fe00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca29000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 ms_handle_reset con 0x562bfca29000 session 0x562bfc24da40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 126091264 unmapped: 40312832 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf04800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb11000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 ms_handle_reset con 0x562bfc9a1400 session 0x562bff1f83c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 ms_handle_reset con 0x562bfdf04800 session 0x562bfc435e00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:06.561724+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2675715 data_alloc: 285212672 data_used: 5156864
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 ms_handle_reset con 0x562bfc2f1800 session 0x562bfbec0960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 125681664 unmapped: 40722432 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:07.561898+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 ms_handle_reset con 0x562bfdb11000 session 0x562bfc435c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf09800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 ms_handle_reset con 0x562bfdf09800 session 0x562bfbfcf4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.128908157s of 10.030710220s, submitted: 316
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 ms_handle_reset con 0x562bfc9a1800 session 0x562bfd30cb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 127885312 unmapped: 38518784 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 ms_handle_reset con 0x562bfc485800 session 0x562bfe34ab40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:08.562043+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 heartbeat osd_stat(store_statfs(0x1abdec000/0x0/0x1bfc00000, data 0xedaabc2/0xeee1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 128040960 unmapped: 38363136 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:09.562192+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 182 heartbeat osd_stat(store_statfs(0x1aa5e8000/0x0/0x1bfc00000, data 0x105acfd4/0x106e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [0,0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136658944 unmapped: 29745152 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 182 ms_handle_reset con 0x562bfc2f1800 session 0x562bfe34a5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:10.562344+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb11000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136790016 unmapped: 29614080 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:11.562496+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3334453 data_alloc: 285212672 data_used: 5181440
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 183 ms_handle_reset con 0x562bfc9a0400 session 0x562bfbfcef00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 129687552 unmapped: 36716544 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf04800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:12.562724+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 183 ms_handle_reset con 0x562bfdb11000 session 0x562bfc16cb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 183 handle_osd_map epochs [183,184], i have 183, src has [1,184]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 ms_handle_reset con 0x562bfdf04800 session 0x562bfdbd0f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 129785856 unmapped: 36618240 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:13.562857+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 ms_handle_reset con 0x562bfc2f1800 session 0x562bfe34fe00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 129908736 unmapped: 36495360 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 heartbeat osd_stat(store_statfs(0x1a6661000/0x0/0x1bfc00000, data 0x1438e675/0x144c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:14.562979+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 ms_handle_reset con 0x562bfc485800 session 0x562bfe34f680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a0400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 130113536 unmapped: 36290560 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb11000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 186 ms_handle_reset con 0x562bfc9a0400 session 0x562bfc230b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:15.563147+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 186 heartbeat osd_stat(store_statfs(0x1a6663000/0x0/0x1bfc00000, data 0x1438e5b1/0x144c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 186 ms_handle_reset con 0x562bfdb11000 session 0x562bfe34f0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 130359296 unmapped: 36044800 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:16.563363+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3652688 data_alloc: 285212672 data_used: 5201920
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 138887168 unmapped: 27516928 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:17.563494+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.227177620s of 10.185109138s, submitted: 349
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 138895360 unmapped: 27508736 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:18.563624+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9a1000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 130646016 unmapped: 35758080 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:19.563787+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 186 heartbeat osd_stat(store_statfs(0x1a1ce7000/0x0/0x1bfc00000, data 0x18eb1cf4/0x18fe7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 130834432 unmapped: 35569664 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:20.564154+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf09800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfdf09800 session 0x562bfd30dc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 46
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 131235840 unmapped: 35168256 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:21.564966+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4286652 data_alloc: 285212672 data_used: 5214208
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 139796480 unmapped: 26607616 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:22.565153+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 139919360 unmapped: 26484736 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:23.565382+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1a000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfcd1a000 session 0x562bfbf8af00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 140132352 unmapped: 26271744 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:24.565598+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 131915776 unmapped: 34488320 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:25.566200+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x19bcdf000/0x0/0x1bfc00000, data 0x1eeb40b5/0x1efef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 132046848 unmapped: 34357248 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:26.566436+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4874570 data_alloc: 285212672 data_used: 5214208
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 132055040 unmapped: 34349056 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:27.566770+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.030615807s of 10.025968552s, submitted: 102
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfc434f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 133275648 unmapped: 33128448 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:28.566919+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfd30e800 session 0x562bfe34a780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfcaae400 session 0x562bfa57c5a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 133472256 unmapped: 32931840 heap: 166404096 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc989c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfc989c00 session 0x562bfa57c780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfaeb3c00 session 0x562bfa57c960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:29.567125+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfcaae400 session 0x562bfa57cb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1a000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfcd1a000 session 0x562bfa57cd20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x196ce0000/0x0/0x1bfc00000, data 0x23eb4053/0x23fee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 134242304 unmapped: 36364288 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:30.567259+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x195a53000/0x0/0x1bfc00000, data 0x25141053/0x2527b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 134356992 unmapped: 36249600 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:31.567408+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5508472 data_alloc: 285212672 data_used: 5214208
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x194253000/0x0/0x1bfc00000, data 0x26941053/0x26a7b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bf9a92400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x194253000/0x0/0x1bfc00000, data 0x26941053/0x26a7b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [0,0,0,0,1,0,0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 134463488 unmapped: 36143104 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:32.567536+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bf9a92400 session 0x562bfa57d2c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 47
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 135544832 unmapped: 35061760 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:33.567871+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfd30e800 session 0x562bfa57d860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136781824 unmapped: 33824768 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:34.568116+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x192254000/0x0/0x1bfc00000, data 0x28941043/0x28a7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,3,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 ms_handle_reset con 0x562bfd30e800 session 0x562bfa57da40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145293312 unmapped: 25313280 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:35.568256+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136642560 unmapped: 33964032 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:36.568516+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 6056832 data_alloc: 285212672 data_used: 5242880
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136708096 unmapped: 33898496 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:37.568738+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.441513062s of 10.066326141s, submitted: 136
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136757248 unmapped: 33849344 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:38.568911+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 heartbeat osd_stat(store_statfs(0x18ea55000/0x0/0x1bfc00000, data 0x2c14109a/0x2c279000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136773632 unmapped: 33832960 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:39.569046+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 188 ms_handle_reset con 0x562bfaeb2c00 session 0x562bfc22e1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:40.569217+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136781824 unmapped: 33824768 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bff8a5c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:41.569382+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 136814592 unmapped: 33792000 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 188 handle_osd_map epochs [188,189], i have 188, src has [1,189]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 handle_osd_map epochs [188,189], i have 189, src has [1,189]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5588164 data_alloc: 285212672 data_used: 5267456
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 heartbeat osd_stat(store_statfs(0x18ea22000/0x0/0x1bfc00000, data 0x2c16f869/0x2c2ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [1,0,0,4])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 ms_handle_reset con 0x562bff8a5c00 session 0x562bfefef4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:42.569547+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 135004160 unmapped: 35602432 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f0000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 ms_handle_reset con 0x562bfc2f0000 session 0x562bfc6e4f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce3400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:43.569682+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 135036928 unmapped: 35569664 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 ms_handle_reset con 0x562bfcce3400 session 0x562bfc6e50e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 heartbeat osd_stat(store_statfs(0x1b5a25000/0x0/0x1bfc00000, data 0x516f817/0x52a9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:44.569820+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 135979008 unmapped: 34627584 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:45.569993+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 138911744 unmapped: 31694848 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:46.570185+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 140943360 unmapped: 29663232 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2012673 data_alloc: 301989888 data_used: 14778368
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 189 handle_osd_map epochs [189,190], i have 189, src has [1,190]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:47.570388+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 140943360 unmapped: 29663232 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 190 heartbeat osd_stat(store_statfs(0x1b5a1e000/0x0/0x1bfc00000, data 0x5171d7f/0x52af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.571776390s of 10.218141556s, submitted: 207
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 191 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:48.570522+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 140976128 unmapped: 29630464 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 191 handle_osd_map epochs [191,192], i have 191, src has [1,192]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:49.570644+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 141008896 unmapped: 29597696 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:50.570818+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 141033472 unmapped: 29573120 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfadb7800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 193 ms_handle_reset con 0x562bfadb7800 session 0x562bfa57de00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:51.570993+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 141033472 unmapped: 29573120 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2031977 data_alloc: 301989888 data_used: 14802944
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 194 ms_handle_reset con 0x562bfaeb3800 session 0x562bfc693c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:52.571147+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 141058048 unmapped: 29548544 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 195 heartbeat osd_stat(store_statfs(0x1b5a0d000/0x0/0x1bfc00000, data 0x517ac4b/0x52c0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:53.571474+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 141066240 unmapped: 29540352 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 195 ms_handle_reset con 0x562bfcce2c00 session 0x562bfc29b860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:54.571741+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 141066240 unmapped: 29540352 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:55.571901+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 147415040 unmapped: 23191552 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:56.572109+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 147562496 unmapped: 23044096 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 196 ms_handle_reset con 0x562bfdf05000 session 0x562bfc264960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2108213 data_alloc: 301989888 data_used: 15364096
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:57.572247+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146219008 unmapped: 24387584 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 196 heartbeat osd_stat(store_statfs(0x1b3be1000/0x0/0x1bfc00000, data 0x5a07413/0x5b4d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:58.572626+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146219008 unmapped: 24387584 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.325073242s of 10.326891899s, submitted: 273
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 196 ms_handle_reset con 0x562bfc2f1400 session 0x562bfc264000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:59.572767+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146259968 unmapped: 24346624 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:00.573019+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146259968 unmapped: 24346624 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:01.573247+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146259968 unmapped: 24346624 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2120439 data_alloc: 301989888 data_used: 15519744
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 197 heartbeat osd_stat(store_statfs(0x1b3bdd000/0x0/0x1bfc00000, data 0x5a09781/0x5b51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:02.573446+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146292736 unmapped: 24313856 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:03.573674+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146415616 unmapped: 24190976 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:04.573837+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146415616 unmapped: 24190976 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:05.574003+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146415616 unmapped: 24190976 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 197 heartbeat osd_stat(store_statfs(0x1b3bbc000/0x0/0x1bfc00000, data 0x5a2981c/0x5b72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 197 ms_handle_reset con 0x562bfc988800 session 0x562bfa428b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:06.574129+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146489344 unmapped: 24117248 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2127049 data_alloc: 301989888 data_used: 15527936
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:07.574381+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146497536 unmapped: 24109056 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 198 ms_handle_reset con 0x562bfdb56400 session 0x562bfa4290e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:08.574525+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146522112 unmapped: 24084480 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.732999802s of 10.004335403s, submitted: 79
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1a000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:09.574707+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 199 ms_handle_reset con 0x562bfdb57800 session 0x562bfbebcd20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146628608 unmapped: 23977984 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 199 ms_handle_reset con 0x562bfcd1a000 session 0x562bfa323a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 199 heartbeat osd_stat(store_statfs(0x1b3ba6000/0x0/0x1bfc00000, data 0x5a3a1bd/0x5b87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:10.574851+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 200 ms_handle_reset con 0x562bfca28000 session 0x562bfe1a6780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146751488 unmapped: 23855104 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 200 ms_handle_reset con 0x562bfc988800 session 0x562bfc693680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.59545 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:11.574993+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.69557 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.49446 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146759680 unmapped: 23846912 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1513419320' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 201 ms_handle_reset con 0x562bfca28000 session 0x562bfc49ad20
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.69575 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f400
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1517482184' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.59575 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.49458 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1322949749' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2152193 data_alloc: 301989888 data_used: 15556608
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/258418876' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2618867657' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:12.575260+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1946602031' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 147832832 unmapped: 22773760 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 202 ms_handle_reset con 0x562bfd30f400 session 0x562bfa57c000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:13.575399+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 147898368 unmapped: 22708224 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb3800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 202 ms_handle_reset con 0x562bfaeb3800 session 0x562bfc1a74a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:14.575552+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 147939328 unmapped: 22667264 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb11000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 203 ms_handle_reset con 0x562bfc988c00 session 0x562bfe34b860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:15.575769+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 204 ms_handle_reset con 0x562bfdb11000 session 0x562bff1ec780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 147947520 unmapped: 22659072 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 204 ms_handle_reset con 0x562bfc988c00 session 0x562bfa38a960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 204 heartbeat osd_stat(store_statfs(0x1b3b8a000/0x0/0x1bfc00000, data 0x5a492b0/0x5ba4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:16.575935+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148045824 unmapped: 22560768 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2167750 data_alloc: 301989888 data_used: 15585280
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:17.576083+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 206 ms_handle_reset con 0x562bfcd1bc00 session 0x562bfa57d0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148054016 unmapped: 22552576 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:18.576231+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 206 ms_handle_reset con 0x562bfdb57400 session 0x562bfaf341e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148054016 unmapped: 22552576 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.485142708s of 10.385216713s, submitted: 284
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:19.576414+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148062208 unmapped: 22544384 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:20.576634+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148078592 unmapped: 22528000 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 207 ms_handle_reset con 0x562bfaeb2000 session 0x562bfaf343c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 207 ms_handle_reset con 0x562bfaeb2000 session 0x562bff1f8b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:21.576794+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 208 heartbeat osd_stat(store_statfs(0x1b3b82000/0x0/0x1bfc00000, data 0x5a4fcdc/0x5bab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 208 ms_handle_reset con 0x562bfcd1bc00 session 0x562bfcd40780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148144128 unmapped: 22462464 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 208 heartbeat osd_stat(store_statfs(0x1b3b7e000/0x0/0x1bfc00000, data 0x5a52126/0x5bae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2174584 data_alloc: 301989888 data_used: 15597568
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 208 handle_osd_map epochs [208,209], i have 208, src has [1,209]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 209 ms_handle_reset con 0x562bfc988c00 session 0x562bfc240b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:22.576977+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148226048 unmapped: 22380544 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfceb4800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:23.577137+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148234240 unmapped: 22372352 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 210 ms_handle_reset con 0x562bfceb4800 session 0x562bfc57b0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:24.577284+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148242432 unmapped: 22364160 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 211 ms_handle_reset con 0x562bfd30e400 session 0x562bfc42b860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:25.577435+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148258816 unmapped: 22347776 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:26.577711+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148283392 unmapped: 22323200 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2183412 data_alloc: 301989888 data_used: 15593472
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 212 heartbeat osd_stat(store_statfs(0x1b3b75000/0x0/0x1bfc00000, data 0x5a5890e/0x5bb8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:27.578115+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 148332544 unmapped: 22274048 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 212 ms_handle_reset con 0x562bfaeb2c00 session 0x562bfa38ba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:28.578325+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143319040 unmapped: 27287552 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 212 ms_handle_reset con 0x562bfd30f000 session 0x562bfcd40960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:29.578518+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142254080 unmapped: 28352512 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:30.578662+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142254080 unmapped: 28352512 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 212 heartbeat osd_stat(store_statfs(0x1b4ee1000/0x0/0x1bfc00000, data 0x46ecc5e/0x484d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.191284180s of 11.835122108s, submitted: 236
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:31.578819+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142270464 unmapped: 28336128 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1961367 data_alloc: 285212672 data_used: 5398528
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:32.579027+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142270464 unmapped: 28336128 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:33.579181+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 213 ms_handle_reset con 0x562bfd30f400 session 0x562bff1f8d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 213 heartbeat osd_stat(store_statfs(0x1b4eda000/0x0/0x1bfc00000, data 0x46eefcd/0x4853000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143360000 unmapped: 27246592 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:34.579334+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143392768 unmapped: 27213824 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 214 ms_handle_reset con 0x562bfa5c9800 session 0x562bfefef0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:35.579504+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143450112 unmapped: 27156480 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:36.579675+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 215 heartbeat osd_stat(store_statfs(0x1b4ed1000/0x0/0x1bfc00000, data 0x46f3a96/0x485b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 215 ms_handle_reset con 0x562bfd30e000 session 0x562bfd5ed680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142835712 unmapped: 27770880 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973891 data_alloc: 285212672 data_used: 5443584
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 216 ms_handle_reset con 0x562bfa5c9800 session 0x562bfa57dc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 216 ms_handle_reset con 0x562bfaeb2c00 session 0x562bfc4350e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:37.579818+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142393344 unmapped: 28213248 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:38.580010+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142393344 unmapped: 28213248 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:39.580179+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142393344 unmapped: 28213248 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:40.580320+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142393344 unmapped: 28213248 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.944920540s of 10.217798233s, submitted: 227
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:41.580503+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142393344 unmapped: 28213248 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1984509 data_alloc: 285212672 data_used: 5455872
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:42.580646+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 218 heartbeat osd_stat(store_statfs(0x1b4ec5000/0x0/0x1bfc00000, data 0x46faa4a/0x4867000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142401536 unmapped: 28205056 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:43.580783+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142401536 unmapped: 28205056 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:44.580959+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142401536 unmapped: 28205056 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:45.581103+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:46.581359+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1994353 data_alloc: 285212672 data_used: 5492736
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 heartbeat osd_stat(store_statfs(0x1b4ec1000/0x0/0x1bfc00000, data 0x46ff441/0x486c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:47.581558+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:48.581694+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 heartbeat osd_stat(store_statfs(0x1b4ec1000/0x0/0x1bfc00000, data 0x46ff441/0x486c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 ms_handle_reset con 0x562bfcdac400 session 0x562bfaed4b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:49.581862+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 ms_handle_reset con 0x562bfb19bc00 session 0x562bfc1f7c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1b400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 ms_handle_reset con 0x562bfcd1b400 session 0x562bfbfcf4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:50.582006+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:51.582162+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 heartbeat osd_stat(store_statfs(0x1b4ec0000/0x0/0x1bfc00000, data 0x46ff57d/0x486e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.118875504s of 10.590768814s, submitted: 167
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 220 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142409728 unmapped: 28196864 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 221 ms_handle_reset con 0x562bfdb56c00 session 0x562bff5650e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1998428 data_alloc: 285212672 data_used: 5509120
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:52.582468+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142426112 unmapped: 28180480 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 221 ms_handle_reset con 0x562bfdb57c00 session 0x562bff2e7a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:53.582659+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142696448 unmapped: 27910144 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 48
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:54.582787+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 221 ms_handle_reset con 0x562bfdb57000 session 0x562bff2e7680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142852096 unmapped: 27754496 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:55.582950+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142868480 unmapped: 27738112 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf09c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 ms_handle_reset con 0x562bfdf09c00 session 0x562bfc2103c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:56.583192+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 ms_handle_reset con 0x562bfca28000 session 0x562bfe34f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142893056 unmapped: 27713536 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2015276 data_alloc: 285212672 data_used: 5521408
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:57.583329+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 heartbeat osd_stat(store_statfs(0x1b4d2b000/0x0/0x1bfc00000, data 0x4703e1a/0x4877000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 ms_handle_reset con 0x562bfca28000 session 0x562bfb2b0d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142917632 unmapped: 27688960 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 ms_handle_reset con 0x562bfdb56c00 session 0x562bfd23fe00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:58.584598+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142934016 unmapped: 27672576 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 heartbeat osd_stat(store_statfs(0x1b4eb8000/0x0/0x1bfc00000, data 0x4703e34/0x4876000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:59.584806+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142934016 unmapped: 27672576 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:00.585707+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142950400 unmapped: 27656192 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 223 ms_handle_reset con 0x562bfdb57000 session 0x562bfbfcf4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:01.586293+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142958592 unmapped: 27648000 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2021479 data_alloc: 285212672 data_used: 5533696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.042039871s of 10.633100510s, submitted: 145
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:02.586844+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142958592 unmapped: 27648000 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 224 ms_handle_reset con 0x562bfd30f800 session 0x562bfc1f7c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:03.587534+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebe000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 ms_handle_reset con 0x562bfdebe000 session 0x562bfaf354a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142974976 unmapped: 27631616 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:04.588184+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 heartbeat osd_stat(store_statfs(0x1b4ea9000/0x0/0x1bfc00000, data 0x470a9a3/0x4883000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 225 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 142983168 unmapped: 27623424 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:05.588481+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 227 ms_handle_reset con 0x562bfca28000 session 0x562bfd5ed680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143007744 unmapped: 27598848 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:06.589055+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 227 ms_handle_reset con 0x562bfc484c00 session 0x562bff1f8d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143007744 unmapped: 27598848 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2036092 data_alloc: 285212672 data_used: 5545984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 227 ms_handle_reset con 0x562bfc988000 session 0x562bfa38ba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:07.589324+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 228 ms_handle_reset con 0x562bfd30f400 session 0x562bfd5ec960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143032320 unmapped: 27574272 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 17K writes, 66K keys, 17K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s
                                                          Cumulative WAL: 17K writes, 5746 syncs, 3.07 writes per sync, written: 0.05 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 40K keys, 11K commit groups, 1.0 writes per commit group, ingest: 27.02 MB, 0.05 MB/s
                                                          Interval WAL: 11K writes, 4966 syncs, 2.37 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:08.589734+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 228 heartbeat osd_stat(store_statfs(0x1b4ea0000/0x0/0x1bfc00000, data 0x471156f/0x488d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143048704 unmapped: 27557888 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:09.589997+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143048704 unmapped: 27557888 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19a400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 229 ms_handle_reset con 0x562bfb19a400 session 0x562bfaf34b40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:10.590266+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 143073280 unmapped: 27533312 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:11.590465+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144130048 unmapped: 26476544 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2047915 data_alloc: 285212672 data_used: 5570560
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:12.590666+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.784350395s of 10.272776604s, submitted: 182
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb11000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 232 ms_handle_reset con 0x562bfdb11000 session 0x562bfaf343c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144138240 unmapped: 26468352 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:13.590832+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 232 ms_handle_reset con 0x562bfc2f1400 session 0x562bfc1b30e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144162816 unmapped: 26443776 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 232 heartbeat osd_stat(store_statfs(0x1b4e8d000/0x0/0x1bfc00000, data 0x471a4e7/0x489e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfceb4800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:14.591189+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 233 ms_handle_reset con 0x562bfceb4800 session 0x562bfa57d0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144211968 unmapped: 26394624 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:15.591458+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144220160 unmapped: 26386432 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 235 ms_handle_reset con 0x562bfc2f1000 session 0x562bff1ec780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:16.591634+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144244736 unmapped: 26361856 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2060817 data_alloc: 285212672 data_used: 5595136
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:17.591793+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144285696 unmapped: 26320896 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 236 heartbeat osd_stat(store_statfs(0x1b4a7f000/0x0/0x1bfc00000, data 0x47233b3/0x48ad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:18.592004+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 236 ms_handle_reset con 0x562bfc484c00 session 0x562bfa38b0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144326656 unmapped: 26279936 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:19.592175+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144351232 unmapped: 26255360 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:20.592283+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 237 ms_handle_reset con 0x562bfc484c00 session 0x562bfc1a74a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144351232 unmapped: 26255360 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:21.592440+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 ms_handle_reset con 0x562bfc2f1000 session 0x562bfc49ad20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144416768 unmapped: 26189824 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2071230 data_alloc: 285212672 data_used: 5607424
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 heartbeat osd_stat(store_statfs(0x1b4a77000/0x0/0x1bfc00000, data 0x4727b1d/0x48b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:22.592619+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 ms_handle_reset con 0x562bfc485400 session 0x562bfa323a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144433152 unmapped: 26173440 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:23.592766+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.454723358s of 11.122385979s, submitted: 201
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 ms_handle_reset con 0x562bfdb56800 session 0x562bfbebcd20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144465920 unmapped: 26140672 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 ms_handle_reset con 0x562bfca28000 session 0x562bfdbd1a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:24.592924+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 heartbeat osd_stat(store_statfs(0x1b4a79000/0x0/0x1bfc00000, data 0x4727b7f/0x48b5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144490496 unmapped: 26116096 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:25.593139+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 heartbeat osd_stat(store_statfs(0x1b4a79000/0x0/0x1bfc00000, data 0x4727b1d/0x48b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144490496 unmapped: 26116096 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 ms_handle_reset con 0x562bfca28000 session 0x562bfbec0000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 238 handle_osd_map epochs [238,239], i have 238, src has [1,239]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:26.593340+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144506880 unmapped: 26099712 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2075589 data_alloc: 285212672 data_used: 5619712
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 239 ms_handle_reset con 0x562bfd30f000 session 0x562bff1ecf00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:27.594497+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144515072 unmapped: 26091520 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:28.594678+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144515072 unmapped: 26091520 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:29.594877+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144515072 unmapped: 26091520 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 239 ms_handle_reset con 0x562bfcaae000 session 0x562bfc1f74a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:30.595020+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144531456 unmapped: 26075136 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:31.595169+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 240 heartbeat osd_stat(store_statfs(0x1b4a6c000/0x0/0x1bfc00000, data 0x472c419/0x48c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 240 ms_handle_reset con 0x562bfc2f1400 session 0x562bfc53a1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 240 ms_handle_reset con 0x562bfc9fe400 session 0x562bfa3230e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144621568 unmapped: 25985024 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2086550 data_alloc: 285212672 data_used: 5632000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 240 ms_handle_reset con 0x562bfc2f1400 session 0x562bf8a6fc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:32.595631+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 241 ms_handle_reset con 0x562bfcaae000 session 0x562bfc29ad20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 241 ms_handle_reset con 0x562bfca28000 session 0x562bfaeda1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 144703488 unmapped: 25903104 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:33.595739+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.315456390s of 10.002860069s, submitted: 182
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 242 ms_handle_reset con 0x562bfd30f000 session 0x562bff564960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145809408 unmapped: 24797184 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:34.595860+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 242 heartbeat osd_stat(store_statfs(0x1b4a65000/0x0/0x1bfc00000, data 0x4731119/0x48c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd5eb400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145809408 unmapped: 24797184 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:35.596131+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145842176 unmapped: 24764416 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 ms_handle_reset con 0x562bfd5eb400 session 0x562bf8a6f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:36.596344+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b4a61000/0x0/0x1bfc00000, data 0x4733611/0x48cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145850368 unmapped: 24756224 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2104536 data_alloc: 285212672 data_used: 5701632
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:37.596451+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 245 ms_handle_reset con 0x562bfc485800 session 0x562bfd30d0e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145883136 unmapped: 24723456 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:38.596562+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf08400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 245 ms_handle_reset con 0x562bfdf08400 session 0x562bff564960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145883136 unmapped: 24723456 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:39.596680+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 245 ms_handle_reset con 0x562bfaeb2c00 session 0x562bfaeda1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 245 heartbeat osd_stat(store_statfs(0x1b4a5c000/0x0/0x1bfc00000, data 0x4737736/0x48d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145883136 unmapped: 24723456 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 245 ms_handle_reset con 0x562bfc988400 session 0x562bfc29ad20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:40.596850+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145932288 unmapped: 24674304 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:41.596982+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145940480 unmapped: 24666112 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2117080 data_alloc: 285212672 data_used: 5713920
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:42.597147+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 246 ms_handle_reset con 0x562bfca28c00 session 0x562bf8a6fc20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 246 heartbeat osd_stat(store_statfs(0x1b4a54000/0x0/0x1bfc00000, data 0x4739c03/0x48d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfaeb2c00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc485800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 246 ms_handle_reset con 0x562bfc485800 session 0x562bfaed5680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 246 ms_handle_reset con 0x562bfaeb2c00 session 0x562bfc53a1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 145997824 unmapped: 24608768 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:43.597281+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146571264 unmapped: 24035328 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:44.597451+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.974518776s of 10.710470200s, submitted: 210
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc988400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 247 ms_handle_reset con 0x562bfc988400 session 0x562bfc53a3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146595840 unmapped: 24010752 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:45.597606+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf08400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 ms_handle_reset con 0x562bfdf08400 session 0x562bfc4ab860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bf9a92400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146612224 unmapped: 23994368 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 ms_handle_reset con 0x562bf9a92400 session 0x562bfa38ba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:46.597780+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 heartbeat osd_stat(store_statfs(0x1b46c2000/0x0/0x1bfc00000, data 0x4ac8590/0x4c6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146628608 unmapped: 23977984 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2159668 data_alloc: 285212672 data_used: 5738496
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:47.597921+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146628608 unmapped: 23977984 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 249 ms_handle_reset con 0x562bfcdac800 session 0x562bfaf354a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:48.598099+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146636800 unmapped: 23969792 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 249 ms_handle_reset con 0x562bfcaae000 session 0x562bfc9f25a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaaf000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:49.598231+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 249 handle_osd_map epochs [249,250], i have 249, src has [1,250]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 250 ms_handle_reset con 0x562bfcaaf000 session 0x562bfbfcf4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146685952 unmapped: 23920640 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:50.598376+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1a000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 250 ms_handle_reset con 0x562bfcd1a000 session 0x562bfc22f4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 250 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 251 ms_handle_reset con 0x562bfc9fe800 session 0x562bfb2b0d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 251 heartbeat osd_stat(store_statfs(0x1b46b8000/0x0/0x1bfc00000, data 0x4acce73/0x4c75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 251 ms_handle_reset con 0x562bfd30f000 session 0x562bfc4aab40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146579456 unmapped: 24027136 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:51.598525+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 252 ms_handle_reset con 0x562bfc9fe800 session 0x562bfe34f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcdac400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 252 ms_handle_reset con 0x562bfdb56400 session 0x562bfc1b23c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146628608 unmapped: 23977984 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2180268 data_alloc: 285212672 data_used: 5754880
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:52.598667+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 252 ms_handle_reset con 0x562bfdb57800 session 0x562bfa3221e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 253 ms_handle_reset con 0x562bfcdac400 session 0x562bfc6bc3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146661376 unmapped: 23945216 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:53.598832+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 254 ms_handle_reset con 0x562bfc9fe800 session 0x562bff1f8d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 254 ms_handle_reset con 0x562bfd30f000 session 0x562bfaf345a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146677760 unmapped: 23928832 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:54.599002+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.230003357s of 10.254796028s, submitted: 315
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 254 heartbeat osd_stat(store_statfs(0x1b46a8000/0x0/0x1bfc00000, data 0x4ad5fd1/0x4c84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 254 ms_handle_reset con 0x562bfdb56400 session 0x562bfc434d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146677760 unmapped: 23928832 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:55.599186+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150896640 unmapped: 19709952 heap: 170606592 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:56.599356+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150642688 unmapped: 32571392 heap: 183214080 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 255 heartbeat osd_stat(store_statfs(0x1b0aa2000/0x0/0x1bfc00000, data 0x86d82e2/0x888b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673495 data_alloc: 285212672 data_used: 5767168
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:57.599547+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 28327936 heap: 183214080 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:58.604195+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159113216 unmapped: 24100864 heap: 183214080 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:59.604338+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca6ac00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146628608 unmapped: 40787968 heap: 187416576 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 255 ms_handle_reset con 0x562bfca6ac00 session 0x562bfaf35a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:00.604480+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150888448 unmapped: 40730624 heap: 191619072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:01.604574+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcd1b000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159318016 unmapped: 32301056 heap: 191619072 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4907323 data_alloc: 285212672 data_used: 5779456
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:02.604757+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 heartbeat osd_stat(store_statfs(0x19de9f000/0x0/0x1bfc00000, data 0x1b2da575/0x1b48e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,1,1,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 ms_handle_reset con 0x562bfcd1b000 session 0x562bfc24de00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 146808832 unmapped: 49012736 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:03.604979+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 ms_handle_reset con 0x562bfc9fe800 session 0x562bff2e6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 155271168 unmapped: 40550400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:04.605672+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca6ac00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 6.404842854s of 10.006223679s, submitted: 217
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 151306240 unmapped: 44515328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:05.605952+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 ms_handle_reset con 0x562bfdf05000 session 0x562bfc9f30e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 256 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 153313280 unmapped: 42508288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:06.606179+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfca6ac00 session 0x562bfc6bcb40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 49
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 153731072 unmapped: 42090496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 6519347 data_alloc: 285212672 data_used: 5799936
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:07.606428+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 heartbeat osd_stat(store_statfs(0x18ea95000/0x0/0x1bfc00000, data 0x2a6dcb4d/0x2a898000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,0,3,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159350784 unmapped: 36470784 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfdb57800 session 0x562bfaed7a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:08.606573+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb10000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfdb10000 session 0x562bfa57c000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfa5c9800 session 0x562bfc1a65a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb10000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfdb10000 session 0x562bfc49af00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfc9fe800 session 0x562bfc1a61e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca6ac00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150978560 unmapped: 44843008 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:09.606805+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfdb57800 session 0x562bfc4aaf00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfca6ac00 session 0x562bff1f8f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfa5c9800 session 0x562bfa0a1680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149585920 unmapped: 46235648 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9fe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfc9fe800 session 0x562bfc1a65a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:10.606979+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb10000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfdb10000 session 0x562bfa57c000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 heartbeat osd_stat(store_statfs(0x1b1e96000/0x0/0x1bfc00000, data 0x4adcc92/0x4c97000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdf05000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 ms_handle_reset con 0x562bfdf05000 session 0x562bfaed7a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 257 handle_osd_map epochs [257,258], i have 257, src has [1,258]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 258 ms_handle_reset con 0x562bfdb57800 session 0x562bfc2303c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150659072 unmapped: 45162496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:11.607143+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb57800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 258 ms_handle_reset con 0x562bfa5c9800 session 0x562bfa38af00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 258 ms_handle_reset con 0x562bfb19bc00 session 0x562bfe34b680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150708224 unmapped: 45113344 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2399540 data_alloc: 285212672 data_used: 5812224
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:12.607378+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 259 ms_handle_reset con 0x562bfdb57800 session 0x562bfc22f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150708224 unmapped: 45113344 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 259 ms_handle_reset con 0x562bfd30e800 session 0x562bff2e6f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce2000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:13.607581+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30f800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 259 ms_handle_reset con 0x562bfd30f800 session 0x562bfbebcd20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 259 ms_handle_reset con 0x562bfcce2000 session 0x562bfc24de00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 260 ms_handle_reset con 0x562bfa5c9800 session 0x562bfa57cf00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150724608 unmapped: 45096960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:14.607982+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 260 heartbeat osd_stat(store_statfs(0x1b428d000/0x0/0x1bfc00000, data 0x4ae3a89/0x4ca0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfb19bc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 7.692381382s of 10.221053123s, submitted: 642
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 260 ms_handle_reset con 0x562bfb19bc00 session 0x562bfc22ba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 260 ms_handle_reset con 0x562bfd30e800 session 0x562bfaf345a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150732800 unmapped: 45088768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:15.608509+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150740992 unmapped: 45080576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:16.608783+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdebe800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfdebe800 session 0x562bfe34b860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150740992 unmapped: 45080576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2417046 data_alloc: 285212672 data_used: 5824512
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:17.609044+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfa5c9800 session 0x562bff1f9a40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce2000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfcce2000 session 0x562bfe34f860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfd30e800 session 0x562bfd5eda40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150396928 unmapped: 45424640 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfcaae800 session 0x562bfb2b0d20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:18.609366+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaaf000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfcaaf000 session 0x562bfbfcf4a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfa5c9800 session 0x562bfbef92c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150413312 unmapped: 45408256 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:19.609546+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 ms_handle_reset con 0x562bfcaae800 session 0x562bfa38ba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 heartbeat osd_stat(store_statfs(0x1b3399000/0x0/0x1bfc00000, data 0x59d5fd2/0x5b93000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149372928 unmapped: 46448640 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:20.609706+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcce2000
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149389312 unmapped: 46432256 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 262 ms_handle_reset con 0x562bfcce2000 session 0x562bfc53a3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30e800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:21.609843+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 263 ms_handle_reset con 0x562bfd30e800 session 0x562bfaed5680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 263 handle_osd_map epochs [262,263], i have 263, src has [1,263]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149397504 unmapped: 46424064 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 263 heartbeat osd_stat(store_statfs(0x1b338f000/0x0/0x1bfc00000, data 0x59da788/0x5b9c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:22.610056+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2552612 data_alloc: 285212672 data_used: 5836800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc9ffc00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 263 ms_handle_reset con 0x562bfc9ffc00 session 0x562bfaeda1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 263 ms_handle_reset con 0x562bfa5c9800 session 0x562bff564960
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149340160 unmapped: 46481408 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:23.610287+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149340160 unmapped: 46481408 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:24.610495+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 263 handle_osd_map epochs [263,264], i have 263, src has [1,264]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.016990662s of 10.088974953s, submitted: 257
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149348352 unmapped: 46473216 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:25.610650+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfcaae800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 264 ms_handle_reset con 0x562bfcaae800 session 0x562bff1f90e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149348352 unmapped: 46473216 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:26.610922+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfceb4400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 265 ms_handle_reset con 0x562bfceb4400 session 0x562bff565c20
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149364736 unmapped: 46456832 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 265 ms_handle_reset con 0x562bfc2f1400 session 0x562bfc42a1e0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:27.611116+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2447138 data_alloc: 285212672 data_used: 5861376
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 265 heartbeat osd_stat(store_statfs(0x1b4276000/0x0/0x1bfc00000, data 0x4aeeeca/0x4cb6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfdb56800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 265 ms_handle_reset con 0x562bfdb56800 session 0x562bff5645a0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149381120 unmapped: 46440448 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:28.611289+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 50
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149446656 unmapped: 46374912 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:29.611422+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfa5c9800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 266 ms_handle_reset con 0x562bfa5c9800 session 0x562bfefee3c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 266 heartbeat osd_stat(store_statfs(0x1b4275000/0x0/0x1bfc00000, data 0x4af1285/0x4cb8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149463040 unmapped: 46358528 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:30.611581+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc2f1400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 266 ms_handle_reset con 0x562bfc2f1400 session 0x562bfc2103c0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149479424 unmapped: 46342144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:31.611764+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 266 heartbeat osd_stat(store_statfs(0x1b4274000/0x0/0x1bfc00000, data 0x4af13dc/0x4cb8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149504000 unmapped: 46317568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:32.611962+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2448599 data_alloc: 285212672 data_used: 5857280
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149504000 unmapped: 46317568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:33.612146+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149504000 unmapped: 46317568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:34.612284+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.582780838s of 10.016024590s, submitted: 120
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149504000 unmapped: 46317568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:35.612465+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 266 heartbeat osd_stat(store_statfs(0x1b4276000/0x0/0x1bfc00000, data 0x4af1704/0x4cb8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149504000 unmapped: 46317568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 266 handle_osd_map epochs [266,267], i have 266, src has [1,267]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:36.612633+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149528576 unmapped: 46292992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:37.612747+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2456353 data_alloc: 285212672 data_used: 5869568
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149528576 unmapped: 46292992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:38.612880+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149528576 unmapped: 46292992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:39.613016+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149528576 unmapped: 46292992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:40.613143+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149528576 unmapped: 46292992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:41.613309+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 267 heartbeat osd_stat(store_statfs(0x1b4270000/0x0/0x1bfc00000, data 0x4af3c0d/0x4cbe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149528576 unmapped: 46292992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:42.613430+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2456007 data_alloc: 285212672 data_used: 5869568
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149536768 unmapped: 46284800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:43.613539+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149536768 unmapped: 46284800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:44.613657+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.998409271s of 10.171700478s, submitted: 41
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149536768 unmapped: 46284800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:45.613798+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 267 handle_osd_map epochs [267,268], i have 267, src has [1,268]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149536768 unmapped: 46284800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:46.613971+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 268 heartbeat osd_stat(store_statfs(0x1b426d000/0x0/0x1bfc00000, data 0x4af61f0/0x4cc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149544960 unmapped: 46276608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:47.614192+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2458433 data_alloc: 285212672 data_used: 5881856
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149544960 unmapped: 46276608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:48.614317+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149544960 unmapped: 46276608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:49.614499+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149544960 unmapped: 46276608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:50.614666+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 268 heartbeat osd_stat(store_statfs(0x1b426e000/0x0/0x1bfc00000, data 0x4af6253/0x4cc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149553152 unmapped: 46268416 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:51.614819+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149553152 unmapped: 46268416 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:52.614951+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2467089 data_alloc: 285212672 data_used: 5894144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 270 heartbeat osd_stat(store_statfs(0x1b4269000/0x0/0x1bfc00000, data 0x4af86c4/0x4cc4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:53.615132+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:54.615305+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.779236794s of 10.156165123s, submitted: 106
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:55.615485+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 270 heartbeat osd_stat(store_statfs(0x1b4264000/0x0/0x1bfc00000, data 0x4afac50/0x4cc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [0,0,0,1])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:56.615653+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 270 heartbeat osd_stat(store_statfs(0x1b4266000/0x0/0x1bfc00000, data 0x4afad49/0x4cc8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:57.615830+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2468381 data_alloc: 285212672 data_used: 5894144
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:58.616008+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 270 heartbeat osd_stat(store_statfs(0x1b4266000/0x0/0x1bfc00000, data 0x4afad49/0x4cc8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:59.616208+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:00.616363+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:01.616530+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b4263000/0x0/0x1bfc00000, data 0x4afae71/0x4cca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:02.616656+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2479063 data_alloc: 285212672 data_used: 5906432
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:03.616840+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:04.617033+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149577728 unmapped: 46243840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:05.617229+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149594112 unmapped: 46227456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b425e000/0x0/0x1bfc00000, data 0x4afd39e/0x4cd0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:06.617411+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149594112 unmapped: 46227456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.709307671s of 11.862863541s, submitted: 38
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:07.617558+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149610496 unmapped: 46211072 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2479575 data_alloc: 285212672 data_used: 5906432
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:08.617721+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149610496 unmapped: 46211072 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:09.617877+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 46202880 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfca28400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:10.618015+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 46202880 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b425d000/0x0/0x1bfc00000, data 0x4afd43f/0x4cd0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:11.618228+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149626880 unmapped: 46194688 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:12.618387+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149626880 unmapped: 46194688 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2480539 data_alloc: 285212672 data_used: 5906432
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b425d000/0x0/0x1bfc00000, data 0x4afd5d4/0x4ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:13.618561+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149626880 unmapped: 46194688 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:14.618682+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149626880 unmapped: 46194688 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:15.618805+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149635072 unmapped: 46186496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:16.618964+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149635072 unmapped: 46186496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.807858467s of 10.008973122s, submitted: 42
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:17.619151+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149635072 unmapped: 46186496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2478359 data_alloc: 285212672 data_used: 5906432
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b4261000/0x0/0x1bfc00000, data 0x4afd610/0x4ccd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:18.619293+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149635072 unmapped: 46186496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:19.619426+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149643264 unmapped: 46178304 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:20.619561+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149667840 unmapped: 46153728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:21.619715+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149667840 unmapped: 46153728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:22.619894+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149667840 unmapped: 46153728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2482215 data_alloc: 285212672 data_used: 5906432
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:23.620114+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149667840 unmapped: 46153728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b4261000/0x0/0x1bfc00000, data 0x4afd99d/0x4ccd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:24.620274+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149700608 unmapped: 46120960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:25.620446+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149716992 unmapped: 46104576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:26.620629+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 149716992 unmapped: 46104576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.850010872s of 10.008014679s, submitted: 35
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:27.620785+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150765568 unmapped: 45056000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2481289 data_alloc: 285212672 data_used: 5906432
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:28.620941+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150773760 unmapped: 45047808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:29.621181+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150806528 unmapped: 45015040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b425e000/0x0/0x1bfc00000, data 0x4afdc67/0x4cd0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:30.621357+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150814720 unmapped: 45006848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:31.621522+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150839296 unmapped: 44982272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:32.621757+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150839296 unmapped: 44982272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2488337 data_alloc: 285212672 data_used: 5918720
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:33.621952+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150847488 unmapped: 44974080 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:34.622128+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150847488 unmapped: 44974080 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 272 heartbeat osd_stat(store_statfs(0x1b425c000/0x0/0x1bfc00000, data 0x4b0024e/0x4cd2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:35.622316+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150855680 unmapped: 44965888 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:36.622516+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150855680 unmapped: 44965888 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:37.622716+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150872064 unmapped: 44949504 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2490377 data_alloc: 285212672 data_used: 5931008
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.687522888s of 11.009541512s, submitted: 83
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:38.622863+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150872064 unmapped: 44949504 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:39.623230+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b4256000/0x0/0x1bfc00000, data 0x4b02720/0x4cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150888448 unmapped: 44933120 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:40.623404+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150896640 unmapped: 44924928 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:41.623599+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150904832 unmapped: 44916736 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:42.623771+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150921216 unmapped: 44900352 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2493353 data_alloc: 285212672 data_used: 5931008
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:43.623889+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150921216 unmapped: 44900352 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:44.624043+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150929408 unmapped: 44892160 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b4256000/0x0/0x1bfc00000, data 0x4b029b0/0x4cd8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:45.624171+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150929408 unmapped: 44892160 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:46.624302+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150953984 unmapped: 44867584 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:47.624470+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150953984 unmapped: 44867584 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2493225 data_alloc: 285212672 data_used: 5931008
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.834547043s of 10.007121086s, submitted: 33
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:48.624602+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:49.624765+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:50.624932+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b4256000/0x0/0x1bfc00000, data 0x4b02ae0/0x4cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:51.625090+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b4257000/0x0/0x1bfc00000, data 0x4b02b45/0x4cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:52.625279+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495141 data_alloc: 285212672 data_used: 5931008
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:53.625406+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:54.625544+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b4253000/0x0/0x1bfc00000, data 0x4b02cd6/0x4cd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:55.625696+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 150962176 unmapped: 44859392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:56.625939+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 151142400 unmapped: 44679168 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:57.626145+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 151224320 unmapped: 44597248 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2510639 data_alloc: 285212672 data_used: 5931008
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.775051117s of 10.028513908s, submitted: 56
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:58.626296+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 151519232 unmapped: 44302336 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:59.626448+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 153673728 unmapped: 42147840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:00.626588+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b4155000/0x0/0x1bfc00000, data 0x4c025de/0x4dd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 154157056 unmapped: 41664512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:01.626718+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 154386432 unmapped: 41435136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:02.626868+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 154402816 unmapped: 41418752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2526301 data_alloc: 285212672 data_used: 5947392
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:03.627130+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 274 heartbeat osd_stat(store_statfs(0x1b40e3000/0x0/0x1bfc00000, data 0x4c7394d/0x4e4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 154402816 unmapped: 41418752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:04.655791+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 274 heartbeat osd_stat(store_statfs(0x1b40bf000/0x0/0x1bfc00000, data 0x4c96611/0x4e6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 155590656 unmapped: 40230912 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 274 heartbeat osd_stat(store_statfs(0x1b40bf000/0x0/0x1bfc00000, data 0x4c96611/0x4e6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:05.655959+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 156434432 unmapped: 39387136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:06.656163+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 156499968 unmapped: 39321600 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 51
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:07.656342+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 156737536 unmapped: 39084032 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2542139 data_alloc: 285212672 data_used: 5947392
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:08.656487+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.618061066s of 10.322961807s, submitted: 169
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 274 heartbeat osd_stat(store_statfs(0x1b4015000/0x0/0x1bfc00000, data 0x4d406f9/0x4f17000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 157163520 unmapped: 38658048 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:09.656644+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 158326784 unmapped: 37494784 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:10.656847+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 158416896 unmapped: 37404672 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:11.657045+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 158507008 unmapped: 37314560 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:12.657212+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159727616 unmapped: 36093952 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 275 heartbeat osd_stat(store_statfs(0x1b3f43000/0x0/0x1bfc00000, data 0x4e0ea33/0x4fe8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2560523 data_alloc: 285212672 data_used: 5959680
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:13.657388+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159727616 unmapped: 36093952 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:14.657561+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159014912 unmapped: 36806656 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:15.657701+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b3eee000/0x0/0x1bfc00000, data 0x4e66767/0x5040000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159047680 unmapped: 36773888 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:16.657873+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159211520 unmapped: 36610048 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b3e7c000/0x0/0x1bfc00000, data 0x4ed59b3/0x50b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:17.658048+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159555584 unmapped: 36265984 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2560647 data_alloc: 285212672 data_used: 5971968
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:18.658208+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 159555584 unmapped: 36265984 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.578546524s of 10.180689812s, submitted: 179
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:19.658387+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b3e4a000/0x0/0x1bfc00000, data 0x4f0a1bf/0x50e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 160612352 unmapped: 35209216 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:20.658579+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 160391168 unmapped: 35430400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b3e4a000/0x0/0x1bfc00000, data 0x4f0a1bf/0x50e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:21.658714+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 160391168 unmapped: 35430400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b3e4a000/0x0/0x1bfc00000, data 0x4f0a1bf/0x50e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 276 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:22.658869+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161439744 unmapped: 34381824 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2563847 data_alloc: 285212672 data_used: 5984256
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:23.659022+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161439744 unmapped: 34381824 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:24.659209+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 160391168 unmapped: 35430400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 277 heartbeat osd_stat(store_statfs(0x1b3a46000/0x0/0x1bfc00000, data 0x4f0c551/0x50e8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:25.659368+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 160391168 unmapped: 35430400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:26.659559+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161447936 unmapped: 34373632 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:27.659740+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161447936 unmapped: 34373632 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2569669 data_alloc: 285212672 data_used: 5984256
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:28.659908+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161456128 unmapped: 34365440 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.292652130s of 10.522511482s, submitted: 55
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:29.660034+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161456128 unmapped: 34365440 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:30.660186+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161456128 unmapped: 34365440 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 277 heartbeat osd_stat(store_statfs(0x1b3a45000/0x0/0x1bfc00000, data 0x4f0c912/0x50e8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 277 handle_osd_map epochs [278,278], i have 278, src has [1,278]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 277 handle_osd_map epochs [278,278], i have 278, src has [1,278]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:31.660344+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161464320 unmapped: 34357248 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:32.660528+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161472512 unmapped: 34349056 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2573313 data_alloc: 285212672 data_used: 5996544
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:33.660745+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161472512 unmapped: 34349056 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfd30ec00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:34.660926+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161669120 unmapped: 34152448 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:35.661129+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 279 ms_handle_reset con 0x562bfd30ec00 session 0x562bfa57d860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 279 ms_handle_reset con 0x562bfca28400 session 0x562bfd23e780
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162054144 unmapped: 33767424 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfceb4800
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:36.661299+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b3237000/0x0/0x1bfc00000, data 0x57112b6/0x58f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162054144 unmapped: 33767424 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 280 ms_handle_reset con 0x562bfceb4800 session 0x562bfc4aba40
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 52
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:37.661561+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 280 heartbeat osd_stat(store_statfs(0x1b3a34000/0x0/0x1bfc00000, data 0x4f13813/0x50f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162062336 unmapped: 33759232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2590374 data_alloc: 285212672 data_used: 6008832
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:38.661760+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162062336 unmapped: 33759232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.150407791s of 10.001079559s, submitted: 433
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:39.661972+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162062336 unmapped: 33759232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:40.662187+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 280 handle_osd_map epochs [280,281], i have 280, src has [1,281]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:41.662345+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b4a37000/0x0/0x1bfc00000, data 0x4f15d3d/0x50f7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:42.662554+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2590125 data_alloc: 285212672 data_used: 6033408
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:43.662721+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:44.663388+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:45.663627+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 282 heartbeat osd_stat(store_statfs(0x1b4a34000/0x0/0x1bfc00000, data 0x4f180df/0x50fa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 282 handle_osd_map epochs [283,283], i have 282, src has [1,283]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 282 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 283 heartbeat osd_stat(store_statfs(0x1b4a34000/0x0/0x1bfc00000, data 0x4f180df/0x50fa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:46.663890+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:47.664204+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2595455 data_alloc: 285212672 data_used: 6045696
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:48.664358+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 34521088 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b4a2b000/0x0/0x1bfc00000, data 0x4f1c879/0x5102000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.663481712s of 10.022708893s, submitted: 124
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:49.664657+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161308672 unmapped: 34512896 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:50.664833+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161308672 unmapped: 34512896 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:51.665300+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 284 handle_osd_map epochs [284,285], i have 284, src has [1,285]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161308672 unmapped: 34512896 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 285 heartbeat osd_stat(store_statfs(0x1b4a28000/0x0/0x1bfc00000, data 0x4f1c97d/0x5104000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:52.665470+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 285 handle_osd_map epochs [285,286], i have 285, src has [1,286]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2607513 data_alloc: 285212672 data_used: 6057984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:53.666010+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:54.666410+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:55.666609+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 286 heartbeat osd_stat(store_statfs(0x1b4a1d000/0x0/0x1bfc00000, data 0x4f211d8/0x510c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:56.666809+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:57.667020+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2605465 data_alloc: 285212672 data_used: 6057984
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:58.667233+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 286 heartbeat osd_stat(store_statfs(0x1b4a1f000/0x0/0x1bfc00000, data 0x4f2123e/0x510d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161325056 unmapped: 34496512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.783012390s of 10.071023941s, submitted: 77
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:59.667404+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161349632 unmapped: 34471936 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:00.667565+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 286 heartbeat osd_stat(store_statfs(0x1b4a20000/0x0/0x1bfc00000, data 0x4f2120e/0x510c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161366016 unmapped: 34455552 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 286 handle_osd_map epochs [287,287], i have 286, src has [1,287]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:01.667777+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 287 handle_osd_map epochs [287,288], i have 287, src has [1,288]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161390592 unmapped: 34430976 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:02.667910+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 288 heartbeat osd_stat(store_statfs(0x1b4a1a000/0x0/0x1bfc00000, data 0x4f25924/0x5114000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161398784 unmapped: 34422784 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2614847 data_alloc: 285212672 data_used: 6078464
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:03.668107+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161406976 unmapped: 34414592 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:04.668271+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161406976 unmapped: 34414592 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 288 handle_osd_map epochs [289,289], i have 288, src has [1,289]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 289 heartbeat osd_stat(store_statfs(0x1b4a17000/0x0/0x1bfc00000, data 0x4f25a28/0x5116000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [0,0,0,2])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:05.668425+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161447936 unmapped: 34373632 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:06.668595+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161464320 unmapped: 34357248 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:07.668805+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161464320 unmapped: 34357248 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2620241 data_alloc: 285212672 data_used: 6090752
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:08.668972+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161480704 unmapped: 34340864 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:09.669163+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161480704 unmapped: 34340864 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.250058174s of 10.616589546s, submitted: 124
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:10.669353+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161497088 unmapped: 34324480 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 289 heartbeat osd_stat(store_statfs(0x1b4a16000/0x0/0x1bfc00000, data 0x4f27f9d/0x5117000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:11.669548+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 289 handle_osd_map epochs [289,290], i have 289, src has [1,290]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161513472 unmapped: 34308096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:12.669691+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161538048 unmapped: 34283520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a11000/0x0/0x1bfc00000, data 0x4f2a359/0x511b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2621815 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:13.669864+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161538048 unmapped: 34283520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:14.670041+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161538048 unmapped: 34283520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:15.670192+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a12000/0x0/0x1bfc00000, data 0x4f2a2f7/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161554432 unmapped: 34267136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:16.670378+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:17.670563+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2620631 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:18.670742+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:19.670938+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.855056763s of 10.032669067s, submitted: 42
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:20.671124+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a13000/0x0/0x1bfc00000, data 0x4f2a3c6/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:21.671341+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:22.671573+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a13000/0x0/0x1bfc00000, data 0x4f2a3c6/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2620583 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:23.671755+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161562624 unmapped: 34258944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:24.671866+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a14000/0x0/0x1bfc00000, data 0x4f2a3f8/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161570816 unmapped: 34250752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:25.672020+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161570816 unmapped: 34250752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:26.672258+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161619968 unmapped: 34201600 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:27.672454+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161644544 unmapped: 34177024 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2624615 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:28.672616+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a10000/0x0/0x1bfc00000, data 0x4f2a6bc/0x511c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161644544 unmapped: 34177024 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:29.672836+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161660928 unmapped: 34160640 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:30.673014+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161660928 unmapped: 34160640 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:31.673223+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161660928 unmapped: 34160640 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 12.245940208s of 12.380335808s, submitted: 27
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:32.673405+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161669120 unmapped: 34152448 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2622527 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:33.673561+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161693696 unmapped: 34127872 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:34.673742+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a13000/0x0/0x1bfc00000, data 0x4f2a689/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161693696 unmapped: 34127872 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:35.673941+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161693696 unmapped: 34127872 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:36.674144+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161701888 unmapped: 34119680 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:37.674337+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161701888 unmapped: 34119680 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2621485 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:38.674518+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161701888 unmapped: 34119680 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:39.674676+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a13000/0x0/0x1bfc00000, data 0x4f2a656/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 161701888 unmapped: 34119680 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:40.674856+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162791424 unmapped: 33030144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:41.675024+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162791424 unmapped: 33030144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.898042679s of 10.008740425s, submitted: 24
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:42.675190+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a11000/0x0/0x1bfc00000, data 0x4f2a750/0x511b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162791424 unmapped: 33030144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623125 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:43.675360+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162791424 unmapped: 33030144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:44.675516+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162791424 unmapped: 33030144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:45.675720+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a13000/0x0/0x1bfc00000, data 0x4f2a7b5/0x511b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162791424 unmapped: 33030144 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:46.675924+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162816000 unmapped: 33005568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:47.676140+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a12000/0x0/0x1bfc00000, data 0x4f2a781/0x511b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162816000 unmapped: 33005568 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623253 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:48.676276+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162824192 unmapped: 32997376 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:49.676462+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162824192 unmapped: 32997376 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:50.676643+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162848768 unmapped: 32972800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:51.676828+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a10000/0x0/0x1bfc00000, data 0x4f2a877/0x511c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162848768 unmapped: 32972800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.907809258s of 10.007678986s, submitted: 18
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:52.676989+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162889728 unmapped: 32931840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2624635 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:53.677124+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162889728 unmapped: 32931840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:54.677300+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a11000/0x0/0x1bfc00000, data 0x4f2a816/0x511b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162889728 unmapped: 32931840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:55.677440+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a11000/0x0/0x1bfc00000, data 0x4f2a816/0x511b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162889728 unmapped: 32931840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:56.677614+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162906112 unmapped: 32915456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:57.677763+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162906112 unmapped: 32915456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2627339 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:58.677904+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162906112 unmapped: 32915456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:59.678030+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162906112 unmapped: 32915456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:00.678158+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a10000/0x0/0x1bfc00000, data 0x4f2a963/0x511d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162906112 unmapped: 32915456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:01.678340+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162906112 unmapped: 32915456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:02.678475+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.984338760s of 10.063252449s, submitted: 15
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a10000/0x0/0x1bfc00000, data 0x4f2a963/0x511d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [0,0,0,2])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162922496 unmapped: 32899072 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2627467 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:03.678618+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162922496 unmapped: 32899072 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:04.678823+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162930688 unmapped: 32890880 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:05.679014+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:06.679265+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:07.679399+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a10000/0x0/0x1bfc00000, data 0x4f2a8b2/0x511d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2627339 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:08.679585+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:09.679749+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:10.679914+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:11.680124+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a10000/0x0/0x1bfc00000, data 0x4f2a8b2/0x511d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162971648 unmapped: 32849920 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:12.680305+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.917514801s of 10.002973557s, submitted: 18
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162979840 unmapped: 32841728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2625173 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:13.680516+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a14000/0x0/0x1bfc00000, data 0x4f2a84f/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162979840 unmapped: 32841728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:14.680685+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a14000/0x0/0x1bfc00000, data 0x4f2a84f/0x511a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 162996224 unmapped: 32825344 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:15.680847+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163004416 unmapped: 32817152 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:16.681118+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163004416 unmapped: 32817152 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:17.681315+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a16000/0x0/0x1bfc00000, data 0x4f2a84e/0x5118000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163004416 unmapped: 32817152 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623777 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:18.681490+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163004416 unmapped: 32817152 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a16000/0x0/0x1bfc00000, data 0x4f2a84e/0x5118000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:19.681639+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163012608 unmapped: 32808960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:20.682441+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:21.682916+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:22.683142+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a16000/0x0/0x1bfc00000, data 0x4f2a84e/0x5118000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a16000/0x0/0x1bfc00000, data 0x4f2a84e/0x5118000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623777 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:23.684485+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:24.686644+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:25.686849+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:26.687209+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163020800 unmapped: 32800768 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:27.687542+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a16000/0x0/0x1bfc00000, data 0x4f2a84e/0x5118000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163028992 unmapped: 32792576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623777 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:28.687813+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 16.106819153s of 16.138708115s, submitted: 7
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163037184 unmapped: 32784384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:29.687915+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:30.688411+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:31.688717+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:32.689189+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2625177 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:33.689441+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:34.689812+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:35.689902+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163053568 unmapped: 32768000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:36.690118+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:37.690234+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2625001 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:38.690451+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:39.690626+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.348899841s of 11.369369507s, submitted: 3
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:40.690763+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:41.690985+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:42.691171+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:43.691258+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2625001 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163061760 unmapped: 32759808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:44.691569+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163069952 unmapped: 32751616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:45.691800+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163069952 unmapped: 32751616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:46.691962+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163069952 unmapped: 32751616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:47.692143+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163069952 unmapped: 32751616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:48.692290+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2625001 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163069952 unmapped: 32751616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:49.692450+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.979582787s of 10.000236511s, submitted: 2
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163078144 unmapped: 32743424 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:50.692688+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b4a15000/0x0/0x1bfc00000, data 0x4f2a8e9/0x5119000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b49ee000/0x0/0x1bfc00000, data 0x4f5094c/0x5140000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163094528 unmapped: 32727040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:51.692844+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b49ee000/0x0/0x1bfc00000, data 0x4f5094c/0x5140000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163094528 unmapped: 32727040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:52.693003+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163282944 unmapped: 32538624 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:53.693191+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2639595 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:54.693326+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163389440 unmapped: 32432128 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b37f9000/0x0/0x1bfc00000, data 0x4fa6098/0x5195000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:55.693451+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163594240 unmapped: 32227328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b37f9000/0x0/0x1bfc00000, data 0x4fa6098/0x5195000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:56.693616+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163594240 unmapped: 32227328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b37e7000/0x0/0x1bfc00000, data 0x4fb7c93/0x51a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:57.693743+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163610624 unmapped: 32210944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b37e7000/0x0/0x1bfc00000, data 0x4fb7cf8/0x51a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:58.693867+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163610624 unmapped: 32210944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2636067 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:59.693995+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163610624 unmapped: 32210944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.854078293s of 10.003557205s, submitted: 33
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:00.694120+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163823616 unmapped: 31997952 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b378b000/0x0/0x1bfc00000, data 0x5013e8b/0x5203000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:01.694249+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163823616 unmapped: 31997952 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:02.694389+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163823616 unmapped: 31997952 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b378b000/0x0/0x1bfc00000, data 0x5013e8b/0x5203000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:03.694534+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163979264 unmapped: 31842304 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2638513 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:04.694665+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165027840 unmapped: 30793728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:05.694808+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165027840 unmapped: 30793728 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:06.694981+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165109760 unmapped: 30711808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:07.695118+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164216832 unmapped: 31604736 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b373a000/0x0/0x1bfc00000, data 0x506507c/0x5254000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:08.695261+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164216832 unmapped: 31604736 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2641433 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b373a000/0x0/0x1bfc00000, data 0x506507c/0x5254000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:09.695402+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164216832 unmapped: 31604736 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.915359497s of 10.002251625s, submitted: 18
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:10.695531+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164200448 unmapped: 31621120 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:11.695679+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164200448 unmapped: 31621120 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3722000/0x0/0x1bfc00000, data 0x507cd3b/0x526c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:12.695846+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164200448 unmapped: 31621120 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:13.695996+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164323328 unmapped: 31498240 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2649649 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:14.696173+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164323328 unmapped: 31498240 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:15.696320+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164659200 unmapped: 31162368 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b36c7000/0x0/0x1bfc00000, data 0x50d818d/0x52c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:16.696588+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164659200 unmapped: 31162368 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:17.696720+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164823040 unmapped: 30998528 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3699000/0x0/0x1bfc00000, data 0x5106013/0x52f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:18.696850+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163586048 unmapped: 32235520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2648625 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:19.696987+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163586048 unmapped: 32235520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.852293968s of 10.003196716s, submitted: 30
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3699000/0x0/0x1bfc00000, data 0x51060dd/0x52f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:20.697109+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163774464 unmapped: 32047104 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:21.697237+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163774464 unmapped: 32047104 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:22.697366+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163782656 unmapped: 32038912 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:23.697500+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163700736 unmapped: 32120832 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2652233 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:24.697629+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163708928 unmapped: 32112640 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:25.697759+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 163897344 unmapped: 31924224 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b360e000/0x0/0x1bfc00000, data 0x5191137/0x5380000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:26.697932+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164954112 unmapped: 30867456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:27.698146+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165085184 unmapped: 30736384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:28.698296+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165085184 unmapped: 30736384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2654795 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b35e8000/0x0/0x1bfc00000, data 0x51b89e7/0x53a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:29.698404+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165085184 unmapped: 30736384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:30.698548+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165093376 unmapped: 30728192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:31.698684+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165093376 unmapped: 30728192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:32.698813+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165101568 unmapped: 30720000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:33.698966+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165101568 unmapped: 30720000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2655027 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:34.699139+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b35e7000/0x0/0x1bfc00000, data 0x51b8f5f/0x53a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165101568 unmapped: 30720000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 14.876365662s of 14.994490623s, submitted: 27
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:35.699273+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165298176 unmapped: 30523392 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:36.699475+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165412864 unmapped: 30408704 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:37.699599+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165412864 unmapped: 30408704 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:38.699780+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165412864 unmapped: 30408704 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2659299 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b35b8000/0x0/0x1bfc00000, data 0x51e87e3/0x53d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:39.699926+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165412864 unmapped: 30408704 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:40.700110+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:41.700259+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b35b8000/0x0/0x1bfc00000, data 0x51e87e3/0x53d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:42.700439+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:43.700584+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2659299 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:44.700754+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:45.700947+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b35b8000/0x0/0x1bfc00000, data 0x51e87e3/0x53d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:46.701181+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:47.701380+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165421056 unmapped: 30400512 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 12.869069099s of 12.891178131s, submitted: 6
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:48.701571+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165429248 unmapped: 30392320 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2660563 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b359c000/0x0/0x1bfc00000, data 0x5203564/0x53f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:49.701729+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165429248 unmapped: 30392320 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:50.701846+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165429248 unmapped: 30392320 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:51.702026+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165511168 unmapped: 30310400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b357c000/0x0/0x1bfc00000, data 0x5223660/0x5412000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:52.702230+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b357c000/0x0/0x1bfc00000, data 0x5223660/0x5412000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165511168 unmapped: 30310400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:53.702378+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165511168 unmapped: 30310400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2661671 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:54.702503+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165511168 unmapped: 30310400 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:55.702630+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165527552 unmapped: 30294016 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:56.702801+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165535744 unmapped: 30285824 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b351d000/0x0/0x1bfc00000, data 0x5282098/0x5471000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:57.702938+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165535744 unmapped: 30285824 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:58.703096+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165535744 unmapped: 30285824 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2668647 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.834485054s of 10.959936142s, submitted: 25
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:59.703195+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164749312 unmapped: 31072256 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:00.703355+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164749312 unmapped: 31072256 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b34e3000/0x0/0x1bfc00000, data 0x52bc7b0/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:01.703475+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164749312 unmapped: 31072256 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:02.703635+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164970496 unmapped: 30851072 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:03.703765+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164970496 unmapped: 30851072 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2669499 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:04.703886+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164995072 unmapped: 30826496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b34a9000/0x0/0x1bfc00000, data 0x52f64f9/0x54e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:05.704016+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165101568 unmapped: 30720000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:06.704225+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164880384 unmapped: 30941184 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:07.704373+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164880384 unmapped: 30941184 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:08.704492+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164888576 unmapped: 30932992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2671607 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:09.704640+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164888576 unmapped: 30932992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.026061058s of 11.147589684s, submitted: 25
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:10.704801+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3474000/0x0/0x1bfc00000, data 0x532aed4/0x551a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164888576 unmapped: 30932992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:11.704945+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164888576 unmapped: 30932992 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3474000/0x0/0x1bfc00000, data 0x532af9e/0x551a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:12.705093+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164896768 unmapped: 30924800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:13.705274+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164896768 unmapped: 30924800 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2671095 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:14.705422+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164904960 unmapped: 30916608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:15.705565+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164904960 unmapped: 30916608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:16.705732+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3475000/0x0/0x1bfc00000, data 0x532afcd/0x5519000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164904960 unmapped: 30916608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:17.705857+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3475000/0x0/0x1bfc00000, data 0x532afcd/0x5519000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164904960 unmapped: 30916608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:18.706007+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164904960 unmapped: 30916608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670405 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:19.706171+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164904960 unmapped: 30916608 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3475000/0x0/0x1bfc00000, data 0x532afcd/0x5519000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.951857567s of 10.000116348s, submitted: 11
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:20.706307+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164913152 unmapped: 30908416 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3475000/0x0/0x1bfc00000, data 0x532afcd/0x5519000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:21.706456+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164913152 unmapped: 30908416 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:22.706591+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 164913152 unmapped: 30908416 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:23.706753+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 29859840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2671805 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:24.706887+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 29859840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:25.707046+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3474000/0x0/0x1bfc00000, data 0x532b068/0x551a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 29859840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:26.707282+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 29859840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:27.707420+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165961728 unmapped: 29859840 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3474000/0x0/0x1bfc00000, data 0x532b068/0x551a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:28.707619+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165969920 unmapped: 29851648 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2671805 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:29.707766+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165969920 unmapped: 29851648 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.967976570s of 10.000982285s, submitted: 5
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:30.707948+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165978112 unmapped: 29843456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3472000/0x0/0x1bfc00000, data 0x532b19e/0x551c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:31.708120+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165978112 unmapped: 29843456 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:32.708291+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165986304 unmapped: 29835264 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:33.709672+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165986304 unmapped: 29835264 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674121 data_alloc: 285212672 data_used: 6103040
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:34.710848+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165986304 unmapped: 29835264 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:35.711554+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 165986304 unmapped: 29835264 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 heartbeat osd_stat(store_statfs(0x1b3475000/0x0/0x1bfc00000, data 0x532b161/0x5519000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 handle_osd_map epochs [291,291], i have 290, src has [1,291]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 handle_osd_map epochs [291,291], i have 291, src has [1,291]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 290 handle_osd_map epochs [291,291], i have 291, src has [1,291]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:36.711753+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:37.712423+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:38.712960+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2676043 data_alloc: 285212672 data_used: 6115328
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:39.713461+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.720867157s of 10.005661964s, submitted: 45
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:40.713661+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 291 heartbeat osd_stat(store_statfs(0x1b3470000/0x0/0x1bfc00000, data 0x532d588/0x551d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:41.713987+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:42.714344+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:43.714482+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 291 heartbeat osd_stat(store_statfs(0x1b3470000/0x0/0x1bfc00000, data 0x532d623/0x551e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 29802496 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2676931 data_alloc: 285212672 data_used: 6115328
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:44.714758+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166027264 unmapped: 29794304 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:45.715162+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166027264 unmapped: 29794304 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 291 heartbeat osd_stat(store_statfs(0x1b3470000/0x0/0x1bfc00000, data 0x532d623/0x551e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:46.715400+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166027264 unmapped: 29794304 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:47.715622+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 291 handle_osd_map epochs [292,292], i have 291, src has [1,292]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166035456 unmapped: 29786112 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:48.715929+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166035456 unmapped: 29786112 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2680957 data_alloc: 285212672 data_used: 6127616
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:49.716109+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166035456 unmapped: 29786112 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 292 handle_osd_map epochs [292,292], i have 292, src has [1,292]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.958819389s of 10.002717972s, submitted: 19
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 292 heartbeat osd_stat(store_statfs(0x1b346b000/0x0/0x1bfc00000, data 0x532f8c7/0x5522000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:50.716269+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166035456 unmapped: 29786112 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:51.716522+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 292 heartbeat osd_stat(store_statfs(0x1b346b000/0x0/0x1bfc00000, data 0x532f962/0x5523000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166035456 unmapped: 29786112 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:52.716702+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166060032 unmapped: 29761536 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:53.716869+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166060032 unmapped: 29761536 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2681363 data_alloc: 285212672 data_used: 6127616
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:54.717001+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166060032 unmapped: 29761536 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:55.717188+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166068224 unmapped: 29753344 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 292 heartbeat osd_stat(store_statfs(0x1b346d000/0x0/0x1bfc00000, data 0x532f9c0/0x5521000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 292 handle_osd_map epochs [293,293], i have 292, src has [1,293]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 292 handle_osd_map epochs [293,293], i have 293, src has [1,293]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:56.717356+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166084608 unmapped: 29736960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:57.717487+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166084608 unmapped: 29736960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:58.717645+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166084608 unmapped: 29736960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2684859 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:59.717787+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166084608 unmapped: 29736960 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.883934975s of 10.014445305s, submitted: 42
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:00.717933+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 293 heartbeat osd_stat(store_statfs(0x1b3468000/0x0/0x1bfc00000, data 0x5331de7/0x5525000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166100992 unmapped: 29720576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:01.718111+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 293 heartbeat osd_stat(store_statfs(0x1b3468000/0x0/0x1bfc00000, data 0x5331de7/0x5525000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166100992 unmapped: 29720576 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 293 heartbeat osd_stat(store_statfs(0x1b3468000/0x0/0x1bfc00000, data 0x5331de7/0x5525000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:02.718243+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166109184 unmapped: 29712384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:03.718374+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166109184 unmapped: 29712384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2684859 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:04.718499+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166109184 unmapped: 29712384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:05.718622+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166109184 unmapped: 29712384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:06.718791+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166109184 unmapped: 29712384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:07.718924+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _renew_subs
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _send_mon_message to mon.np0005538513 at v2:172.18.0.103:3300/0
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 293 handle_osd_map epochs [294,294], i have 293, src has [1,294]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166109184 unmapped: 29712384 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3468000/0x0/0x1bfc00000, data 0x5331de7/0x5525000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:08.719059+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:09.719223+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:10.719354+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:11.719485+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:12.719601+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:13.719745+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 ms_handle_reset con 0x562bf9a92000 session 0x562bfaf34f00
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: handle_auth_request added challenge on 0x562bfc484400
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:14.719885+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:15.720032+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166117376 unmapped: 29704192 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:16.720246+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:17.720385+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:18.720520+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:19.720656+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:20.720801+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:21.720940+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:22.721142+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:23.721277+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166125568 unmapped: 29696000 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:24.721414+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166133760 unmapped: 29687808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:25.721565+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166133760 unmapped: 29687808 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:26.721742+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166141952 unmapped: 29679616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:27.721875+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166141952 unmapped: 29679616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:28.722045+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166141952 unmapped: 29679616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:29.722237+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166141952 unmapped: 29679616 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:30.722376+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166150144 unmapped: 29671424 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:31.722525+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166150144 unmapped: 29671424 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:32.722677+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166158336 unmapped: 29663232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:33.722823+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166158336 unmapped: 29663232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:34.723240+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166158336 unmapped: 29663232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:35.723709+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166158336 unmapped: 29663232 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:36.724013+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166166528 unmapped: 29655040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:37.724666+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166166528 unmapped: 29655040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:38.725126+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166166528 unmapped: 29655040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:39.725715+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166166528 unmapped: 29655040 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:40.727220+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:41.728260+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:42.729287+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:43.729542+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:44.730248+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:45.730703+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:46.731209+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:47.731530+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166174720 unmapped: 29646848 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:48.731649+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166182912 unmapped: 29638656 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:49.731776+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166182912 unmapped: 29638656 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:50.731955+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166191104 unmapped: 29630464 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:51.732245+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166191104 unmapped: 29630464 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:52.732433+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166191104 unmapped: 29630464 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:53.732685+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166191104 unmapped: 29630464 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:54.732861+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166191104 unmapped: 29630464 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:55.733136+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166191104 unmapped: 29630464 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:56.733378+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:57.733525+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:58.733796+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:59.733932+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:00.734162+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:01.734319+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:02.734419+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:03.734614+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166199296 unmapped: 29622272 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687493 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:04.734741+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 166207488 unmapped: 29614080 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 65.436904907s of 65.458427429s, submitted: 30
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 ms_handle_reset con 0x562bfc9a1000 session 0x562bff2e7860
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:05.734905+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3464000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167641088 unmapped: 28180480 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:06.735118+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Got map version 53
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167649280 unmapped: 28172288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:07.735212+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167649280 unmapped: 28172288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:08.735355+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167649280 unmapped: 28172288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:09.735490+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167649280 unmapped: 28172288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:10.735657+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167649280 unmapped: 28172288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:11.735807+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167649280 unmapped: 28172288 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:12.735936+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:13.736159+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:14.736292+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:15.736425+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:16.736535+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:17.736684+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:18.736795+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:19.736963+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167657472 unmapped: 28164096 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:20.737123+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:21.737347+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:22.737543+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:23.737756+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:24.737919+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:25.738037+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:26.738267+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:27.738474+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167665664 unmapped: 28155904 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:28.738589+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167673856 unmapped: 28147712 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:29.738817+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167673856 unmapped: 28147712 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:30.738973+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167673856 unmapped: 28147712 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:31.739151+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167682048 unmapped: 28139520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:32.739323+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167682048 unmapped: 28139520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:33.739463+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167682048 unmapped: 28139520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:34.739597+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167682048 unmapped: 28139520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:35.739724+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167682048 unmapped: 28139520 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:36.739840+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:37.739963+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:38.740143+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:39.740730+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:40.740921+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:41.741611+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:42.741963+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:43.742645+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167690240 unmapped: 28131328 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:44.742832+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:45.742998+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:46.743396+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:47.743653+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:48.743927+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:49.744142+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:50.744548+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:51.744707+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 28123136 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:52.744924+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167706624 unmapped: 28114944 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:53.745081+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:54.745393+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:55.745534+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:56.745719+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:57.745856+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:58.746001+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:59.746108+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: bluestore.MempoolThread(0x562bf8adbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2686613 data_alloc: 285212672 data_used: 6139904
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:00.746213+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167723008 unmapped: 28098560 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: osd.4 294 heartbeat osd_stat(store_statfs(0x1b3465000/0x0/0x1bfc00000, data 0x533408b/0x5529000, compress 0x0/0x0/0x0, omap 0x649, meta 0x726f9b7), peers [0,1,2,3,5] op hist [])
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:01.746329+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167714816 unmapped: 28106752 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'config diff' '{prefix=config diff}'
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'config show' '{prefix=config show}'
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'counter dump' '{prefix=counter dump}'
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'counter schema' '{prefix=counter schema}'
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:02.746423+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167256064 unmapped: 28565504 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: tick
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_tickets
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:03.755500+0000)
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: prioritycache tune_memory target: 5709084876 mapped: 167460864 unmapped: 28360704 heap: 195821568 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:34 np0005538515.localdomain ceph-osd[33334]: do_command 'log dump' '{prefix=log dump}'
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69629 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59641 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 28 10:18:34 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3274056886' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:34 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49476 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59659 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69647 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49488 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/265827640' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59671 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.69587 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.59590 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.59602 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: pgmap v819: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.59620 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.69617 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.49464 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2092821165' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3138369626' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.69629 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.59641 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3274056886' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.49476 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/97215739' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2135208413' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/265827640' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3675476587' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2560522477' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69668 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49500 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 28 10:18:35 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1719564881' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:35 np0005538515.localdomain crontab[328239]: (root) LIST (root)
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] scanning for idle connections..
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: [volumes INFO mgr_util] cleaning up connections: []
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49512 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:35 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59683 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59692 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mon stat"} v 0)
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2168774809' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v820: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69704 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:36.324 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59707 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.59659 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.69647 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.49488 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.59671 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.69668 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.49500 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1719564881' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3738043283' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1773482231' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2168774809' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1437964439' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3561111865' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49533 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:36 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:36.693+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:36 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59725 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2392063071' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:36 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69740 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:37.240+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:37 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/97330575' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59755 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538515-yfkzhl[286184]: 2025-11-28T10:18:37.579+0000 7fccb8638640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:37 np0005538515.localdomain ceph-mgr[286188]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.49512 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.59683 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.59692 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: pgmap v820: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.69704 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.59707 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/87497652' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.49533 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.59725 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2392063071' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/842310788' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3259287151' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/97330575' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3115545767' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3687197011' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2236414565' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1340007337' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Nov 28 10:18:37 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/998719553' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2676165965' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v821: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3830129038' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3050153649' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3319289045' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:38 np0005538515.localdomain ceph-osd[33334]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 25K writes, 94K keys, 25K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 25K writes, 8676 syncs, 2.89 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 7417 writes, 27K keys, 7417 commit groups, 1.0 writes per commit group, ingest: 17.07 MB, 0.03 MB/s
                                                          Interval WAL: 7417 writes, 2930 syncs, 2.53 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.69740 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.59755 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1340007337' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/998719553' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3437061184' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2354334774' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/367041026' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2033329145' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2676165965' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3830129038' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/641110482' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2982477570' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2259440631' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/4237691921' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3050153649' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/3319289045' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4176004123' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:38 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:38.807 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1301512143' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Nov 28 10:18:38 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1939566436' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:55:57.362613+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:55:58.362767+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:55:59.362962+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 814433 data_alloc: 285212672 data_used: 2523136
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:00.363195+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:01.363388+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:02.363587+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:03.363729+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:04.363917+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 814433 data_alloc: 285212672 data_used: 2523136
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:05.364165+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:06.364308+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:07.365592+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:08.365862+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:09.366056+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 814433 data_alloc: 285212672 data_used: 2523136
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:10.367109+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83124224 unmapped: 1753088 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 heartbeat osd_stat(store_statfs(0x1b992e000/0x0/0x1bfc00000, data 0x20db6d9/0x2160000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:11.367541+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 35
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now 
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/4278362185
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc reconnect No active mgr available yet
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 81.796394348s of 81.861694336s, submitted: 15
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 ms_handle_reset con 0x55ab8e2f7800 session 0x55ab8dbd92c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 ms_handle_reset con 0x55ab8f197c00 session 0x55ab8dce0000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 ms_handle_reset con 0x55ab8e370800 session 0x55ab8df7e5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83288064 unmapped: 1589248 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:12.367720+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83435520 unmapped: 1441792 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 36
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: get_auth_request con 0x55ab8e370800 auth_method 0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:13.367886+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82919424 unmapped: 1957888 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 37
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f372000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:14.368054+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82354176 unmapped: 2523136 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:15.368411+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82354176 unmapped: 2523136 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 38
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:16.368634+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82501632 unmapped: 2375680 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:17.368819+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 39
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:18.369003+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:19.369246+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:20.369603+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:21.369786+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:22.370139+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:23.370378+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:24.370617+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:25.370788+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:26.370918+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:27.371114+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:28.371264+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:29.371418+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:30.371553+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:31.371703+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:32.371933+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:33.372141+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:34.372311+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82690048 unmapped: 2187264 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2760684413,v1:172.18.0.107:6811/2760684413]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:35.372484+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:36.372640+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:37.372792+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:38.372950+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:39.373153+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:40.373374+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:41.373551+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:42.373680+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:43.373829+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:44.374002+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:45.374172+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:46.374349+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:47.374534+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:48.374724+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:49.374870+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:50.375057+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:51.375245+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:52.375408+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:53.375586+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:54.375715+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:55.375880+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:56.375995+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:57.376162+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:58.376320+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:56:59.376471+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:00.376651+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:01.376845+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:02.376991+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:03.377149+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:04.377355+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:05.377498+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:06.377645+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:07.377788+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:08.377911+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:09.378087+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:10.378337+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:11.378464+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:12.379319+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:13.379471+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:14.380032+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:15.380484+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:16.380670+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:17.380999+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:18.381219+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:19.381797+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:20.382040+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:21.382246+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:22.382528+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:23.382681+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:24.382868+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:25.383105+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:26.383327+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:27.383547+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:28.383732+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:29.384016+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:30.384275+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 heartbeat osd_stat(store_statfs(0x1b992a000/0x0/0x1bfc00000, data 0x20ddda9/0x2164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:31.384473+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:32.384683+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:33.384868+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:34.385128+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82837504 unmapped: 2039808 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:35.385279+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 818105 data_alloc: 285212672 data_used: 2535424
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 83.950874329s of 84.016578674s, submitted: 17
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 41
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now 
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2760684413
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc reconnect No active mgr available yet
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 ms_handle_reset con 0x55ab8f199800 session 0x55ab8dc07e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 ms_handle_reset con 0x55ab8f372000 session 0x55ab8c0fa5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 ms_handle_reset con 0x55ab8e2f7800 session 0x55ab8dc51860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9925000/0x0/0x1bfc00000, data 0x20e0287/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82984960 unmapped: 1892352 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:36.385465+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 42
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: get_auth_request con 0x55ab8f19bc00 auth_method 0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_configure stats_period=5
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83083264 unmapped: 1794048 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:37.385612+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24e400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f197000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83083264 unmapped: 1794048 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:38.385781+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 43
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83083264 unmapped: 1794048 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:39.385940+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 83083264 unmapped: 1794048 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:40.386120+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 44
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:41.386288+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:42.386456+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:43.386619+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:44.386785+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:45.386956+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:46.387090+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:47.387272+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:48.387484+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:49.387694+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:50.387914+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:51.388141+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:52.388345+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:53.388756+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:54.389005+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:55.389203+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:56.389391+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:57.389587+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:58.389780+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:57:59.389950+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:00.390176+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:01.390337+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:02.390537+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:03.390697+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5064 writes, 22K keys, 5064 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5064 writes, 762 syncs, 6.65 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 214 writes, 503 keys, 214 commit groups, 1.0 writes per commit group, ingest: 0.48 MB, 0.00 MB/s
                                                          Interval WAL: 214 writes, 101 syncs, 2.12 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:04.390862+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:05.391111+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:06.391272+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:07.391451+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:08.391673+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:09.392172+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:10.392402+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:11.392600+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:12.392798+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:13.393004+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:14.393221+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:15.393428+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:16.393650+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:17.394429+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:18.395752+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:19.396396+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:20.397549+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:21.397766+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:22.397996+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:23.398375+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:24.399023+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:25.399391+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:26.399623+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:27.400022+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:28.400361+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:29.400531+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:30.400805+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:31.401011+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:32.401271+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:33.401482+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:34.401670+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:35.401869+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:36.402104+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:37.402280+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:38.402477+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:39.402653+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:40.402924+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:41.403136+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:42.403329+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:43.403501+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:44.403697+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:45.403875+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:46.404128+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:47.404300+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:48.404526+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:49.404990+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:50.405371+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:51.405639+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:52.405884+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:53.407399+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:54.407546+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:55.407971+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:56.408213+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:57.408432+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:58.408722+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:58:59.408927+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:00.409211+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821233 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:01.409485+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:02.409731+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e05d1/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:03.409996+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:04.410281+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82763776 unmapped: 2113536 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:05.410633+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 90.084037781s of 90.155387878s, submitted: 18
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821601 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:06.410841+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 45
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:07.411170+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e06eb/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:08.411509+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:09.411744+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:10.412028+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 821601 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:11.412239+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:12.412355+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e06eb/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:13.413290+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:14.413477+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x20e06eb/0x2168000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 82804736 unmapped: 2072576 heap: 84877312 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f374000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:15.413628+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.974635124s of 10.035637856s, submitted: 14
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84017152 unmapped: 16596992 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 879799 data_alloc: 285212672 data_used: 2547712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 93 ms_handle_reset con 0x55ab8f374000 session 0x55ab8c6ae1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:16.413793+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:17.413947+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:18.414175+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b9120000/0x0/0x1bfc00000, data 0x28e2acc/0x296d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:19.414394+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:20.414872+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:21.415441+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:22.416481+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:23.416901+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:24.417154+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:25.417334+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:26.418047+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:27.418598+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:28.419671+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:29.420834+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:30.421375+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:31.422045+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:32.422515+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:33.422889+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:34.423157+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:35.423997+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:36.424569+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:37.425161+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:38.425435+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:39.425643+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:40.425939+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:41.426117+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:42.426303+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:43.426464+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:44.426607+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:45.426754+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:46.426953+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:47.427116+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:48.427248+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:49.427990+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:50.428204+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:51.428417+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:52.428662+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:53.428876+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:54.429157+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:55.429362+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:56.429562+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:57.429740+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:58.429931+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T09:59:59.430108+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:00.430353+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:01.430526+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:02.430816+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:03.431035+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:04.431261+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:05.431452+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84107264 unmapped: 16506880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:06.431649+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84115456 unmapped: 16498688 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:07.431856+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84115456 unmapped: 16498688 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:08.432102+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84115456 unmapped: 16498688 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:09.432301+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84115456 unmapped: 16498688 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:10.432551+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84115456 unmapped: 16498688 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 887263 data_alloc: 285212672 data_used: 2560000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b911b000/0x0/0x1bfc00000, data 0x28e4e8a/0x2971000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:11.432738+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84115456 unmapped: 16498688 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:12.432895+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 56.632221222s of 56.718605042s, submitted: 14
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8f373000 session 0x55ab8dfd52c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f196000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8f196000 session 0x55ab8edf3860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84140032 unmapped: 16474112 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:13.433110+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84344832 unmapped: 16269312 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8e24fc00 session 0x55ab8edf3a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8e269800 session 0x55ab8edf2f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b9400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8e2b9400 session 0x55ab8dd06f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8e24fc00 session 0x55ab8dd06960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8e269800 session 0x55ab8dc46f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:14.433286+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84361216 unmapped: 16252928 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:15.433464+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84361216 unmapped: 16252928 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 933126 data_alloc: 285212672 data_used: 2564096
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:16.433626+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84434944 unmapped: 16179200 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b8bc5000/0x0/0x1bfc00000, data 0x2e3ce8a/0x2ec9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:17.433807+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84434944 unmapped: 16179200 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:18.433943+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 ms_handle_reset con 0x55ab8edefc00 session 0x55ab8edf2960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84074496 unmapped: 16539648 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:19.434144+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84230144 unmapped: 16384000 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:20.434380+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84230144 unmapped: 16384000 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 934527 data_alloc: 285212672 data_used: 2588672
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:21.434576+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b8bc4000/0x0/0x1bfc00000, data 0x2e3cead/0x2eca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84189184 unmapped: 16424960 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:22.434719+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84189184 unmapped: 16424960 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:23.434904+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84189184 unmapped: 16424960 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b8bc4000/0x0/0x1bfc00000, data 0x2e3cead/0x2eca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:24.435133+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.316813469s of 12.474899292s, submitted: 32
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84238336 unmapped: 16375808 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b8bc4000/0x0/0x1bfc00000, data 0x2e3cead/0x2eca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:25.436050+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 95 ms_handle_reset con 0x55ab8cc2d400 session 0x55ab8dd06000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84287488 unmapped: 16326656 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 940176 data_alloc: 285212672 data_used: 2605056
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:26.437326+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84287488 unmapped: 16326656 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:27.438226+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84475904 unmapped: 16138240 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f197800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:28.438630+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 ms_handle_reset con 0x55ab8dcfb800 session 0x55ab8dc07e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8edf30e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 ms_handle_reset con 0x55ab8e26b800 session 0x55ab8c63d0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84566016 unmapped: 16048128 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:29.438851+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 heartbeat osd_stat(store_statfs(0x1b8bbc000/0x0/0x1bfc00000, data 0x2e4167d/0x2ed2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84566016 unmapped: 16048128 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:30.439921+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 heartbeat osd_stat(store_statfs(0x1b8bbc000/0x0/0x1bfc00000, data 0x2e4167d/0x2ed2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 84566016 unmapped: 16048128 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 941704 data_alloc: 285212672 data_used: 2605056
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:31.440441+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 85794816 unmapped: 14819328 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:32.440610+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 heartbeat osd_stat(store_statfs(0x1b7f40000/0x0/0x1bfc00000, data 0x3ab667d/0x3b47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89645056 unmapped: 10969088 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:33.440792+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89620480 unmapped: 10993664 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:34.440990+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5cc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 ms_handle_reset con 0x55ab8ee5cc00 session 0x55ab8edf34a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df90800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.270826340s of 10.000038147s, submitted: 169
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8f102000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8f197800 session 0x55ab8df7e1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8df90800 session 0x55ab8c790780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 87719936 unmapped: 12894208 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:35.441145+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 87621632 unmapped: 12992512 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 910326 data_alloc: 285212672 data_used: 2592768
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:36.441275+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8dc06960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8f10c1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8cbead20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2d800 session 0x55ab8f1021e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8dc5dc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8dc36000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b862c000/0x0/0x1bfc00000, data 0x31e090e/0x3273000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89808896 unmapped: 10805248 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:37.441451+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df90800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8df90800 session 0x55ab8dc43e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89808896 unmapped: 10805248 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:38.441730+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8dc42960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89808896 unmapped: 10805248 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:39.441882+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f27f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8f27f800 session 0x55ab8de2a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8de2b680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89325568 unmapped: 11288576 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:40.442054+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df90800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b862c000/0x0/0x1bfc00000, data 0x31e090e/0x3273000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 89358336 unmapped: 11255808 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 987800 data_alloc: 285212672 data_used: 2686976
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:41.442303+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b87f7000/0x0/0x1bfc00000, data 0x320490e/0x3297000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93052928 unmapped: 7561216 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:42.442553+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93241344 unmapped: 7372800 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:43.442782+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93241344 unmapped: 7372800 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:44.443030+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b87f7000/0x0/0x1bfc00000, data 0x320490e/0x3297000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93323264 unmapped: 7290880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:45.443224+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93323264 unmapped: 7290880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1019160 data_alloc: 301989888 data_used: 7077888
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:46.443512+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93323264 unmapped: 7290880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:47.443691+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93323264 unmapped: 7290880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:48.443871+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93323264 unmapped: 7290880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:49.444018+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93323264 unmapped: 7290880 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:50.444166+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b87f7000/0x0/0x1bfc00000, data 0x320490e/0x3297000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e338800 session 0x55ab8dd06960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edec800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8edec800 session 0x55ab8dd07680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e338400 session 0x55ab8dfc7a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5cc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.909819603s of 16.398948669s, submitted: 78
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8ee5cc00 session 0x55ab8dbd8b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 93421568 unmapped: 7192576 heap: 100614144 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1019160 data_alloc: 301989888 data_used: 7077888
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:51.444466+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8c790b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 97763328 unmapped: 5668864 heap: 103432192 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:52.444583+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e2f7000 session 0x55ab8dfd4780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8dfd5c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e24fc00 session 0x55ab8dfd4000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d5c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8f2d5c00 session 0x55ab8dc50b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d5c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8f2d5c00 session 0x55ab8dc50000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8c0fa5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98869248 unmapped: 8765440 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:53.444692+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 99115008 unmapped: 8519680 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:54.444853+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 99115008 unmapped: 8519680 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:55.444989+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8e24fc00 session 0x55ab8dc07e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b71cc000/0x0/0x1bfc00000, data 0x482f90e/0x48c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1204600 data_alloc: 301989888 data_used: 8187904
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:56.445215+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:57.445366+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:58.445559+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:00:59.446285+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:00.446502+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1204776 data_alloc: 301989888 data_used: 8187904
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:01.446686+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b71cc000/0x0/0x1bfc00000, data 0x482f90e/0x48c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98828288 unmapped: 8806400 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:02.446801+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b71cc000/0x0/0x1bfc00000, data 0x482f90e/0x48c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.179256439s of 11.739358902s, submitted: 121
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98885632 unmapped: 8749056 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:03.447010+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98885632 unmapped: 8749056 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:04.447114+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8dc47e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8df90800 session 0x55ab8f10cd20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98885632 unmapped: 8749056 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:05.447256+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b71ca000/0x0/0x1bfc00000, data 0x483090e/0x48c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8f10d2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1204700 data_alloc: 301989888 data_used: 8196096
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:06.447393+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 98893824 unmapped: 8740864 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8f10d860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d5400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 heartbeat osd_stat(store_statfs(0x1b71cb000/0x0/0x1bfc00000, data 0x483090e/0x48c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 ms_handle_reset con 0x55ab8f2d5400 session 0x55ab8dc42960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 ms_handle_reset con 0x55ab8edefc00 session 0x55ab8edf23c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f372000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:07.447551+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 99688448 unmapped: 7946240 heap: 107634688 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 ms_handle_reset con 0x55ab8f372000 session 0x55ab8edf2780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:08.447731+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107667456 unmapped: 11698176 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 98 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 99 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8dce0f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:09.447870+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107724800 unmapped: 11640832 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8cbeb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 ms_handle_reset con 0x55ab8e2f7000 session 0x55ab8de2a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:10.448112+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107782144 unmapped: 11583488 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1401935 data_alloc: 301989888 data_used: 13238272
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:11.448292+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107782144 unmapped: 11583488 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 heartbeat osd_stat(store_statfs(0x1b5ecb000/0x0/0x1bfc00000, data 0x5b2757c/0x5bc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:12.448446+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108126208 unmapped: 11239424 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.062048912s of 10.017923355s, submitted: 181
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:13.448585+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109346816 unmapped: 10018816 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 ms_handle_reset con 0x55ab8e2f6400 session 0x55ab8dc36f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29c800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:14.448748+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 104292352 unmapped: 15073280 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 ms_handle_reset con 0x55ab8e269800 session 0x55ab8c63d680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:15.448902+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 101023744 unmapped: 18341888 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b6983000/0x0/0x1bfc00000, data 0x5069810/0x5104000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:16.449113+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1264977 data_alloc: 285212672 data_used: 2625536
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 101023744 unmapped: 18341888 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:17.449266+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 101023744 unmapped: 18341888 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b6983000/0x0/0x1bfc00000, data 0x5069810/0x5104000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:18.449422+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 101040128 unmapped: 18325504 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfa800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:19.449541+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 101040128 unmapped: 18325504 heap: 119365632 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8dcfa800 session 0x55ab8c0fa780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b6983000/0x0/0x1bfc00000, data 0x5069810/0x5104000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfa800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8dcfa800 session 0x55ab8dce0960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8dce12c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8e269800 session 0x55ab8dce01e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8dce14a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:20.449708+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 113557504 unmapped: 17989632 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b6988000/0x0/0x1bfc00000, data 0x5069882/0x5106000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8e2f7000 session 0x55ab8dfc63c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8dce0780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:21.449861+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1500328 data_alloc: 285212672 data_used: 2633728
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107397120 unmapped: 24150016 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8f373c00 session 0x55ab8dc5d860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8f1023c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8edefc00 session 0x55ab8dc36000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2ec00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8ee2ec00 session 0x55ab8dc06960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8c790780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:22.450646+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107388928 unmapped: 24158208 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2ec00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8dbfb680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.209855080s of 10.001928329s, submitted: 177
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:23.450777+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107454464 unmapped: 24092672 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 102 ms_handle_reset con 0x55ab8f373c00 session 0x55ab8dc472c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:24.450894+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107913216 unmapped: 23633920 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 102 heartbeat osd_stat(store_statfs(0x1b6703000/0x0/0x1bfc00000, data 0x44e9bf3/0x4587000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:25.451012+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112107520 unmapped: 19439616 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:26.451150+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1262647 data_alloc: 301989888 data_used: 10530816
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112107520 unmapped: 19439616 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:27.451321+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112107520 unmapped: 19439616 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:28.451509+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112107520 unmapped: 19439616 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:29.451645+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112140288 unmapped: 19406848 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:30.451845+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105758720 unmapped: 25788416 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b6703000/0x0/0x1bfc00000, data 0x44e9bf3/0x4587000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:31.451977+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1249313 data_alloc: 301989888 data_used: 10539008
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105824256 unmapped: 25722880 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:32.452130+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105824256 unmapped: 25722880 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:33.452268+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105824256 unmapped: 25722880 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.996702194s of 11.146968842s, submitted: 52
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:34.452395+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105824256 unmapped: 25722880 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b7502000/0x0/0x1bfc00000, data 0x44ebe97/0x458b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:35.452527+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105955328 unmapped: 25591808 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b7503000/0x0/0x1bfc00000, data 0x44ebe97/0x458b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:36.452777+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1257145 data_alloc: 301989888 data_used: 11440128
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106192896 unmapped: 25354240 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f374400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8f374400 session 0x55ab8dc425a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8e338400 session 0x55ab8cbea3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8cbeb680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8dc97400 session 0x55ab9029e1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:37.452915+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106381312 unmapped: 25165824 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8f373c00 session 0x55ab9029e3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f374400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8f374400 session 0x55ab9029e960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8e26b800 session 0x55ab9029eb40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab9029ef00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8dc97400 session 0x55ab9029f0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:38.453161+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106766336 unmapped: 24780800 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8edefc00 session 0x55ab8dce0d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8ee2ec00 session 0x55ab8f10da40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:39.453313+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 ms_handle_reset con 0x55ab8e26b800 session 0x55ab8fe1b860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106766336 unmapped: 24780800 heap: 131547136 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 103 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8fe1ba40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 ms_handle_reset con 0x55ab8e26b800 session 0x55ab9029f4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 ms_handle_reset con 0x55ab8dc97400 session 0x55ab8fe1bc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:40.453478+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 heartbeat osd_stat(store_statfs(0x1b6602000/0x0/0x1bfc00000, data 0x53e72d7/0x548b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,1,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 ms_handle_reset con 0x55ab8edefc00 session 0x55ab8fe1be00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115933184 unmapped: 19595264 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2ec00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 105 ms_handle_reset con 0x55ab8ee2ec00 session 0x55ab8dc4e3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:41.453602+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1547176 data_alloc: 301989888 data_used: 13762560
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116801536 unmapped: 18726912 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 106 handle_osd_map epochs [105,106], i have 106, src has [1,106]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 106 ms_handle_reset con 0x55ab8e29a800 session 0x55ab8dc47c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:42.453770+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117080064 unmapped: 18448384 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f455800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:43.453907+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 106 ms_handle_reset con 0x55ab8f455800 session 0x55ab8dc51680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111599616 unmapped: 23928832 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:44.454036+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111689728 unmapped: 23838720 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.317867279s of 10.971207619s, submitted: 147
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:45.454189+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 104546304 unmapped: 30982144 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 heartbeat osd_stat(store_statfs(0x1b7048000/0x0/0x1bfc00000, data 0x499ec93/0x4a45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:46.454335+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1265370 data_alloc: 285212672 data_used: 2678784
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edee400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 104546304 unmapped: 30982144 heap: 135528448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8edee400 session 0x55ab8dfc6f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8ee2f800 session 0x55ab8c6afa40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8f199800 session 0x55ab8dc4fa40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8f199800 session 0x55ab8c0fb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8e29a800 session 0x55ab9036a960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edee400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:47.454492+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 122077184 unmapped: 16859136 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8edee400 session 0x55ab9036a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:48.454734+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 ms_handle_reset con 0x55ab8ee2f800 session 0x55ab8dc505a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110043136 unmapped: 28893184 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f455800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 ms_handle_reset con 0x55ab8f455800 session 0x55ab8de2b680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f455800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 ms_handle_reset con 0x55ab8f455800 session 0x55ab8d8a3e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:49.454890+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 ms_handle_reset con 0x55ab8e29a800 session 0x55ab8d8a2780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edee400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110133248 unmapped: 28803072 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 ms_handle_reset con 0x55ab8ee2f800 session 0x55ab8dce1a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 109 ms_handle_reset con 0x55ab8edee400 session 0x55ab8d8a3860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:50.455123+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110133248 unmapped: 28803072 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 110 handle_osd_map epochs [109,110], i have 110, src has [1,110]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:51.455283+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1396282 data_alloc: 285212672 data_used: 2772992
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 110 ms_handle_reset con 0x55ab8f19ac00 session 0x55ab8d8a3680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106340352 unmapped: 32595968 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 110 heartbeat osd_stat(store_statfs(0x1b52b9000/0x0/0x1bfc00000, data 0x6720457/0x67d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:52.455479+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 105594880 unmapped: 33341440 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:53.455660+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 112 ms_handle_reset con 0x55ab8f199800 session 0x55ab9029e1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 112 ms_handle_reset con 0x55ab8f199400 session 0x55ab9029fc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106471424 unmapped: 32464896 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 112 handle_osd_map epochs [111,112], i have 112, src has [1,112]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 113 ms_handle_reset con 0x55ab8e24c000 session 0x55ab8edf3860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:54.455789+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106700800 unmapped: 32235520 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 113 ms_handle_reset con 0x55ab8f373c00 session 0x55ab8edf32c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.935484886s of 10.018543243s, submitted: 244
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 114 heartbeat osd_stat(store_statfs(0x1b642e000/0x0/0x1bfc00000, data 0x55a5d33/0x565d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:55.456017+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f372c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106258432 unmapped: 32677888 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:56.456202+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1312589 data_alloc: 285212672 data_used: 2867200
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 115 ms_handle_reset con 0x55ab8f372c00 session 0x55ab8dc5c1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106627072 unmapped: 32309248 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:57.456365+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 115 ms_handle_reset con 0x55ab8e24c000 session 0x55ab8c6af680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106627072 unmapped: 32309248 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:58.456520+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 117 ms_handle_reset con 0x55ab8f199400 session 0x55ab8dfd4960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106684416 unmapped: 32251904 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:01:59.456658+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 117 ms_handle_reset con 0x55ab8f199800 session 0x55ab8dfd5680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 106684416 unmapped: 32251904 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 119 heartbeat osd_stat(store_statfs(0x1b6f9d000/0x0/0x1bfc00000, data 0x4632e9f/0x46ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 119 heartbeat osd_stat(store_statfs(0x1b6f93000/0x0/0x1bfc00000, data 0x4637571/0x46f7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:00.456826+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107790336 unmapped: 31145984 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 119 handle_osd_map epochs [119,120], i have 119, src has [1,120]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 120 ms_handle_reset con 0x55ab8f373c00 session 0x55ab8dfd4d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:01.457142+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e370000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1319558 data_alloc: 285212672 data_used: 2867200
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 107896832 unmapped: 31039488 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:02.457330+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108969984 unmapped: 29966336 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 121 ms_handle_reset con 0x55ab8e370000 session 0x55ab8dfd43c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:03.457591+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108986368 unmapped: 29949952 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:04.457948+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109002752 unmapped: 29933568 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:05.458136+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 29884416 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 122 heartbeat osd_stat(store_statfs(0x1b6f8e000/0x0/0x1bfc00000, data 0x463f46f/0x46ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:06.458839+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1323613 data_alloc: 285212672 data_used: 2867200
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 29884416 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 122 heartbeat osd_stat(store_statfs(0x1b6f8e000/0x0/0x1bfc00000, data 0x463f46f/0x46ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.625911713s of 12.144161224s, submitted: 201
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:07.459004+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 29884416 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 122 heartbeat osd_stat(store_statfs(0x1b6f8e000/0x0/0x1bfc00000, data 0x463f46f/0x46ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:08.459125+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 29884416 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:09.459265+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 29884416 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:10.459435+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 29876224 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 123 heartbeat osd_stat(store_statfs(0x1b6f8d000/0x0/0x1bfc00000, data 0x464046f/0x4700000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:11.459627+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1326643 data_alloc: 285212672 data_used: 2867200
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 29876224 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:12.459868+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 29876224 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:13.460154+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 29818880 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:14.460301+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 29818880 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:15.460467+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 29818880 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:16.460642+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1329645 data_alloc: 285212672 data_used: 2867200
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 124 heartbeat osd_stat(store_statfs(0x1b6f85000/0x0/0x1bfc00000, data 0x4644ad1/0x4708000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 29810688 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 124 heartbeat osd_stat(store_statfs(0x1b6f85000/0x0/0x1bfc00000, data 0x4644ad1/0x4708000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:17.460790+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edef800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 29810688 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.702108383s of 10.747031212s, submitted: 18
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 heartbeat osd_stat(store_statfs(0x1b6f85000/0x0/0x1bfc00000, data 0x4644ad1/0x4708000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:18.460927+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 ms_handle_reset con 0x55ab8edef800 session 0x55ab8dbd8b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 ms_handle_reset con 0x55ab8d87b000 session 0x55ab8de2a3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 ms_handle_reset con 0x55ab8e269800 session 0x55ab9029f860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109133824 unmapped: 29802496 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:19.461026+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 ms_handle_reset con 0x55ab8f19a800 session 0x55ab8c81d860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 29704192 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:20.461237+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 29704192 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:21.461403+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1109901 data_alloc: 285212672 data_used: 2686976
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 29704192 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 heartbeat osd_stat(store_statfs(0x1b8ca0000/0x0/0x1bfc00000, data 0x2929eb0/0x29ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:22.461575+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 29704192 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 heartbeat osd_stat(store_statfs(0x1b8ca0000/0x0/0x1bfc00000, data 0x2929eb0/0x29ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:23.461733+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 29704192 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:24.461883+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 29704192 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:25.462111+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:26.462254+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113927 data_alloc: 285212672 data_used: 2699264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:27.462482+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:28.462665+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:29.462864+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:30.463043+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:31.463251+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113927 data_alloc: 285212672 data_used: 2699264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:32.463414+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 29687808 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:33.463555+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 29679616 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:34.463946+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 29679616 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:35.464275+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 29679616 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:36.464497+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113927 data_alloc: 285212672 data_used: 2699264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 29679616 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:37.464639+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 29679616 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:38.465037+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:39.465264+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:40.465810+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:41.466137+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113927 data_alloc: 285212672 data_used: 2699264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:42.466343+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:43.466648+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:44.466832+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:45.467011+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:46.467224+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113927 data_alloc: 285212672 data_used: 2699264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:47.467339+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:48.467476+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 29671424 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:49.467616+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109273088 unmapped: 29663232 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:50.467834+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109273088 unmapped: 29663232 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:51.468004+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113927 data_alloc: 285212672 data_used: 2699264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109273088 unmapped: 29663232 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:52.468237+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109273088 unmapped: 29663232 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:53.468403+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109273088 unmapped: 29663232 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 heartbeat osd_stat(store_statfs(0x1b8c9c000/0x0/0x1bfc00000, data 0x292c154/0x29f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:54.468584+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 36.608581543s of 36.857170105s, submitted: 69
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109281280 unmapped: 29655040 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:55.468802+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109281280 unmapped: 29655040 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 126 handle_osd_map epochs [126,127], i have 126, src has [1,127]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:56.468981+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 127 heartbeat osd_stat(store_statfs(0x1b8c97000/0x0/0x1bfc00000, data 0x292e522/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1120384 data_alloc: 285212672 data_used: 2711552
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109297664 unmapped: 29638656 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:57.469138+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 127 heartbeat osd_stat(store_statfs(0x1b8c97000/0x0/0x1bfc00000, data 0x292e522/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109297664 unmapped: 29638656 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:58.469351+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109297664 unmapped: 29638656 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d5400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:02:59.469563+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 128 heartbeat osd_stat(store_statfs(0x1b8c97000/0x0/0x1bfc00000, data 0x292e522/0x29f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109355008 unmapped: 29581312 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:00.470086+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 129 ms_handle_reset con 0x55ab8f2d5400 session 0x55ab8f102000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 129 heartbeat osd_stat(store_statfs(0x1b8c92000/0x0/0x1bfc00000, data 0x29308e0/0x29fa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109371392 unmapped: 29564928 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:01.470343+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1125732 data_alloc: 285212672 data_used: 2711552
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109371392 unmapped: 29564928 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:02.470460+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 130 ms_handle_reset con 0x55ab8d87b000 session 0x55ab8f102f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 130 heartbeat osd_stat(store_statfs(0x1b8c90000/0x0/0x1bfc00000, data 0x2932cf2/0x29fe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109387776 unmapped: 29548544 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:03.470588+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109387776 unmapped: 29548544 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 130 heartbeat osd_stat(store_statfs(0x1b8c8c000/0x0/0x1bfc00000, data 0x29350f4/0x2a01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:04.470812+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.774957657s of 10.027979851s, submitted: 67
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109395968 unmapped: 29540352 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:05.470982+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b8c88000/0x0/0x1bfc00000, data 0x29373b4/0x2a05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109395968 unmapped: 29540352 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets getting new tickets!
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:06.471214+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _finish_auth 0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:06.472685+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1135081 data_alloc: 285212672 data_used: 2723840
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109453312 unmapped: 29483008 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 133 ms_handle_reset con 0x55ab8e2f7800 session 0x55ab8de2a1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:07.471558+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109469696 unmapped: 29466624 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 134 ms_handle_reset con 0x55ab8f19b800 session 0x55ab8dc06960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:08.471736+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 134 heartbeat osd_stat(store_statfs(0x1b8c7e000/0x0/0x1bfc00000, data 0x293bbdc/0x2a0f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109477888 unmapped: 29458432 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:09.472175+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 134 ms_handle_reset con 0x55ab8e24c800 session 0x55ab8cbeb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 134 ms_handle_reset con 0x55ab8f199400 session 0x55ab8de2b860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 134 ms_handle_reset con 0x55ab8d87b000 session 0x55ab8c6aeb40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109518848 unmapped: 29417472 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:10.472508+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109518848 unmapped: 29417472 heap: 138936320 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:11.472681+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 135 ms_handle_reset con 0x55ab8e24c800 session 0x55ab8dc463c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1202127 data_alloc: 285212672 data_used: 2723840
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109748224 unmapped: 37584896 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 135 ms_handle_reset con 0x55ab8e2f7800 session 0x55ab8dc470e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 135 heartbeat osd_stat(store_statfs(0x1b8477000/0x0/0x1bfc00000, data 0x31403e4/0x3217000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:12.472843+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117129216 unmapped: 30203904 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:13.473000+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 135 ms_handle_reset con 0x55ab8f19a800 session 0x55ab8dc36000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfa800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108601344 unmapped: 38731776 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 135 heartbeat osd_stat(store_statfs(0x1b6477000/0x0/0x1bfc00000, data 0x51403e4/0x5217000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:14.473176+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.648155212s of 10.057988167s, submitted: 59
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 136 ms_handle_reset con 0x55ab8dcfa800 session 0x55ab8f10c780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108691456 unmapped: 38641664 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:15.473302+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108691456 unmapped: 38641664 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 136 heartbeat osd_stat(store_statfs(0x1b5472000/0x0/0x1bfc00000, data 0x6142980/0x621b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 136 ms_handle_reset con 0x55ab8e24c800 session 0x55ab8cbeb680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:16.473399+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 136 ms_handle_reset con 0x55ab8e2f7800 session 0x55ab8cbea3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1533817 data_alloc: 285212672 data_used: 2736128
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108691456 unmapped: 38641664 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:17.473552+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 137 ms_handle_reset con 0x55ab8d87b000 session 0x55ab8f10d4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f198800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108740608 unmapped: 38592512 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:18.473681+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 138 ms_handle_reset con 0x55ab8f198800 session 0x55ab8dd07a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108732416 unmapped: 38600704 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 138 ms_handle_reset con 0x55ab8f199800 session 0x55ab8dd06f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:19.473790+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 138 heartbeat osd_stat(store_statfs(0x1b8c6b000/0x0/0x1bfc00000, data 0x29471bc/0x2a21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108756992 unmapped: 38576128 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:20.473985+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 139 ms_handle_reset con 0x55ab8d87b000 session 0x55ab9029fa40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108756992 unmapped: 38576128 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:21.474184+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1173684 data_alloc: 285212672 data_used: 2760704
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108756992 unmapped: 38576128 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:22.474323+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108756992 unmapped: 38576128 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 139 heartbeat osd_stat(store_statfs(0x1b8c66000/0x0/0x1bfc00000, data 0x29494ee/0x2a27000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:23.474492+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108675072 unmapped: 38658048 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:24.474641+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108675072 unmapped: 38658048 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.938452721s of 10.139330864s, submitted: 95
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:25.474930+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108675072 unmapped: 38658048 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:26.475181+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1175541 data_alloc: 285212672 data_used: 2760704
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f374800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108756992 unmapped: 38576128 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 141 ms_handle_reset con 0x55ab8f374800 session 0x55ab9029ed20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 141 handle_osd_map epochs [140,141], i have 141, src has [1,141]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:27.475383+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108765184 unmapped: 38567936 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 142 heartbeat osd_stat(store_statfs(0x1b8c5d000/0x0/0x1bfc00000, data 0x294dbb2/0x2a30000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:28.475571+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108773376 unmapped: 38559744 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:29.475783+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 142 ms_handle_reset con 0x55ab8e2b8000 session 0x55ab8dbd9e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f196000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108789760 unmapped: 38543360 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 143 ms_handle_reset con 0x55ab8f196000 session 0x55ab9029f2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f27e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 143 ms_handle_reset con 0x55ab8f27e000 session 0x55ab9029f680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:30.475990+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108822528 unmapped: 38510592 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:31.476200+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1189786 data_alloc: 285212672 data_used: 2777088
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108863488 unmapped: 38469632 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d4800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:32.476354+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 144 heartbeat osd_stat(store_statfs(0x1b8c52000/0x0/0x1bfc00000, data 0x2954732/0x2a3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 144 ms_handle_reset con 0x55ab8f2d4800 session 0x55ab8dce0f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 144 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab8dce1e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108863488 unmapped: 38469632 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:33.477003+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108863488 unmapped: 38469632 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:34.477169+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108863488 unmapped: 38469632 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.763650894s of 10.126563072s, submitted: 94
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:35.477357+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f373000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 145 ms_handle_reset con 0x55ab8f373000 session 0x55ab8fe1a780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108871680 unmapped: 38461440 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:36.477566+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 145 heartbeat osd_stat(store_statfs(0x1b8c4e000/0x0/0x1bfc00000, data 0x29569d7/0x2a3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1196011 data_alloc: 285212672 data_used: 2781184
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108871680 unmapped: 38461440 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 146 ms_handle_reset con 0x55ab8e24d800 session 0x55ab8fe1a1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:37.477756+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108945408 unmapped: 38387712 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:38.477894+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 146 ms_handle_reset con 0x55ab8f19a800 session 0x55ab8dfc63c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 108945408 unmapped: 38387712 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:39.478041+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab8d8a32c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 heartbeat osd_stat(store_statfs(0x1b8c49000/0x0/0x1bfc00000, data 0x2958d95/0x2a43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 ms_handle_reset con 0x55ab8e24d800 session 0x55ab8dbfba40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d4800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109084672 unmapped: 38248448 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 ms_handle_reset con 0x55ab8f2d4800 session 0x55ab8dc5c1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:40.478316+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109109248 unmapped: 38223872 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 heartbeat osd_stat(store_statfs(0x1b8c49000/0x0/0x1bfc00000, data 0x295b134/0x2a45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:41.478689+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1199115 data_alloc: 285212672 data_used: 2793472
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 38273024 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 heartbeat osd_stat(store_statfs(0x1b8c49000/0x0/0x1bfc00000, data 0x295b134/0x2a45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:42.479140+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 38273024 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:43.479529+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 38273024 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:44.480169+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.818835258s of 10.126947403s, submitted: 95
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:45.480279+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c44000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:46.480416+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1203317 data_alloc: 285212672 data_used: 2805760
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:47.480558+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:48.480836+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c44000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:49.481341+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:50.481549+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:51.481691+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1203317 data_alloc: 285212672 data_used: 2805760
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109117440 unmapped: 38215680 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:52.481847+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 38207488 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:53.482002+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 38207488 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:54.482196+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c44000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 38207488 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:55.482419+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 38207488 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:56.482646+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c44000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1203317 data_alloc: 285212672 data_used: 2805760
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109125632 unmapped: 38207488 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:57.482830+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2cc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.119089127s of 12.146196365s, submitted: 6
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 ms_handle_reset con 0x55ab8cc2cc00 session 0x55ab8c0fb2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109142016 unmapped: 38191104 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:58.482992+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 ms_handle_reset con 0x55ab8cc2c000 session 0x55ab8dd06f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab8dd07e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109166592 unmapped: 38166528 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:03:59.483167+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109166592 unmapped: 38166528 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:00.483383+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c45000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:01.483749+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1203342 data_alloc: 285212672 data_used: 2805760
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:02.483936+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:03.484079+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:04.484390+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:05.484565+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:06.484924+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c45000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1203342 data_alloc: 285212672 data_used: 2805760
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:07.485153+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:08.485405+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109182976 unmapped: 38150144 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:09.485552+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109182976 unmapped: 38150144 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:10.485782+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 heartbeat osd_stat(store_statfs(0x1b8c45000/0x0/0x1bfc00000, data 0x295d3d8/0x2a49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109182976 unmapped: 38150144 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:11.485964+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1203342 data_alloc: 285212672 data_used: 2805760
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109182976 unmapped: 38150144 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:12.486133+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109182976 unmapped: 38150144 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:13.486307+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.977515221s of 16.062902451s, submitted: 19
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109207552 unmapped: 38125568 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:14.486460+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f375800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 149 heartbeat osd_stat(store_statfs(0x1b8c3e000/0x0/0x1bfc00000, data 0x295f7b9/0x2a4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109223936 unmapped: 38109184 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:15.486622+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 150 ms_handle_reset con 0x55ab8f375800 session 0x55ab8fe1b860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109240320 unmapped: 38092800 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:16.486762+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213471 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 38084608 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:17.486888+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 38084608 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:18.487000+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 150 heartbeat osd_stat(store_statfs(0x1b8c39000/0x0/0x1bfc00000, data 0x2961ba8/0x2a51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 38084608 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:19.487124+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c39000/0x0/0x1bfc00000, data 0x2961ba8/0x2a51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:20.487283+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:21.487430+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213548 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:22.487584+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:23.487721+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:24.487869+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:25.488031+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:26.488198+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213548 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:27.488379+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:28.488729+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:29.488900+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:30.489118+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:31.489311+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213548 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:32.489480+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:33.489661+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:34.489807+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:35.489950+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:36.490116+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213548 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:37.490271+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:38.490493+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:39.490645+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109256704 unmapped: 38076416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:40.490835+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:41.490991+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213548 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:42.491175+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:43.491305+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:44.492050+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:45.492998+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:46.493182+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1213548 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:47.493782+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:48.496226+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:49.498334+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109273088 unmapped: 38060032 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 35.964664459s of 36.145420074s, submitted: 50
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8e24d800 session 0x55ab8f10d4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:50.499024+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109281280 unmapped: 38051840 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c37000/0x0/0x1bfc00000, data 0x2963ebe/0x2a57000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:51.499302+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109305856 unmapped: 38027264 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8ee5d400 session 0x55ab8f10c780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1214765 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:52.499467+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109305856 unmapped: 38027264 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c37000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc97c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8dc97c00 session 0x55ab8dc470e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:53.499734+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109289472 unmapped: 38043648 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab8dc47a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:54.499903+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109289472 unmapped: 38043648 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8e24d800 session 0x55ab8edf3860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:55.500366+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109289472 unmapped: 38043648 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8ee5d400 session 0x55ab8dc5cf00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f375800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:56.500512+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109297664 unmapped: 38035456 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8f375800 session 0x55ab8dbfb680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1212668 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:57.500645+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109297664 unmapped: 38035456 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2cc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8cc2cc00 session 0x55ab903801e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c39000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:58.500795+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109043712 unmapped: 38289408 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab8edf2f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8e24d800 session 0x55ab903803c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:04:59.501136+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 38281216 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.309179306s of 10.401854515s, submitted: 22
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8ee5d400 session 0x55ab90380780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f375800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:00.501328+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8f375800 session 0x55ab90380b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 38281216 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c39000/0x0/0x1bfc00000, data 0x2963e4c/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:01.501546+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 38281216 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f375c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8f375c00 session 0x55ab90380d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1212668 data_alloc: 285212672 data_used: 2818048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:02.501769+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 38281216 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab903810e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:03.502028+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109051904 unmapped: 38281216 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 ms_handle_reset con 0x55ab8e24d800 session 0x55ab903814a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:04.502231+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109060096 unmapped: 38273024 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:05.502465+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109068288 unmapped: 38264832 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b8c37000/0x0/0x1bfc00000, data 0x2963ebe/0x2a57000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 ms_handle_reset con 0x55ab8ee5d400 session 0x55ab90381680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:06.502711+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109092864 unmapped: 38240256 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1223274 data_alloc: 285212672 data_used: 2834432
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:07.502971+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109092864 unmapped: 38240256 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f374c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 ms_handle_reset con 0x55ab8f374c00 session 0x55ab90381a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edef800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 ms_handle_reset con 0x55ab8edef800 session 0x55ab903803c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:08.503166+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109133824 unmapped: 38199296 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 ms_handle_reset con 0x55ab8b7fe000 session 0x55ab8dc5dc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:09.503320+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109166592 unmapped: 38166528 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:10.503515+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109166592 unmapped: 38166528 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:11.503657+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109166592 unmapped: 38166528 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 heartbeat osd_stat(store_statfs(0x1b8c32000/0x0/0x1bfc00000, data 0x296627c/0x2a5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1222949 data_alloc: 285212672 data_used: 2834432
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:12.503842+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109166592 unmapped: 38166528 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 heartbeat osd_stat(store_statfs(0x1b8c32000/0x0/0x1bfc00000, data 0x296627c/0x2a5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:13.503982+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:14.504177+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:15.504329+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.247541428s of 15.478184700s, submitted: 53
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 ms_handle_reset con 0x55ab8e2b8400 session 0x55ab90380000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 heartbeat osd_stat(store_statfs(0x1b8c32000/0x0/0x1bfc00000, data 0x296627c/0x2a5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:16.504496+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1224601 data_alloc: 285212672 data_used: 2834432
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:17.504626+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109174784 unmapped: 38158336 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 heartbeat osd_stat(store_statfs(0x1b8c32000/0x0/0x1bfc00000, data 0x296628c/0x2a5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 153 ms_handle_reset con 0x55ab8dcfb000 session 0x55ab8dc470e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:18.504767+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109199360 unmapped: 38133760 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:19.504958+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109199360 unmapped: 38133760 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 heartbeat osd_stat(store_statfs(0x1b8c2d000/0x0/0x1bfc00000, data 0x296864a/0x2a60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:20.505141+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 ms_handle_reset con 0x55ab8e233000 session 0x55ab8f10d2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f455c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109215744 unmapped: 38117376 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 ms_handle_reset con 0x55ab8f455c00 session 0x55ab8dfd5c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 ms_handle_reset con 0x55ab8cc2d000 session 0x55ab8dd06000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 heartbeat osd_stat(store_statfs(0x1b8c28000/0x0/0x1bfc00000, data 0x296aa5c/0x2a64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:21.505335+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 38100992 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1230029 data_alloc: 285212672 data_used: 2846720
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:22.505495+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 38100992 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 heartbeat osd_stat(store_statfs(0x1b8c2b000/0x0/0x1bfc00000, data 0x296aa4c/0x2a63000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:23.505611+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 38100992 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 heartbeat osd_stat(store_statfs(0x1b8c2b000/0x0/0x1bfc00000, data 0x296aa4c/0x2a63000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:24.505770+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109232128 unmapped: 38100992 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 154 handle_osd_map epochs [154,155], i have 154, src has [1,155]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e370800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8e370800 session 0x55ab8c0fb2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:25.505919+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109248512 unmapped: 38084608 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:26.506090+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109264896 unmapped: 38068224 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c761000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.927778244s of 11.069559097s, submitted: 46
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8c761000 session 0x55ab8fe1a1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 heartbeat osd_stat(store_statfs(0x1b8c26000/0x0/0x1bfc00000, data 0x296ccf0/0x2a67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8e24f800 session 0x55ab8cbeb0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1235133 data_alloc: 285212672 data_used: 2859008
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:27.506251+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8d87a800 session 0x55ab8dce0f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 109281280 unmapped: 38051840 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c761000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 heartbeat osd_stat(store_statfs(0x1b8c26000/0x0/0x1bfc00000, data 0x296cd52/0x2a68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,2])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8c761000 session 0x55ab9029f2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8e24f800 session 0x55ab9029f0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8cc2d000 session 0x55ab8edf3c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:28.506408+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111181824 unmapped: 36151296 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e370800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8e370800 session 0x55ab9036a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:29.506620+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111091712 unmapped: 36241408 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 ms_handle_reset con 0x55ab8e232800 session 0x55ab8dc07e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:30.506811+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111091712 unmapped: 36241408 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 heartbeat osd_stat(store_statfs(0x1b8c26000/0x0/0x1bfc00000, data 0x296ccf0/0x2a67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 155 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:31.506962+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110370816 unmapped: 36962304 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1249913 data_alloc: 285212672 data_used: 2871296
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:32.507209+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 157 handle_osd_map epochs [157,158], i have 157, src has [1,158]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 158 handle_osd_map epochs [157,158], i have 158, src has [1,158]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 158 handle_osd_map epochs [157,158], i have 158, src has [1,158]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 158 ms_handle_reset con 0x55ab8e24c400 session 0x55ab8f66f2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110059520 unmapped: 37273600 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 158 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab8f66f4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:33.507447+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 158 handle_osd_map epochs [158,159], i have 158, src has [1,159]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 159 handle_osd_map epochs [157,159], i have 159, src has [1,159]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110059520 unmapped: 37273600 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:34.507648+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110084096 unmapped: 37249024 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 159 ms_handle_reset con 0x55ab8e29d400 session 0x55ab8dc51860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 160 heartbeat osd_stat(store_statfs(0x1b8c10000/0x0/0x1bfc00000, data 0x2977fdc/0x2a7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:35.507777+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110108672 unmapped: 37224448 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:36.508015+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110108672 unmapped: 37224448 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.484382629s of 10.088588715s, submitted: 154
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 160 ms_handle_reset con 0x55ab8e338800 session 0x55ab8dd074a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1264522 data_alloc: 285212672 data_used: 2887680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:37.508212+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110108672 unmapped: 37224448 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:38.508412+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110116864 unmapped: 37216256 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 160 ms_handle_reset con 0x55ab8e24c800 session 0x55ab9036be00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 160 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab905325a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 161 ms_handle_reset con 0x55ab8ee2f800 session 0x55ab8dd070e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:39.508590+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 161 heartbeat osd_stat(store_statfs(0x1b8c0f000/0x0/0x1bfc00000, data 0x297803e/0x2a7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110108672 unmapped: 37224448 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:40.508754+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 161 heartbeat osd_stat(store_statfs(0x1b8c0a000/0x0/0x1bfc00000, data 0x297a3fc/0x2a82000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110108672 unmapped: 37224448 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 161 ms_handle_reset con 0x55ab8e24c400 session 0x55ab90532d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24c800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:41.508915+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 162 ms_handle_reset con 0x55ab8e24c800 session 0x55ab905330e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110157824 unmapped: 37175296 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 162 handle_osd_map epochs [162,163], i have 162, src has [1,163]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:42.509126+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1272932 data_alloc: 285212672 data_used: 2899968
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 163 handle_osd_map epochs [162,163], i have 163, src has [1,163]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110182400 unmapped: 37150720 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:43.510244+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110182400 unmapped: 37150720 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:44.510405+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110215168 unmapped: 37117952 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 164 ms_handle_reset con 0x55ab8ee5d800 session 0x55ab905334a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:45.510530+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110231552 unmapped: 37101568 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 165 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab90533860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 165 heartbeat osd_stat(store_statfs(0x1b8c02000/0x0/0x1bfc00000, data 0x2980f44/0x2a8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:46.510676+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110247936 unmapped: 37085184 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:47.510870+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1284911 data_alloc: 285212672 data_used: 2904064
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.336835861s of 10.850212097s, submitted: 97
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110247936 unmapped: 37085184 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f372400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 165 ms_handle_reset con 0x55ab8f372400 session 0x55ab90533680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 165 heartbeat osd_stat(store_statfs(0x1b8bfc000/0x0/0x1bfc00000, data 0x29832f4/0x2a92000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:48.511029+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 166 ms_handle_reset con 0x55ab8e268800 session 0x55ab8f10de00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110256128 unmapped: 37076992 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 167 ms_handle_reset con 0x55ab8e2b8400 session 0x55ab903810e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:49.511155+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 167 ms_handle_reset con 0x55ab8df91400 session 0x55ab90533c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110280704 unmapped: 37052416 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:50.511387+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110329856 unmapped: 37003264 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:51.512211+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab90380b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 heartbeat osd_stat(store_statfs(0x1b8bee000/0x0/0x1bfc00000, data 0x2989e1e/0x2a9f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110411776 unmapped: 36921344 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 ms_handle_reset con 0x55ab8bf54400 session 0x55ab90533e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d5000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 heartbeat osd_stat(store_statfs(0x1b8bea000/0x0/0x1bfc00000, data 0x298c1ce/0x2aa2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 ms_handle_reset con 0x55ab8f2d5000 session 0x55ab8f10d2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:52.512349+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1299216 data_alloc: 285212672 data_used: 2920448
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 110419968 unmapped: 36913152 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8d8a2000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edec800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 ms_handle_reset con 0x55ab8edec800 session 0x55ab8f103860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:53.512451+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 ms_handle_reset con 0x55ab8f19b800 session 0x55ab8f10c1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111493120 unmapped: 35840000 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 ms_handle_reset con 0x55ab8bf54400 session 0x55ab9029f2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:54.512602+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab9029e960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 handle_osd_map epochs [170,171], i have 170, src has [1,171]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 170 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 171 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x298e5ec/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111525888 unmapped: 35807232 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:55.512763+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 111525888 unmapped: 35807232 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 172 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8dce0f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 172 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d5000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b7fe400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 173 handle_osd_map epochs [172,173], i have 173, src has [1,173]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:56.512889+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 173 ms_handle_reset con 0x55ab8b7fe400 session 0x55ab8dc42960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112369664 unmapped: 34963456 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 173 ms_handle_reset con 0x55ab8f2d5000 session 0x55ab8dce0780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:57.512998+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1440799 data_alloc: 285212672 data_used: 2945024
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 173 ms_handle_reset con 0x55ab8bf54400 session 0x55ab8dc5c780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.137009621s of 10.010091782s, submitted: 220
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 120741888 unmapped: 26591232 heap: 147333120 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 173 handle_osd_map epochs [173,174], i have 173, src has [1,174]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 174 heartbeat osd_stat(store_statfs(0x1b837e000/0x0/0x1bfc00000, data 0x31f20de/0x330f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 174 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8dc07680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:58.513150+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112394240 unmapped: 43335680 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:05:59.513305+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112099328 unmapped: 43630592 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 175 ms_handle_reset con 0x55ab8f19b800 session 0x55ab9036a3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 175 handle_osd_map epochs [176,177], i have 175, src has [1,177]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 handle_osd_map epochs [176,177], i have 177, src has [1,177]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29cc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 handle_osd_map epochs [176,177], i have 177, src has [1,177]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:00.513581+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 handle_osd_map epochs [176,177], i have 177, src has [1,177]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112885760 unmapped: 42844160 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 ms_handle_reset con 0x55ab8e29cc00 session 0x55ab9029ef00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 ms_handle_reset con 0x55ab8bf54400 session 0x55ab8d8a21e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:01.513776+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 heartbeat osd_stat(store_statfs(0x1b4c1f000/0x0/0x1bfc00000, data 0x654af08/0x666d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 113131520 unmapped: 42598400 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8c6af860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 178 ms_handle_reset con 0x55ab8e2b8000 session 0x55ab8dc510e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:02.513949+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1922505 data_alloc: 285212672 data_used: 2957312
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 113041408 unmapped: 42688512 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 178 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:03.514150+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c735c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 ms_handle_reset con 0x55ab8c735c00 session 0x55ab8dc46f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 113139712 unmapped: 42590208 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 handle_osd_map epochs [178,179], i have 179, src has [1,179]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f27ec00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:04.514310+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24f000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 121700352 unmapped: 34029568 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 ms_handle_reset con 0x55ab8f27ec00 session 0x55ab8dc47e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 ms_handle_reset con 0x55ab8e24f000 session 0x55ab8edf32c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 ms_handle_reset con 0x55ab8e24f800 session 0x55ab8d8a34a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 179 handle_osd_map epochs [179,180], i have 179, src has [1,180]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:05.514479+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 113360896 unmapped: 42369024 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c735c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 ms_handle_reset con 0x55ab8c735c00 session 0x55ab8dc47c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 ms_handle_reset con 0x55ab8e26b000 session 0x55ab8dd074a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 heartbeat osd_stat(store_statfs(0x1b02df000/0x0/0x1bfc00000, data 0xae84022/0xafae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 ms_handle_reset con 0x55ab8e2b8000 session 0x55ab9036ad20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 ms_handle_reset con 0x55ab8e29a800 session 0x55ab8dd070e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:06.514656+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 ms_handle_reset con 0x55ab8e2b8000 session 0x55ab8df7f680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 112795648 unmapped: 42934272 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c761400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 heartbeat osd_stat(store_statfs(0x1ae708000/0x0/0x1bfc00000, data 0xca5b022/0xcb85000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:07.514786+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2538223 data_alloc: 285212672 data_used: 2957312
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 ms_handle_reset con 0x55ab8edefc00 session 0x55ab8f1030e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 heartbeat osd_stat(store_statfs(0x1ae708000/0x0/0x1bfc00000, data 0xca5b022/0xcb85000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 113975296 unmapped: 41754624 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.984186172s of 10.332992554s, submitted: 181
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 ms_handle_reset con 0x55ab8bf54400 session 0x55ab90532780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 ms_handle_reset con 0x55ab8c761400 session 0x55ab8edf3e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:08.514966+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123551744 unmapped: 32178176 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:09.515377+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 181 handle_osd_map epochs [181,182], i have 181, src has [1,182]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114171904 unmapped: 41558016 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 182 ms_handle_reset con 0x55ab8bf54400 session 0x55ab8f103e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:10.515568+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 182 heartbeat osd_stat(store_statfs(0x1acf00000/0x0/0x1bfc00000, data 0xe25f7f2/0xe38d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 122650624 unmapped: 33079296 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:11.515714+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114286592 unmapped: 41443328 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 183 ms_handle_reset con 0x55ab8f19ac00 session 0x55ab8c81c3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:12.515831+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 183 ms_handle_reset con 0x55ab8e268800 session 0x55ab8dfc7c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584418 data_alloc: 285212672 data_used: 2969600
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 122740736 unmapped: 32989184 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:13.515982+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 184 ms_handle_reset con 0x55ab8e2b8400 session 0x55ab8c0fb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 122806272 unmapped: 32923648 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:14.516186+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f372400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114409472 unmapped: 41320448 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8eded800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 185 ms_handle_reset con 0x55ab8f372400 session 0x55ab8dd063c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:15.516333+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 186 handle_osd_map epochs [185,186], i have 186, src has [1,186]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114450432 unmapped: 41279488 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 186 ms_handle_reset con 0x55ab8eded800 session 0x55ab8dfc6780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 186 heartbeat osd_stat(store_statfs(0x1ab74f000/0x0/0x1bfc00000, data 0xfa0ce1c/0xfb3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:16.516541+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114475008 unmapped: 41254912 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9094f000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 186 ms_handle_reset con 0x55ab9094f000 session 0x55ab8f10cd20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:17.516673+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861104 data_alloc: 285212672 data_used: 2981888
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.117569923s of 10.025844574s, submitted: 150
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 122970112 unmapped: 32759808 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:18.516826+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114589696 unmapped: 41140224 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:19.517009+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114589696 unmapped: 41140224 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 186 heartbeat osd_stat(store_statfs(0x1aa7a3000/0x0/0x1bfc00000, data 0x109b6ea4/0x10aeb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:20.517539+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8ee2f400 session 0x55ab8edf2000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114720768 unmapped: 41009152 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 46
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:21.517734+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114843648 unmapped: 40886272 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:22.518029+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d4c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8f2d4c00 session 0x55ab90532780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3033400 data_alloc: 285212672 data_used: 2994176
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114843648 unmapped: 40886272 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:23.518370+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114892800 unmapped: 40837120 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8e24d400 session 0x55ab8f1030e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:24.518536+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 114982912 unmapped: 40747008 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 heartbeat osd_stat(store_statfs(0x1a8793000/0x0/0x1bfc00000, data 0x129c1d53/0x12afb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:25.518719+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115056640 unmapped: 40673280 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:26.518896+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115154944 unmapped: 40574976 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:27.519094+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3364720 data_alloc: 285212672 data_used: 2994176
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115212288 unmapped: 40517632 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.130879402s of 10.323074341s, submitted: 63
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8d87b000 session 0x55ab8f102f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c734400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:28.519290+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115269632 unmapped: 40460288 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8c734400 session 0x55ab8d8a2000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:29.519447+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 121004032 unmapped: 34725888 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8ee5d400 session 0x55ab8d8a3860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b8c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8e2b8c00 session 0x55ab8c6ae000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8e24f800 session 0x55ab8edf32c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:30.519678+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c734400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8c734400 session 0x55ab8edf3860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8d87b000 session 0x55ab8dce0780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 heartbeat osd_stat(store_statfs(0x1a44e0000/0x0/0x1bfc00000, data 0x16473f04/0x165ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115892224 unmapped: 39837696 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:31.519917+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 115924992 unmapped: 39804928 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:32.520081+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 47
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8e269000 session 0x55ab8dce1680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3672734 data_alloc: 285212672 data_used: 2994176
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116072448 unmapped: 39657472 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:33.520243+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116121600 unmapped: 39608320 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:34.520524+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 ms_handle_reset con 0x55ab8e233c00 session 0x55ab8dfc6960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124600320 unmapped: 31129600 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 heartbeat osd_stat(store_statfs(0x1a24dc000/0x0/0x1bfc00000, data 0x18c79da9/0x18db2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:35.520667+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116260864 unmapped: 39469056 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:36.520820+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8b28a400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 heartbeat osd_stat(store_statfs(0x1a1cdc000/0x0/0x1bfc00000, data 0x19479da9/0x195b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116326400 unmapped: 39403520 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:37.520981+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4079986 data_alloc: 285212672 data_used: 2994176
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116375552 unmapped: 39354368 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.741377831s of 10.148809433s, submitted: 104
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:38.521165+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126001152 unmapped: 29728768 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:39.521296+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 188 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab8dc5c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117694464 unmapped: 38035456 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:40.521639+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c734400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117694464 unmapped: 38035456 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:41.521796+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 ms_handle_reset con 0x55ab8d87a000 session 0x55ab9036b860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 heartbeat osd_stat(store_statfs(0x19eccd000/0x0/0x1bfc00000, data 0x1c484433/0x1c5bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c64b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116752384 unmapped: 38977536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 ms_handle_reset con 0x55ab8c734400 session 0x55ab8df7e3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:42.521996+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1553155 data_alloc: 285212672 data_used: 3010560
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116752384 unmapped: 38977536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:43.522133+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 ms_handle_reset con 0x55ab8e338c00 session 0x55ab903803c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 heartbeat osd_stat(store_statfs(0x1b7cc9000/0x0/0x1bfc00000, data 0x3486868/0x35c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 116932608 unmapped: 38797312 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:44.522277+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 heartbeat osd_stat(store_statfs(0x1b7cc6000/0x0/0x1bfc00000, data 0x348aac4/0x35c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117252096 unmapped: 38477824 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:45.522497+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f27e400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 ms_handle_reset con 0x55ab8f27e400 session 0x55ab8dc5cf00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117284864 unmapped: 38445056 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:46.522738+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117284864 unmapped: 38445056 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:47.522966+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 ms_handle_reset con 0x55ab8d87b400 session 0x55ab8f10c000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1579512 data_alloc: 285212672 data_used: 5713920
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c734400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117342208 unmapped: 38387712 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:48.523144+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 190 ms_handle_reset con 0x55ab8c734400 session 0x55ab9036b2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.862903595s of 10.409833908s, submitted: 104
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117350400 unmapped: 38379520 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:49.523280+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 192 ms_handle_reset con 0x55ab8d87a000 session 0x55ab9036b680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117415936 unmapped: 38313984 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:50.523453+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 193 heartbeat osd_stat(store_statfs(0x1b7cad000/0x0/0x1bfc00000, data 0x349a6a6/0x35de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 193 ms_handle_reset con 0x55ab8e338c00 session 0x55ab90381e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f27e400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117424128 unmapped: 38305792 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:51.523680+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 193 ms_handle_reset con 0x55ab8f27e400 session 0x55ab8dbfaf00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee2f400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 194 ms_handle_reset con 0x55ab8ee2f400 session 0x55ab8dbfa5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c734400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117440512 unmapped: 38289408 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:52.523853+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 195 ms_handle_reset con 0x55ab8c734400 session 0x55ab8cbea780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1597518 data_alloc: 285212672 data_used: 5718016
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 195 handle_osd_map epochs [194,195], i have 195, src has [1,195]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 117489664 unmapped: 38240256 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:53.524224+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 195 ms_handle_reset con 0x55ab8d87a000 session 0x55ab8cbeb0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 118538240 unmapped: 37191680 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:54.524332+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 195 handle_osd_map epochs [195,196], i have 195, src has [1,196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 ms_handle_reset con 0x55ab8e338c00 session 0x55ab8cbea1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 handle_osd_map epochs [195,196], i have 196, src has [1,196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 handle_osd_map epochs [194,196], i have 196, src has [1,196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123256832 unmapped: 32473088 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:55.524479+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f27e400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 ms_handle_reset con 0x55ab8f27e400 session 0x55ab8c790780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e24f000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124903424 unmapped: 30826496 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 heartbeat osd_stat(store_statfs(0x1b5a96000/0x0/0x1bfc00000, data 0x450ce25/0x4657000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:56.524619+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 ms_handle_reset con 0x55ab8e24f000 session 0x55ab8f66fe00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123764736 unmapped: 31965184 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:57.524813+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ee5d400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 ms_handle_reset con 0x55ab8ee5d400 session 0x55ab8f66f0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1732540 data_alloc: 285212672 data_used: 5931008
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123764736 unmapped: 31965184 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:58.524967+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29a400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.228244781s of 10.033915520s, submitted: 227
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 ms_handle_reset con 0x55ab8e29a400 session 0x55ab8f66eb40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123822080 unmapped: 31907840 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:06:59.525191+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:00.525386+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:01.525595+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 197 heartbeat osd_stat(store_statfs(0x1b5a5c000/0x0/0x1bfc00000, data 0x4546c39/0x4691000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:02.526245+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1728508 data_alloc: 285212672 data_used: 5947392
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:03.526569+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:04.526769+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 197 heartbeat osd_stat(store_statfs(0x1b5a5b000/0x0/0x1bfc00000, data 0x4548c39/0x4693000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:05.527143+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123920384 unmapped: 31809536 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 197 ms_handle_reset con 0x55ab8e268400 session 0x55ab8f66e780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:06.527560+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36f400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123928576 unmapped: 31801344 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:07.527706+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 197 heartbeat osd_stat(store_statfs(0x1b5a4a000/0x0/0x1bfc00000, data 0x4556f50/0x46a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1740341 data_alloc: 285212672 data_used: 5947392
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123928576 unmapped: 31801344 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:08.527872+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 198 ms_handle_reset con 0x55ab8e36f400 session 0x55ab90380b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 198 heartbeat osd_stat(store_statfs(0x1b5a4a000/0x0/0x1bfc00000, data 0x4556f50/0x46a4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.052940369s of 10.278253555s, submitted: 72
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2b9000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123961344 unmapped: 31768576 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:09.528612+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8eded800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 199 ms_handle_reset con 0x55ab8e2b9000 session 0x55ab90381c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 123985920 unmapped: 31744000 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:10.528756+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 200 ms_handle_reset con 0x55ab8e268400 session 0x55ab8dc50000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124010496 unmapped: 31719424 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:11.528984+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 201 heartbeat osd_stat(store_statfs(0x1b5a30000/0x0/0x1bfc00000, data 0x4564c7b/0x46ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9094f000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 201 ms_handle_reset con 0x55ab8eded800 session 0x55ab8f10d4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124133376 unmapped: 31596544 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:12.529200+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1760832 data_alloc: 285212672 data_used: 5963776
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 202 ms_handle_reset con 0x55ab9094f000 session 0x55ab9036ad20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 202 heartbeat osd_stat(store_statfs(0x1b5a2f000/0x0/0x1bfc00000, data 0x45670b0/0x46be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124174336 unmapped: 31555584 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:13.529393+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f629000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 202 ms_handle_reset con 0x55ab8f629000 session 0x55ab8c6af860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:14.529571+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124182528 unmapped: 31547392 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e269c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 203 ms_handle_reset con 0x55ab8e269c00 session 0x55ab8cbeaf00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:15.529745+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 124190720 unmapped: 31539200 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 204 ms_handle_reset con 0x55ab8e29d800 session 0x55ab8d8a2d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 204 ms_handle_reset con 0x55ab8e268400 session 0x55ab8fe1bc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8eded800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:16.529918+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 205 ms_handle_reset con 0x55ab8eded800 session 0x55ab8dc43a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126312448 unmapped: 29417472 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 205 heartbeat osd_stat(store_statfs(0x1b5a13000/0x0/0x1bfc00000, data 0x457d5c1/0x46d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:17.530170+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127361024 unmapped: 28368896 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 205 handle_osd_map epochs [205,206], i have 205, src has [1,206]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1779445 data_alloc: 285212672 data_used: 5963776
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:18.530329+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 206 ms_handle_reset con 0x55ab8e36f800 session 0x55ab903805a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127410176 unmapped: 28319744 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.172919273s of 10.035908699s, submitted: 271
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:19.530549+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127410176 unmapped: 28319744 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 207 ms_handle_reset con 0x55ab8e36e800 session 0x55ab90380d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:20.530726+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127434752 unmapped: 28295168 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 207 ms_handle_reset con 0x55ab8e29d800 session 0x55ab8dc42960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:21.530960+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127475712 unmapped: 28254208 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 208 ms_handle_reset con 0x55ab8e36e800 session 0x55ab8dc461e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 208 heartbeat osd_stat(store_statfs(0x1b59f7000/0x0/0x1bfc00000, data 0x4597732/0x46f7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:22.531121+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127483904 unmapped: 28246016 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 209 ms_handle_reset con 0x55ab8e232400 session 0x55ab8f10d0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 209 ms_handle_reset con 0x55ab8e268400 session 0x55ab8fe1b4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1793079 data_alloc: 285212672 data_used: 5976064
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:23.531306+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127500288 unmapped: 28229632 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26b400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 210 ms_handle_reset con 0x55ab8e26b400 session 0x55ab8dbfb680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:24.531473+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127516672 unmapped: 28213248 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:25.531677+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127508480 unmapped: 28221440 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 211 heartbeat osd_stat(store_statfs(0x1b59dc000/0x0/0x1bfc00000, data 0x45aa656/0x4710000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:26.531893+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127508480 unmapped: 28221440 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:27.532048+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127557632 unmapped: 28172288 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 ms_handle_reset con 0x55ab8f19a000 session 0x55ab8dc5c960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1795495 data_alloc: 285212672 data_used: 5976064
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 ms_handle_reset con 0x55ab8c64b800 session 0x55ab90381a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:28.532218+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127590400 unmapped: 28139520 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 ms_handle_reset con 0x55ab8e26a800 session 0x55ab90532000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:29.532387+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126803968 unmapped: 28925952 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 heartbeat osd_stat(store_statfs(0x1b6ae0000/0x0/0x1bfc00000, data 0x2a6eab1/0x2bd2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.179714203s of 10.933508873s, submitted: 229
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:30.532560+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126803968 unmapped: 28925952 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 heartbeat osd_stat(store_statfs(0x1b6ae0000/0x0/0x1bfc00000, data 0x2a6eb05/0x2bd2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:31.532709+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126836736 unmapped: 28893184 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e339000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 213 ms_handle_reset con 0x55ab8e339000 session 0x55ab8fe1a1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:32.532911+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126844928 unmapped: 28884992 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1584201 data_alloc: 285212672 data_used: 3088384
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e2f7c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 213 ms_handle_reset con 0x55ab8e2f7c00 session 0x55ab8dd07680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8c64b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:33.533130+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126877696 unmapped: 28852224 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 213 ms_handle_reset con 0x55ab8c64b800 session 0x55ab8fe1a3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 213 heartbeat osd_stat(store_statfs(0x1b7502000/0x0/0x1bfc00000, data 0x2a83f97/0x2beb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:34.533327+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126902272 unmapped: 28827648 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 213 handle_osd_map epochs [213,214], i have 213, src has [1,214]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e339000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 214 ms_handle_reset con 0x55ab8e339000 session 0x55ab8dbfba40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:35.533548+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126943232 unmapped: 28786688 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 214 heartbeat osd_stat(store_statfs(0x1b74e1000/0x0/0x1bfc00000, data 0x2aa3bd7/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 214 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 214 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:36.533791+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127016960 unmapped: 28712960 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e371000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 215 handle_osd_map epochs [215,216], i have 216, src has [1,216]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 ms_handle_reset con 0x55ab8e26a800 session 0x55ab8de2be00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 ms_handle_reset con 0x55ab8e371000 session 0x55ab8f102960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 ms_handle_reset con 0x55ab8e36f800 session 0x55ab8d8a3c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:37.533965+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127066112 unmapped: 28663808 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1597178 data_alloc: 285212672 data_used: 3096576
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:38.534175+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127066112 unmapped: 28663808 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 heartbeat osd_stat(store_statfs(0x1b70c2000/0x0/0x1bfc00000, data 0x2ac2b53/0x2c2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:39.534355+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127066112 unmapped: 28663808 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.885457039s of 10.000157356s, submitted: 207
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:40.534586+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128212992 unmapped: 27516928 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 216 handle_osd_map epochs [217,218], i have 216, src has [1,218]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:41.534710+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 218 handle_osd_map epochs [217,218], i have 218, src has [1,218]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127500288 unmapped: 28229632 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:42.534839+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127582208 unmapped: 28147712 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1606704 data_alloc: 285212672 data_used: 3108864
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:43.535003+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127582208 unmapped: 28147712 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 218 heartbeat osd_stat(store_statfs(0x1b7095000/0x0/0x1bfc00000, data 0x2aedbd9/0x2c58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:44.535144+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127614976 unmapped: 28114944 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:45.535324+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127614976 unmapped: 28114944 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:46.535542+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127688704 unmapped: 28041216 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:47.535674+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127688704 unmapped: 28041216 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1620864 data_alloc: 285212672 data_used: 3121152
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:48.535869+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 127688704 unmapped: 28041216 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 heartbeat osd_stat(store_statfs(0x1b7054000/0x0/0x1bfc00000, data 0x2b29579/0x2c98000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 ms_handle_reset con 0x55ab8e29ac00 session 0x55ab8d8a25a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:49.536055+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 ms_handle_reset con 0x55ab8e233400 session 0x55ab8dfc7e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126705664 unmapped: 29024256 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 ms_handle_reset con 0x55ab8e29d800 session 0x55ab8cc723c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 heartbeat osd_stat(store_statfs(0x1b7055000/0x0/0x1bfc00000, data 0x2b2b9ed/0x2c99000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.051505089s of 10.000431061s, submitted: 195
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:50.536286+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 ms_handle_reset con 0x55ab8e26d000 session 0x55ab905332c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126705664 unmapped: 29024256 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 220 handle_osd_map epochs [220,221], i have 220, src has [1,221]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:51.536484+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f629000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126722048 unmapped: 29007872 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8f629000 session 0x55ab8fe1af00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e233400 session 0x55ab90380000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 heartbeat osd_stat(store_statfs(0x1b703b000/0x0/0x1bfc00000, data 0x2b410f2/0x2cb2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:52.536659+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e26d000 session 0x55ab8dce0000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 126754816 unmapped: 28975104 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e29ac00 session 0x55ab8dc065a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e29d800 session 0x55ab8f399e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d4c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1631137 data_alloc: 285212672 data_used: 3133440
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:53.536798+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8f2d4c00 session 0x55ab8c63d0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 27262976 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e233400 session 0x55ab8de2ba40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:54.536965+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 48
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e26d000 session 0x55ab8cc72d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128499712 unmapped: 27230208 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 ms_handle_reset con 0x55ab8e29ac00 session 0x55ab8f1025a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:55.537229+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128524288 unmapped: 27205632 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:56.537376+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 ms_handle_reset con 0x55ab8e29d000 session 0x55ab8c63d680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26a400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128540672 unmapped: 27189248 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 ms_handle_reset con 0x55ab8e26a400 session 0x55ab8dc5d860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:57.537531+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 heartbeat osd_stat(store_statfs(0x1b6e3d000/0x0/0x1bfc00000, data 0x2b7ba01/0x2cf0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128540672 unmapped: 27189248 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 ms_handle_reset con 0x55ab8e233400 session 0x55ab90532960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 ms_handle_reset con 0x55ab8e26d000 session 0x55ab8c6af4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 ms_handle_reset con 0x55ab8e29ac00 session 0x55ab8de2a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1651578 data_alloc: 285212672 data_used: 3145728
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:58.538933+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128548864 unmapped: 27181056 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:07:59.539192+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 ms_handle_reset con 0x55ab8e29d000 session 0x55ab8dc37a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128565248 unmapped: 27164672 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f455c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 ms_handle_reset con 0x55ab8f455c00 session 0x55ab908985a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.956013680s of 10.001311302s, submitted: 253
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 heartbeat osd_stat(store_statfs(0x1b6e29000/0x0/0x1bfc00000, data 0x2b8e796/0x2d05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:00.540470+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128581632 unmapped: 27148288 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 ms_handle_reset con 0x55ab8e233400 session 0x55ab90898f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 ms_handle_reset con 0x55ab8e26d000 session 0x55ab908990e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 ms_handle_reset con 0x55ab8e29ac00 session 0x55ab90899860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:01.540680+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8eded800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 ms_handle_reset con 0x55ab8eded800 session 0x55ab90899c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128614400 unmapped: 27115520 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 223 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e339000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:02.540916+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 224 ms_handle_reset con 0x55ab8e29d000 session 0x55ab90381860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128671744 unmapped: 27058176 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 224 ms_handle_reset con 0x55ab8e233400 session 0x55ab8dc37e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1676075 data_alloc: 285212672 data_used: 3174400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f199000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 224 ms_handle_reset con 0x55ab8f199000 session 0x55ab8f66e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:03.541169+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 224 ms_handle_reset con 0x55ab8e26d000 session 0x55ab8f10da40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29ac00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 225 ms_handle_reset con 0x55ab8e339000 session 0x55ab90899e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128729088 unmapped: 27000832 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 225 ms_handle_reset con 0x55ab8e29ac00 session 0x55ab903801e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 13K writes, 52K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 13K writes, 4194 syncs, 3.16 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8169 writes, 30K keys, 8169 commit groups, 1.0 writes per commit group, ingest: 23.49 MB, 0.04 MB/s
                                                          Interval WAL: 8169 writes, 3432 syncs, 2.38 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:04.541389+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 225 heartbeat osd_stat(store_statfs(0x1b6e1b000/0x0/0x1bfc00000, data 0x2b930a7/0x2d11000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128778240 unmapped: 26951680 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 226 heartbeat osd_stat(store_statfs(0x1b6e1b000/0x0/0x1bfc00000, data 0x2b930a7/0x2d11000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:05.541587+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 ms_handle_reset con 0x55ab8e233400 session 0x55ab924be000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128786432 unmapped: 26943488 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 ms_handle_reset con 0x55ab8e26d000 session 0x55ab8ff234a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:06.541846+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f19a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 ms_handle_reset con 0x55ab8f19a000 session 0x55ab924be1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f196c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 heartbeat osd_stat(store_statfs(0x1b6e16000/0x0/0x1bfc00000, data 0x2b97a68/0x2d18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 128835584 unmapped: 26894336 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26bc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 ms_handle_reset con 0x55ab8e26bc00 session 0x55ab8ff23860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:07.542132+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 227 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 228 ms_handle_reset con 0x55ab8f196c00 session 0x55ab924be5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 228 heartbeat osd_stat(store_statfs(0x1b6e18000/0x0/0x1bfc00000, data 0x2b979f6/0x2d16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 129982464 unmapped: 25747456 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 228 ms_handle_reset con 0x55ab8e36fc00 session 0x55ab8ff23c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1682575 data_alloc: 285212672 data_used: 3198976
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:08.542376+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 129982464 unmapped: 25747456 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f196c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:09.542952+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 229 ms_handle_reset con 0x55ab8f196c00 session 0x55ab924be780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 129998848 unmapped: 25731072 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 229 heartbeat osd_stat(store_statfs(0x1b6e14000/0x0/0x1bfc00000, data 0x2b99e18/0x2d19000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 229 ms_handle_reset con 0x55ab8e233400 session 0x55ab924be960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26bc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.377478600s of 10.125592232s, submitted: 203
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:10.543167+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 230 ms_handle_reset con 0x55ab8e26bc00 session 0x55ab924bed20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 130023424 unmapped: 25706496 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:11.543349+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 130056192 unmapped: 25673728 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9251fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 231 handle_osd_map epochs [230,231], i have 231, src has [1,231]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 231 heartbeat osd_stat(store_statfs(0x1b6e0b000/0x0/0x1bfc00000, data 0x2b9e654/0x2d21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:12.543501+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 232 ms_handle_reset con 0x55ab9251fc00 session 0x55ab924bef00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131137536 unmapped: 24592384 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1702211 data_alloc: 285212672 data_used: 3211264
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:13.543689+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131137536 unmapped: 24592384 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9251fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 232 ms_handle_reset con 0x55ab9251fc00 session 0x55ab924bf2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:14.543939+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 233 ms_handle_reset con 0x55ab8e233400 session 0x55ab924bf4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131186688 unmapped: 24543232 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 233 handle_osd_map epochs [232,233], i have 233, src has [1,233]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26bc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:15.544211+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131194880 unmapped: 24535040 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 234 ms_handle_reset con 0x55ab8e26bc00 session 0x55ab924bf860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 235 ms_handle_reset con 0x55ab8e36fc00 session 0x55ab924bfa40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:16.544421+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f196c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131334144 unmapped: 24395776 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:17.544630+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 236 ms_handle_reset con 0x55ab8f196c00 session 0x55ab924bfe00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 236 heartbeat osd_stat(store_statfs(0x1b6db9000/0x0/0x1bfc00000, data 0x2be0b2e/0x2d71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131375104 unmapped: 24354816 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1713585 data_alloc: 285212672 data_used: 3231744
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:18.544866+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131375104 unmapped: 24354816 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f196c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 236 ms_handle_reset con 0x55ab8f196c00 session 0x55ab935ba000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:19.545033+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26bc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 ms_handle_reset con 0x55ab8e233400 session 0x55ab935ba3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 ms_handle_reset con 0x55ab8e26bc00 session 0x55ab8ff230e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131440640 unmapped: 24289280 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 ms_handle_reset con 0x55ab8e36fc00 session 0x55ab935ba5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9251fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.491093636s of 10.022408485s, submitted: 147
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:20.545230+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 ms_handle_reset con 0x55ab9251fc00 session 0x55ab935ba960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 heartbeat osd_stat(store_statfs(0x1b6d9e000/0x0/0x1bfc00000, data 0x2bfdaef/0x2d8f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131465216 unmapped: 24264704 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9251fc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 237 handle_osd_map epochs [237,238], i have 237, src has [1,238]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:21.545360+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 heartbeat osd_stat(store_statfs(0x1b6d9b000/0x0/0x1bfc00000, data 0x2bffd5d/0x2d92000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 ms_handle_reset con 0x55ab9251fc00 session 0x55ab935bad20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131563520 unmapped: 24166400 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:22.545527+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df90400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 ms_handle_reset con 0x55ab8df90400 session 0x55ab935bb0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131563520 unmapped: 24166400 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1723645 data_alloc: 285212672 data_used: 3239936
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:23.545780+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36f800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 ms_handle_reset con 0x55ab8e36f800 session 0x55ab935bb2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edec400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131579904 unmapped: 24150016 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 ms_handle_reset con 0x55ab8edec400 session 0x55ab935bb4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:24.545958+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131579904 unmapped: 24150016 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:25.546136+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131588096 unmapped: 24141824 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e338400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 ms_handle_reset con 0x55ab8e338400 session 0x55ab935bb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:26.546308+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131596288 unmapped: 24133632 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8d87b800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:27.546477+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 239 ms_handle_reset con 0x55ab8d87b800 session 0x55ab935bbc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 239 heartbeat osd_stat(store_statfs(0x1b6d7d000/0x0/0x1bfc00000, data 0x2c1a3d0/0x2daf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131596288 unmapped: 24133632 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1731371 data_alloc: 285212672 data_used: 3252224
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:28.546643+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e233400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 239 ms_handle_reset con 0x55ab8e233400 session 0x55ab935bbe00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131604480 unmapped: 24125440 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:29.546801+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131604480 unmapped: 24125440 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc96800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 239 ms_handle_reset con 0x55ab8dc96800 session 0x55ab8f73d680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 239 ms_handle_reset con 0x55ab8bf54800 session 0x55ab935ba1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:30.547003+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.945206642s of 10.188413620s, submitted: 77
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131620864 unmapped: 24109056 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 240 ms_handle_reset con 0x55ab8cc2d800 session 0x55ab8f73da40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:31.547124+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 240 ms_handle_reset con 0x55ab8cc2d800 session 0x55ab8ea38000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 131653632 unmapped: 24076288 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 240 ms_handle_reset con 0x55ab8bf54800 session 0x55ab90898b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e339000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:32.547291+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 240 handle_osd_map epochs [240,241], i have 240, src has [1,241]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e371800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 241 ms_handle_reset con 0x55ab8e371800 session 0x55ab8dc5da40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36f000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edecc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 241 ms_handle_reset con 0x55ab8edecc00 session 0x55ab908994a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 241 ms_handle_reset con 0x55ab8e36f000 session 0x55ab9029eb40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132784128 unmapped: 22945792 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1751355 data_alloc: 285212672 data_used: 3264512
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:33.547478+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 241 heartbeat osd_stat(store_statfs(0x1b6d3f000/0x0/0x1bfc00000, data 0x2c51982/0x2ded000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 242 handle_osd_map epochs [242,242], i have 242, src has [1,242]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 242 ms_handle_reset con 0x55ab8e339000 session 0x55ab8ea39680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132866048 unmapped: 22863872 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 242 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8ea39a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:34.547675+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132890624 unmapped: 22839296 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:35.547877+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132898816 unmapped: 22831104 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 243 ms_handle_reset con 0x55ab8cc2d800 session 0x55ab8ea39e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:36.547970+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90916c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132915200 unmapped: 22814720 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:37.548150+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 244 handle_osd_map epochs [244,245], i have 244, src has [1,245]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 ms_handle_reset con 0x55ab90916c00 session 0x55ab8f398780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f2d4800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132939776 unmapped: 22790144 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 ms_handle_reset con 0x55ab8f2d4800 session 0x55ab8f398960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 heartbeat osd_stat(store_statfs(0x1b6d0b000/0x0/0x1bfc00000, data 0x2c85486/0x2e22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1760273 data_alloc: 285212672 data_used: 3280896
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:38.548286+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 heartbeat osd_stat(store_statfs(0x1b6d05000/0x0/0x1bfc00000, data 0x2c8adf7/0x2e27000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132939776 unmapped: 22790144 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8c6af2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:39.548430+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 ms_handle_reset con 0x55ab8cc2d800 session 0x55ab90899680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 132964352 unmapped: 22765568 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e339000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 ms_handle_reset con 0x55ab8e339000 session 0x55ab8dc50d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:40.548606+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.457598686s of 10.109394073s, submitted: 184
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90916c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 ms_handle_reset con 0x55ab90916c00 session 0x55ab8d8a21e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 133013504 unmapped: 22716416 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:41.548751+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 133029888 unmapped: 22700032 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 heartbeat osd_stat(store_statfs(0x1b6ce6000/0x0/0x1bfc00000, data 0x2caaaf9/0x2e48000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:42.548874+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc96000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 ms_handle_reset con 0x55ab8dc96000 session 0x55ab8fe1a960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dc96000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 heartbeat osd_stat(store_statfs(0x1b6ce2000/0x0/0x1bfc00000, data 0x2cacd1e/0x2e4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 133169152 unmapped: 22560768 heap: 155729920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8ff22f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 ms_handle_reset con 0x55ab8dc96000 session 0x55ab8dbfab40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8cc2d800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1857898 data_alloc: 285212672 data_used: 3289088
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:43.549048+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 ms_handle_reset con 0x55ab8cc2d800 session 0x55ab9029e3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 heartbeat osd_stat(store_statfs(0x1b5b17000/0x0/0x1bfc00000, data 0x3e789e5/0x4017000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 133980160 unmapped: 23371776 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:44.549267+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e339000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 247 ms_handle_reset con 0x55ab8e339000 session 0x55ab902203c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 135110656 unmapped: 22241280 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:45.549428+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 247 handle_osd_map epochs [247,248], i have 247, src has [1,248]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90916c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 248 ms_handle_reset con 0x55ab90916c00 session 0x55ab90220780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 135274496 unmapped: 22077440 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 248 ms_handle_reset con 0x55ab8bf54800 session 0x55ab90220960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:46.549607+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 249 ms_handle_reset con 0x55ab8e36e000 session 0x55ab90220d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 135397376 unmapped: 21954560 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:47.549838+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 249 heartbeat osd_stat(store_statfs(0x1b56c3000/0x0/0x1bfc00000, data 0x3ec5ab3/0x4069000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 135397376 unmapped: 21954560 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 249 ms_handle_reset con 0x55ab8e36e800 session 0x55ab90220f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1925244 data_alloc: 285212672 data_used: 3313664
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:48.550040+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 135340032 unmapped: 22011904 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 249 ms_handle_reset con 0x55ab8e232400 session 0x55ab902212c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:49.550223+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26a000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 135389184 unmapped: 21962752 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 250 ms_handle_reset con 0x55ab8e26a000 session 0x55ab8d8a3e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 250 ms_handle_reset con 0x55ab8e26d000 session 0x55ab902214a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:50.550405+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 250 ms_handle_reset con 0x55ab8e232400 session 0x55ab8f1032c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.802846909s of 10.001990318s, submitted: 277
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 250 handle_osd_map epochs [250,251], i have 250, src has [1,251]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 251 ms_handle_reset con 0x55ab8bf54800 session 0x55ab90221860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 251 ms_handle_reset con 0x55ab8e36e000 session 0x55ab8dc4e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 136560640 unmapped: 20791296 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 251 heartbeat osd_stat(store_statfs(0x1b569c000/0x0/0x1bfc00000, data 0x3ee9522/0x4091000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:51.550520+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 252 ms_handle_reset con 0x55ab8e36e800 session 0x55ab90221a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 252 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8dce1c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 136617984 unmapped: 20733952 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:52.550703+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 252 ms_handle_reset con 0x55ab8e232400 session 0x55ab8dc512c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 253 ms_handle_reset con 0x55ab8e36e800 session 0x55ab90221e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 136658944 unmapped: 20692992 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1947851 data_alloc: 285212672 data_used: 3330048
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:53.550960+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 ms_handle_reset con 0x55ab8e36e000 session 0x55ab90221c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 ms_handle_reset con 0x55ab8e26d000 session 0x55ab9029fc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 136691712 unmapped: 20660224 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 ms_handle_reset con 0x55ab8e268400 session 0x55ab90898b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 heartbeat osd_stat(store_statfs(0x1b5690000/0x0/0x1bfc00000, data 0x3ef00cc/0x409c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:54.551197+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8f73da40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 ms_handle_reset con 0x55ab8e26d000 session 0x55ab8f398b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 136716288 unmapped: 20635648 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 heartbeat osd_stat(store_statfs(0x1b568f000/0x0/0x1bfc00000, data 0x3ef24c2/0x409f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:55.551371+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 136716288 unmapped: 20635648 heap: 157351936 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:56.551536+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 254 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 ms_handle_reset con 0x55ab8e36e800 session 0x55ab90533860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 148201472 unmapped: 21757952 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:57.551688+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 heartbeat osd_stat(store_statfs(0x1b1675000/0x0/0x1bfc00000, data 0x7f080c0/0x80b9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8db6b000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 137748480 unmapped: 32210944 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 ms_handle_reset con 0x55ab8db6b000 session 0x55ab8dc5c5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:58.551869+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2605717 data_alloc: 285212672 data_used: 3342336
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 137797632 unmapped: 32161792 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:08:59.552006+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142647296 unmapped: 27312128 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:00.552203+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 ms_handle_reset con 0x55ab8bf54800 session 0x55ab924bf680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.890389919s of 10.037495613s, submitted: 309
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142671872 unmapped: 27287552 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:01.552348+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 ms_handle_reset con 0x55ab8e268400 session 0x55ab911feb40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26d000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 138534912 unmapped: 31424512 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:02.552514+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 ms_handle_reset con 0x55ab8e26d000 session 0x55ab911ff0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 heartbeat osd_stat(store_statfs(0x1a865e000/0x0/0x1bfc00000, data 0x10f1c044/0x110cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143892480 unmapped: 26066944 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:03.552725+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90c54c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3564175 data_alloc: 285212672 data_used: 3354624
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 ms_handle_reset con 0x55ab90c54c00 session 0x55ab8dce1c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 139689984 unmapped: 30269440 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:04.553121+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144941056 unmapped: 25018368 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:05.553547+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 ms_handle_reset con 0x55ab8e29d800 session 0x55ab8de2b2c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 handle_osd_map epochs [256,257], i have 256, src has [1,257]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 256 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141352960 unmapped: 28606464 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:06.553773+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8bf54800 session 0x55ab935bb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 49
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 145678336 unmapped: 24281088 heap: 169959424 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:07.553904+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142630912 unmapped: 31531008 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 heartbeat osd_stat(store_statfs(0x19fe1f000/0x0/0x1bfc00000, data 0x197579d8/0x1990e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:08.554123+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8df91000 session 0x55ab935bb0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8e36e000 session 0x55ab8dbfb860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4561019 data_alloc: 285212672 data_used: 3366912
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8e232400 session 0x55ab8f399860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f375800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8f375800 session 0x55ab8db5af00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142426112 unmapped: 31735808 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8bf54800 session 0x55ab935bad20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8df91000 session 0x55ab90220f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 heartbeat osd_stat(store_statfs(0x19e620000/0x0/0x1bfc00000, data 0x1af578a2/0x1b10c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:09.554362+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e36e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8e232400 session 0x55ab935ba5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8e36e000 session 0x55ab8edf2f00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141434880 unmapped: 32727040 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e29b400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26bc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:10.554638+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8e29b400 session 0x55ab8de2a1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8e26bc00 session 0x55ab90221860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 6.613661289s of 10.137508392s, submitted: 712
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 ms_handle_reset con 0x55ab8df91000 session 0x55ab8d8a30e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 258 ms_handle_reset con 0x55ab8bf54800 session 0x55ab924bf4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141139968 unmapped: 33021952 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:11.554811+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f628800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 258 ms_handle_reset con 0x55ab8f628800 session 0x55ab8f103a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e268400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 258 ms_handle_reset con 0x55ab8e268400 session 0x55ab8ff23e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141164544 unmapped: 32997376 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:12.555054+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 258 handle_osd_map epochs [258,259], i have 258, src has [1,259]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 259 ms_handle_reset con 0x55ab8e232400 session 0x55ab8dc510e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141262848 unmapped: 32899072 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 259 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8f10da40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:13.555354+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2088162 data_alloc: 285212672 data_used: 3395584
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26bc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 259 ms_handle_reset con 0x55ab8e26bc00 session 0x55ab924bf0e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f628800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 259 ms_handle_reset con 0x55ab8df91000 session 0x55ab8dc37e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 ms_handle_reset con 0x55ab8f628800 session 0x55ab8db5a960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141221888 unmapped: 32940032 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 handle_osd_map epochs [259,260], i have 260, src has [1,260]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:14.555557+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 ms_handle_reset con 0x55ab8bf54800 session 0x55ab90899860
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141238272 unmapped: 32923648 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 ms_handle_reset con 0x55ab8df91000 session 0x55ab908990e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 heartbeat osd_stat(store_statfs(0x1b5597000/0x0/0x1bfc00000, data 0x3fdd4fd/0x4195000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:15.555773+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141238272 unmapped: 32923648 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:16.555937+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8e232400 session 0x55ab908985a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141320192 unmapped: 32841728 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9094e400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:17.556152+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e371800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab9094e400 session 0x55ab8f399c20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8ff86400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90916800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8ff86400 session 0x55ab8c6af4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8e371800 session 0x55ab905325a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142032896 unmapped: 32129024 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8ff223c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab90916800 session 0x55ab90532960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8df91000 session 0x55ab8f399e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:18.556405+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2151545 data_alloc: 285212672 data_used: 3407872
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e232400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8e232400 session 0x55ab8fe1af00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8bf54800 session 0x55ab9029e960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141950976 unmapped: 32210944 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:19.556703+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab8df91000 session 0x55ab8dfc7e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e371800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 heartbeat osd_stat(store_statfs(0x1b4f0a000/0x0/0x1bfc00000, data 0x4666e0b/0x4824000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142999552 unmapped: 31162368 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90916800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:20.556903+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 ms_handle_reset con 0x55ab90916800 session 0x55ab8f66e3c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 262 ms_handle_reset con 0x55ab8e371800 session 0x55ab8c6af680
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab9094e400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 262 ms_handle_reset con 0x55ab9094e400 session 0x55ab8db5ba40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.504934311s of 10.329587936s, submitted: 228
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143106048 unmapped: 31055872 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 262 ms_handle_reset con 0x55ab8bf54800 session 0x55ab90899e00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:21.557109+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 ms_handle_reset con 0x55ab8df91000 session 0x55ab8fe1a5a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143212544 unmapped: 30949376 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90917400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 ms_handle_reset con 0x55ab90917400 session 0x55ab908992c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:22.557283+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8eded800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 ms_handle_reset con 0x55ab8eded800 session 0x55ab9029e000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e371000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 ms_handle_reset con 0x55ab8e371000 session 0x55ab90898000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143310848 unmapped: 30851072 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:23.557617+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2118866 data_alloc: 285212672 data_used: 3432448
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143310848 unmapped: 30851072 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:24.557849+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 264 heartbeat osd_stat(store_statfs(0x1b551f000/0x0/0x1bfc00000, data 0x404ef91/0x420d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143343616 unmapped: 30818304 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:25.558024+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df91000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 264 ms_handle_reset con 0x55ab8df91000 session 0x55ab8d8a3a40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 264 ms_handle_reset con 0x55ab8bf54800 session 0x55ab908983c0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142950400 unmapped: 31211520 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:26.558265+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f454000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 ms_handle_reset con 0x55ab8f454000 session 0x55ab8cc72b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8db6b400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142958592 unmapped: 31203328 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 ms_handle_reset con 0x55ab8db6b400 session 0x55ab8ea38780
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:27.558414+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 heartbeat osd_stat(store_statfs(0x1b54de000/0x0/0x1bfc00000, data 0x408c237/0x4250000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141803520 unmapped: 32358400 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab90917000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 ms_handle_reset con 0x55ab90917000 session 0x55ab8c6ae1e0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:28.558557+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2138713 data_alloc: 285212672 data_used: 3432448
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 50
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 141508608 unmapped: 32653312 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8e26a800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:29.558720+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 ms_handle_reset con 0x55ab8e26a800 session 0x55ab8dc5dc20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142704640 unmapped: 31457280 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:30.558965+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8bf54800
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 ms_handle_reset con 0x55ab8bf54800 session 0x55ab8ff22960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8db6b400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 ms_handle_reset con 0x55ab8db6b400 session 0x55ab8f102000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142991360 unmapped: 31170560 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:31.559182+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.675200462s of 10.661262512s, submitted: 232
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 142991360 unmapped: 31170560 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:32.559469+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 heartbeat osd_stat(store_statfs(0x1b5450000/0x0/0x1bfc00000, data 0x4115e70/0x42dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143048704 unmapped: 31113216 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:33.559675+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2153147 data_alloc: 285212672 data_used: 3440640
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143237120 unmapped: 30924800 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:34.560006+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143171584 unmapped: 30990336 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:35.560160+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 heartbeat osd_stat(store_statfs(0x1b5415000/0x0/0x1bfc00000, data 0x4153f88/0x4319000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [1,0,1,2])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143450112 unmapped: 30711808 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:36.560335+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 266 handle_osd_map epochs [266,267], i have 266, src has [1,267]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144621568 unmapped: 29540352 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:37.560504+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 267 heartbeat osd_stat(store_statfs(0x1b4fe4000/0x0/0x1bfc00000, data 0x418119b/0x4349000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144629760 unmapped: 29532160 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:38.560689+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2164473 data_alloc: 285212672 data_used: 3452928
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144629760 unmapped: 29532160 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:39.560871+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 143736832 unmapped: 30425088 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:40.561044+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 267 heartbeat osd_stat(store_statfs(0x1b4f47000/0x0/0x1bfc00000, data 0x421b5fe/0x43e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144064512 unmapped: 30097408 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:41.561223+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.640686035s of 10.081563950s, submitted: 117
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144064512 unmapped: 30097408 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:42.561383+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 144089088 unmapped: 30072832 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:43.561563+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166645 data_alloc: 285212672 data_used: 3452928
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 145219584 unmapped: 28942336 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:44.561707+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 145375232 unmapped: 28786688 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:45.561902+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 145743872 unmapped: 28418048 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:46.562149+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 268 heartbeat osd_stat(store_statfs(0x1b4eba000/0x0/0x1bfc00000, data 0x42a9736/0x4473000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 145833984 unmapped: 28327936 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:47.562428+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 145833984 unmapped: 28327936 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:48.562573+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2179455 data_alloc: 285212672 data_used: 3465216
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 146112512 unmapped: 28049408 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:49.562753+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 268 heartbeat osd_stat(store_statfs(0x1b4e63000/0x0/0x1bfc00000, data 0x43032bf/0x44cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 146112512 unmapped: 28049408 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:50.563001+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147161088 unmapped: 27000832 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:51.563191+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147423232 unmapped: 26738688 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.141796112s of 10.608541489s, submitted: 135
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:52.563330+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 269 handle_osd_map epochs [269,270], i have 269, src has [1,270]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147431424 unmapped: 26730496 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:53.563493+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2187949 data_alloc: 285212672 data_used: 3477504
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147578880 unmapped: 26583040 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:54.563618+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147578880 unmapped: 26583040 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:55.563768+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 270 heartbeat osd_stat(store_statfs(0x1b4dc3000/0x0/0x1bfc00000, data 0x43a1506/0x456b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 148733952 unmapped: 25427968 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:56.563950+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 270 heartbeat osd_stat(store_statfs(0x1b4dc3000/0x0/0x1bfc00000, data 0x43a1506/0x456b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 148733952 unmapped: 25427968 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:57.564161+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147718144 unmapped: 26443776 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:58.564285+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2203451 data_alloc: 285212672 data_used: 3477504
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147742720 unmapped: 26419200 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:09:59.564433+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147742720 unmapped: 26419200 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:00.564588+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 147570688 unmapped: 26591232 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 270 heartbeat osd_stat(store_statfs(0x1b4d4d000/0x0/0x1bfc00000, data 0x4415c1d/0x45e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:01.564728+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 148717568 unmapped: 25444352 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:02.564940+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 148717568 unmapped: 25444352 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:03.565161+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2220323 data_alloc: 285212672 data_used: 3489792
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 148905984 unmapped: 25255936 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:04.565377+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.858432770s of 12.323176384s, submitted: 130
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8f628000
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 149004288 unmapped: 25157632 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:05.565566+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b4cd5000/0x0/0x1bfc00000, data 0x4489f79/0x4659000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 149004288 unmapped: 25157632 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:06.565746+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 149004288 unmapped: 25157632 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:07.565942+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 149004288 unmapped: 25157632 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:08.566134+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2218651 data_alloc: 285212672 data_used: 3489792
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b4cd6000/0x0/0x1bfc00000, data 0x4489e68/0x4658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 149028864 unmapped: 25133056 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:09.566279+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 149110784 unmapped: 25051136 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:10.566530+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 150683648 unmapped: 23478272 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:11.566740+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 152010752 unmapped: 22151168 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:12.566940+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b3a3c000/0x0/0x1bfc00000, data 0x457e88f/0x4751000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 152018944 unmapped: 22142976 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:13.567116+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2256807 data_alloc: 285212672 data_used: 3489792
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 151117824 unmapped: 23044096 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:14.567274+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.689379692s of 10.105747223s, submitted: 92
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153305088 unmapped: 20856832 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:15.567457+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153509888 unmapped: 20652032 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:16.567605+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b39c9000/0x0/0x1bfc00000, data 0x45f4457/0x47c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153616384 unmapped: 20545536 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:17.567769+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153632768 unmapped: 20529152 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:18.567956+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2251981 data_alloc: 285212672 data_used: 3489792
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153206784 unmapped: 20955136 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:19.568117+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b3967000/0x0/0x1bfc00000, data 0x46568e9/0x4827000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153214976 unmapped: 20946944 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:20.568285+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153223168 unmapped: 20938752 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:21.568452+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153485312 unmapped: 20676608 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:22.568609+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153485312 unmapped: 20676608 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:23.568797+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b38dd000/0x0/0x1bfc00000, data 0x46e1f06/0x48b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2253115 data_alloc: 285212672 data_used: 3489792
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 154550272 unmapped: 19611648 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:24.568995+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.573098183s of 10.007849693s, submitted: 89
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153460736 unmapped: 20701184 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:25.569160+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153460736 unmapped: 20701184 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:26.569293+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 153460736 unmapped: 20701184 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:27.569442+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b385a000/0x0/0x1bfc00000, data 0x4762c14/0x4933000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 155959296 unmapped: 18202624 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:28.569567+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2279511 data_alloc: 285212672 data_used: 3489792
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:29.569738+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 17940480 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:30.569962+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 156393472 unmapped: 17768448 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:31.570132+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 155901952 unmapped: 18259968 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 272 heartbeat osd_stat(store_statfs(0x1b25c6000/0x0/0x1bfc00000, data 0x4854ef9/0x4a27000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:32.570334+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157016064 unmapped: 17145856 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:33.570494+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157040640 unmapped: 17121280 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2298373 data_alloc: 285212672 data_used: 3506176
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:34.570696+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157270016 unmapped: 16891904 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.743079185s of 10.265683174s, submitted: 134
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 272 heartbeat osd_stat(store_statfs(0x1b259a000/0x0/0x1bfc00000, data 0x487ed19/0x4a52000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:35.570848+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157392896 unmapped: 16769024 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:36.571018+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157401088 unmapped: 16760832 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 272 handle_osd_map epochs [272,273], i have 272, src has [1,273]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:37.571198+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157433856 unmapped: 16728064 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:38.571359+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157433856 unmapped: 16728064 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2319389 data_alloc: 285212672 data_used: 3518464
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:39.571532+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158466048 unmapped: 15695872 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:40.571768+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158679040 unmapped: 15482880 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b248b000/0x0/0x1bfc00000, data 0x49895bc/0x4b5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:41.571978+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158679040 unmapped: 15482880 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:42.572177+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158769152 unmapped: 15392768 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:43.572307+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157720576 unmapped: 16441344 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2313245 data_alloc: 285212672 data_used: 3518464
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:44.572448+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 157720576 unmapped: 16441344 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.503775597s of 10.005803108s, submitted: 126
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b240a000/0x0/0x1bfc00000, data 0x4a10ceb/0x4be4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:45.572629+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158769152 unmapped: 15392768 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:46.572739+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158949376 unmapped: 15212544 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:47.572921+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 158949376 unmapped: 15212544 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:48.573116+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 159031296 unmapped: 15130624 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2331841 data_alloc: 285212672 data_used: 3518464
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b231a000/0x0/0x1bfc00000, data 0x4b00821/0x4cd4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:49.573316+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 159498240 unmapped: 14663680 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:50.573538+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 160620544 unmapped: 13541376 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:51.573691+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 160735232 unmapped: 13426688 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:52.573857+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 160964608 unmapped: 13197312 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:53.574013+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b2283000/0x0/0x1bfc00000, data 0x4b949f1/0x4d6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161103872 unmapped: 13058048 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2348283 data_alloc: 285212672 data_used: 3518464
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:54.574188+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161218560 unmapped: 12943360 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.528294563s of 10.000494957s, submitted: 110
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b2247000/0x0/0x1bfc00000, data 0x4bd186c/0x4da6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:55.574443+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161570816 unmapped: 12591104 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:56.574581+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162635776 unmapped: 11526144 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:57.574756+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162643968 unmapped: 11517952 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:58.574894+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162938880 unmapped: 11223040 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2342537 data_alloc: 285212672 data_used: 3518464
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:10:59.575052+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162938880 unmapped: 11223040 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:00.575293+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162947072 unmapped: 11214848 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b21f1000/0x0/0x1bfc00000, data 0x4c27904/0x4dfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:01.575446+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162955264 unmapped: 11206656 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:02.575567+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162955264 unmapped: 11206656 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:03.575764+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162955264 unmapped: 11206656 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 274 heartbeat osd_stat(store_statfs(0x1b21ed000/0x0/0x1bfc00000, data 0x4c29c7a/0x4dff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2351305 data_alloc: 285212672 data_used: 3530752
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:04.575899+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162955264 unmapped: 11206656 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:05.576150+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.236233711s of 10.486874580s, submitted: 85
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161783808 unmapped: 12378112 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 274 heartbeat osd_stat(store_statfs(0x1b21ed000/0x0/0x1bfc00000, data 0x4c29cc0/0x4dff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:06.576304+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161792000 unmapped: 12369920 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 51
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 274 heartbeat osd_stat(store_statfs(0x1b21ee000/0x0/0x1bfc00000, data 0x4c29c25/0x4dfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:07.576497+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161939456 unmapped: 12222464 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:08.576649+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161939456 unmapped: 12222464 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2349899 data_alloc: 285212672 data_used: 3530752
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:09.576808+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161947648 unmapped: 12214272 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:10.577036+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161947648 unmapped: 12214272 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:11.577204+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161955840 unmapped: 12206080 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:12.577367+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161972224 unmapped: 12189696 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 275 heartbeat osd_stat(store_statfs(0x1b21eb000/0x0/0x1bfc00000, data 0x4c2be52/0x4e01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:13.577515+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161980416 unmapped: 12181504 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2353645 data_alloc: 285212672 data_used: 3543040
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:14.577660+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161980416 unmapped: 12181504 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:15.577819+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 161988608 unmapped: 12173312 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.968935013s of 10.118656158s, submitted: 46
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 276 heartbeat osd_stat(store_statfs(0x1b21ea000/0x0/0x1bfc00000, data 0x4c2e173/0x4e03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:16.577953+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162029568 unmapped: 12132352 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:17.578119+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162029568 unmapped: 12132352 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:18.578294+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 276 heartbeat osd_stat(store_statfs(0x1b21e8000/0x0/0x1bfc00000, data 0x4c2e1c8/0x4e04000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162045952 unmapped: 12115968 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2367681 data_alloc: 285212672 data_used: 3555328
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:19.578456+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 162062336 unmapped: 12099584 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:20.578687+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 163274752 unmapped: 10887168 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 276 heartbeat osd_stat(store_statfs(0x1b2187000/0x0/0x1bfc00000, data 0x4c8c9e1/0x4e64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:21.578858+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 163274752 unmapped: 10887168 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 276 handle_osd_map epochs [276,277], i have 276, src has [1,277]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:22.579013+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 163684352 unmapped: 10477568 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:23.579180+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 163692544 unmapped: 10469376 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2389099 data_alloc: 285212672 data_used: 3567616
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:24.579368+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 163741696 unmapped: 10420224 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:25.579523+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 277 heartbeat osd_stat(store_statfs(0x1b20cd000/0x0/0x1bfc00000, data 0x4d44925/0x4f1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 163930112 unmapped: 10231808 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.999635696s of 10.439295769s, submitted: 142
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:26.579662+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 165216256 unmapped: 8945664 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:27.579801+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 165265408 unmapped: 8896512 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:28.579948+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 165543936 unmapped: 8617984 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2394663 data_alloc: 285212672 data_used: 3567616
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:29.580104+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 164323328 unmapped: 9838592 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:30.580288+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 164487168 unmapped: 9674752 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 277 heartbeat osd_stat(store_statfs(0x1b2009000/0x0/0x1bfc00000, data 0x4e0fe4b/0x4fe5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:31.580444+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 164634624 unmapped: 9527296 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:32.580612+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 164634624 unmapped: 9527296 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 278 handle_osd_map epochs [278,278], i have 278, src has [1,278]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:33.580819+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 165691392 unmapped: 8470528 heap: 174161920 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2399203 data_alloc: 285212672 data_used: 3579904
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 278 heartbeat osd_stat(store_statfs(0x1b1b75000/0x0/0x1bfc00000, data 0x4ea111f/0x5079000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:34.580995+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 171352064 unmapped: 12255232 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:35.581141+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 166584320 unmapped: 17022976 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 278 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 278 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 278 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 279 ms_handle_reset con 0x55ab8f628000 session 0x55ab8dd06960
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8dcfb400
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:36.581316+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 279 handle_osd_map epochs [279,280], i have 279, src has [1,280]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.844315529s of 10.479787827s, submitted: 450
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 167403520 unmapped: 16203776 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 ms_handle_reset con 0x55ab8dcfb400 session 0x55ab8dce0b40
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 52
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:37.581573+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 167534592 unmapped: 16072704 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:38.581803+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 167534592 unmapped: 16072704 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2420709 data_alloc: 285212672 data_used: 3592192
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:39.581956+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 heartbeat osd_stat(store_statfs(0x1b1ac6000/0x0/0x1bfc00000, data 0x4f4bc3b/0x5128000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 168607744 unmapped: 14999552 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 heartbeat osd_stat(store_statfs(0x1b1a7d000/0x0/0x1bfc00000, data 0x4f92438/0x5170000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:40.582146+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 280 handle_osd_map epochs [280,281], i have 280, src has [1,281]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 168312832 unmapped: 15294464 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 281 handle_osd_map epochs [281,281], i have 281, src has [1,281]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:41.582284+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 168312832 unmapped: 15294464 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:42.582434+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 282 handle_osd_map epochs [281,282], i have 282, src has [1,282]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 168419328 unmapped: 15187968 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:43.582613+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 167936000 unmapped: 15671296 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2439129 data_alloc: 285212672 data_used: 3604480
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:44.583425+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 167944192 unmapped: 15663104 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 282 heartbeat osd_stat(store_statfs(0x1b19ba000/0x0/0x1bfc00000, data 0x5053bc8/0x5234000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:45.583556+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 167944192 unmapped: 15663104 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 282 handle_osd_map epochs [282,283], i have 282, src has [1,283]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:46.584219+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 169140224 unmapped: 14467072 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.607948303s of 10.409277916s, submitted: 215
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:47.584601+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 169148416 unmapped: 14458880 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:48.584919+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 169148416 unmapped: 14458880 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2460169 data_alloc: 285212672 data_used: 3616768
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 284 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:49.585180+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 169320448 unmapped: 14286848 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:50.585515+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 169402368 unmapped: 14204928 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b18de000/0x0/0x1bfc00000, data 0x512cd4d/0x5310000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:51.585800+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 284 handle_osd_map epochs [284,285], i have 284, src has [1,285]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 169410560 unmapped: 14196736 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:52.586155+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 285 handle_osd_map epochs [286,286], i have 285, src has [1,286]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 170483712 unmapped: 13123584 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:53.586403+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 286 heartbeat osd_stat(store_statfs(0x1b187f000/0x0/0x1bfc00000, data 0x518877e/0x536e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 171540480 unmapped: 12066816 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2474397 data_alloc: 285212672 data_used: 3645440
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:54.586773+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 171540480 unmapped: 12066816 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:55.587001+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 171786240 unmapped: 11821056 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:56.587209+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 171786240 unmapped: 11821056 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:57.587293+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.214418411s of 10.562563896s, submitted: 118
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 172032000 unmapped: 11575296 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 286 heartbeat osd_stat(store_statfs(0x1b1813000/0x0/0x1bfc00000, data 0x51f4819/0x53db000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:58.587479+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 171081728 unmapped: 12525568 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2489717 data_alloc: 285212672 data_used: 3645440
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:11:59.587697+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 172138496 unmapped: 11468800 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:00.587922+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 286 heartbeat osd_stat(store_statfs(0x1b17a7000/0x0/0x1bfc00000, data 0x525db3a/0x5446000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 173441024 unmapped: 10166272 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:01.588058+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 286 handle_osd_map epochs [287,288], i have 286, src has [1,288]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 173531136 unmapped: 10076160 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:02.588208+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 173547520 unmapped: 10059776 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:03.588386+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 288 heartbeat osd_stat(store_statfs(0x1b1735000/0x0/0x1bfc00000, data 0x52cc023/0x54b8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 173637632 unmapped: 9969664 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 288 heartbeat osd_stat(store_statfs(0x1b1715000/0x0/0x1bfc00000, data 0x52ec93e/0x54d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2503489 data_alloc: 285212672 data_used: 3657728
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:04.588525+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 174112768 unmapped: 9494528 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 288 handle_osd_map epochs [288,289], i have 288, src has [1,289]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:05.588732+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175497216 unmapped: 8110080 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:06.588918+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175702016 unmapped: 7905280 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:07.589109+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 289 heartbeat osd_stat(store_statfs(0x1b165e000/0x0/0x1bfc00000, data 0x53a3cdc/0x558f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.679487228s of 10.198865891s, submitted: 166
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175955968 unmapped: 7651328 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:08.589298+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175988736 unmapped: 7618560 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2516727 data_alloc: 285212672 data_used: 3670016
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:09.589502+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175996928 unmapped: 7610368 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:10.589708+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 174931968 unmapped: 8675328 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:11.589856+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 289 handle_osd_map epochs [289,290], i have 289, src has [1,290]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 174809088 unmapped: 8798208 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:12.589988+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175890432 unmapped: 7716864 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1b1567000/0x0/0x1bfc00000, data 0x54980db/0x5685000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:13.590146+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176070656 unmapped: 7536640 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2533049 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:14.590295+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176070656 unmapped: 7536640 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:15.590443+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175636480 unmapped: 7970816 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:16.590607+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 175915008 unmapped: 7692288 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:17.590803+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1b1507000/0x0/0x1bfc00000, data 0x54f9590/0x56e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176054272 unmapped: 7553024 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:18.590965+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176054272 unmapped: 7553024 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2530727 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:19.591146+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.405799866s of 11.713174820s, submitted: 72
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176062464 unmapped: 7544832 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:20.591331+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176062464 unmapped: 7544832 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:21.591538+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 176177152 unmapped: 7430144 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:22.591745+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1b1477000/0x0/0x1bfc00000, data 0x5587e5a/0x5776000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 177291264 unmapped: 6316032 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:23.591932+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 177291264 unmapped: 6316032 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2545469 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:24.592125+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 177668096 unmapped: 5939200 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:25.592335+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 177397760 unmapped: 6209536 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:26.592481+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 177422336 unmapped: 6184960 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:27.592675+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1b020d000/0x0/0x1bfc00000, data 0x5654ac2/0x5841000, compress 0x0/0x0/0x0, omap 0x649, meta 0xa1af9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1b020d000/0x0/0x1bfc00000, data 0x5654ac2/0x5841000, compress 0x0/0x0/0x0, omap 0x649, meta 0xa1af9b7), peers [0,2,3,4,5] op hist [0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 178479104 unmapped: 5128192 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:28.592868+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 178806784 unmapped: 4800512 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1af052000/0x0/0x1bfc00000, data 0x566fead/0x585c000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2554669 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:29.593192+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.045596123s of 10.347048759s, submitted: 58
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 4702208 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:30.593435+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179331072 unmapped: 4276224 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:31.593587+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179838976 unmapped: 3768320 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:32.593784+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179838976 unmapped: 3768320 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:33.593927+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aef59000/0x0/0x1bfc00000, data 0x5766d0d/0x5954000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179871744 unmapped: 3735552 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2567205 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:34.594176+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179871744 unmapped: 3735552 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:35.594343+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 178200576 unmapped: 5406720 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:36.594569+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179396608 unmapped: 4210688 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aef17000/0x0/0x1bfc00000, data 0x57a9efd/0x5997000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:37.594768+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 ms_handle_reset con 0x55ab8f197000 session 0x55ab8dc06d20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edeec00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179404800 unmapped: 4202496 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:38.594960+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179404800 unmapped: 4202496 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2565561 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:39.595105+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179404800 unmapped: 4202496 heap: 183607296 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.010607719s of 10.241950989s, submitted: 48
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:40.595304+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aeee7000/0x0/0x1bfc00000, data 0x57d9a1f/0x59c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179699712 unmapped: 4956160 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:41.595500+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179716096 unmapped: 4939776 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:42.595693+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 179716096 unmapped: 4939776 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:43.595900+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aee9e000/0x0/0x1bfc00000, data 0x582351d/0x5a10000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 180109312 unmapped: 4546560 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aee9e000/0x0/0x1bfc00000, data 0x582351d/0x5a10000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:44.596098+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2576359 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 180109312 unmapped: 4546560 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:45.596503+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 180109312 unmapped: 4546560 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:46.596699+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 180109312 unmapped: 4546560 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aee98000/0x0/0x1bfc00000, data 0x582949a/0x5a16000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:47.596877+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181157888 unmapped: 3497984 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:48.597057+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181157888 unmapped: 3497984 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:49.597287+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2580287 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.834632874s of 10.000970840s, submitted: 33
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181215232 unmapped: 3440640 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:50.597471+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181248000 unmapped: 3407872 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:51.597641+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181248000 unmapped: 3407872 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aede1000/0x0/0x1bfc00000, data 0x58e1165/0x5acd000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:52.597834+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181501952 unmapped: 3153920 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:53.598044+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aede1000/0x0/0x1bfc00000, data 0x58e1165/0x5acd000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181501952 unmapped: 3153920 heap: 184655872 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:54.598180+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587239 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aede1000/0x0/0x1bfc00000, data 0x58e1165/0x5acd000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181600256 unmapped: 4104192 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:55.598320+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181600256 unmapped: 4104192 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aedb9000/0x0/0x1bfc00000, data 0x5908aaa/0x5af5000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:56.598450+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181608448 unmapped: 4096000 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aedb9000/0x0/0x1bfc00000, data 0x5908aaa/0x5af5000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:57.598587+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181608448 unmapped: 4096000 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:58.598774+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aed95000/0x0/0x1bfc00000, data 0x592d073/0x5b19000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181649408 unmapped: 4055040 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:12:59.598957+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2588275 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.838538170s of 10.004695892s, submitted: 30
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181649408 unmapped: 4055040 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:00.599174+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181649408 unmapped: 4055040 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:01.599303+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aed53000/0x0/0x1bfc00000, data 0x596f3f4/0x5b5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181657600 unmapped: 4046848 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:02.599475+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182829056 unmapped: 2875392 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:03.599661+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182829056 unmapped: 2875392 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aed03000/0x0/0x1bfc00000, data 0x59beac1/0x5bab000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:04.599868+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2594883 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183058432 unmapped: 2646016 heap: 185704448 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:05.600047+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181936128 unmapped: 4816896 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:06.600271+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aecba000/0x0/0x1bfc00000, data 0x5a07c66/0x5bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 181936128 unmapped: 4816896 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:07.600469+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182165504 unmapped: 4587520 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aecba000/0x0/0x1bfc00000, data 0x5a07c66/0x5bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:08.600667+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182165504 unmapped: 4587520 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:09.600863+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2597839 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.836727142s of 10.002305031s, submitted: 28
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182484992 unmapped: 4268032 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:10.601142+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182484992 unmapped: 4268032 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:11.601351+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182484992 unmapped: 4268032 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aec69000/0x0/0x1bfc00000, data 0x5a58c9a/0x5c45000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:12.601541+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182566912 unmapped: 4186112 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:13.601689+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aec39000/0x0/0x1bfc00000, data 0x5a887ed/0x5c75000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182616064 unmapped: 4136960 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:14.601885+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2599159 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182616064 unmapped: 4136960 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:15.602050+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aec0b000/0x0/0x1bfc00000, data 0x5ab69f0/0x5ca3000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182616064 unmapped: 4136960 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:16.602257+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182624256 unmapped: 4128768 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:17.602418+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182624256 unmapped: 4128768 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:18.602573+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182624256 unmapped: 4128768 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aec0b000/0x0/0x1bfc00000, data 0x5ab69f0/0x5ca3000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:19.602782+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2602623 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182624256 unmapped: 4128768 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:20.603058+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182632448 unmapped: 4120576 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:21.603439+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182632448 unmapped: 4120576 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:22.603714+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182632448 unmapped: 4120576 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:23.604530+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aec0b000/0x0/0x1bfc00000, data 0x5ab6e9e/0x5ca3000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182632448 unmapped: 4120576 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:24.604879+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2602639 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.949788094s of 14.993952751s, submitted: 11
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182632448 unmapped: 4120576 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:25.605285+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182640640 unmapped: 4112384 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:26.605535+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182640640 unmapped: 4112384 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:27.605686+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182640640 unmapped: 4112384 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:28.606098+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183697408 unmapped: 3055616 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:29.606468+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2605839 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aebc2000/0x0/0x1bfc00000, data 0x5afef9c/0x5cec000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183697408 unmapped: 3055616 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:30.606839+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183697408 unmapped: 3055616 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:31.607112+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183697408 unmapped: 3055616 heap: 186753024 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:32.607311+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:33.607715+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:34.608010+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2615819 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.884501457s of 10.001825333s, submitted: 26
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182173696 unmapped: 5627904 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:35.608229+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aeb47000/0x0/0x1bfc00000, data 0x5b77cc6/0x5d67000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182173696 unmapped: 5627904 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:36.608520+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182255616 unmapped: 5545984 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:37.608764+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182255616 unmapped: 5545984 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:38.609010+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182255616 unmapped: 5545984 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:39.609234+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2619955 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aeb2b000/0x0/0x1bfc00000, data 0x5b92a47/0x5d83000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182419456 unmapped: 5382144 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:40.609489+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182419456 unmapped: 5382144 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:41.609700+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:42.609905+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182419456 unmapped: 5382144 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:43.610120+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182419456 unmapped: 5382144 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:44.610334+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182419456 unmapped: 5382144 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2628423 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.882828712s of 10.009747505s, submitted: 24
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:45.610511+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182460416 unmapped: 5341184 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aeaa3000/0x0/0x1bfc00000, data 0x5c17c32/0x5e0a000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:46.610737+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182460416 unmapped: 5341184 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:47.610933+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182657024 unmapped: 5144576 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:48.611126+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182657024 unmapped: 5144576 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:49.611348+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182657024 unmapped: 5144576 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2632111 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aeaa3000/0x0/0x1bfc00000, data 0x5c17c32/0x5e0a000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:50.611577+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182779904 unmapped: 5021696 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:51.611763+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182779904 unmapped: 5021696 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6e000/0x0/0x1bfc00000, data 0x5c4d75a/0x5e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:52.611971+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182779904 unmapped: 5021696 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:53.612140+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:54.612295+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2629031 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.914077759s of 10.006759644s, submitted: 22
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:55.612444+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6f000/0x0/0x1bfc00000, data 0x5c4d664/0x5e3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:56.612611+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:57.612743+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:58.612847+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea71000/0x0/0x1bfc00000, data 0x5c4d59d/0x5e3d000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:13:59.612995+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 182861824 unmapped: 4939776 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2627171 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:00.613182+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183910400 unmapped: 3891200 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:01.613400+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6e000/0x0/0x1bfc00000, data 0x5c4d6c5/0x5e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:02.613505+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:03.613576+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:04.613724+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2632779 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6d000/0x0/0x1bfc00000, data 0x5c4d760/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:05.613892+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:06.614032+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6d000/0x0/0x1bfc00000, data 0x5c4d760/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:07.614187+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:08.614355+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 3883008 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:09.614545+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183934976 unmapped: 3866624 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2632779 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6d000/0x0/0x1bfc00000, data 0x5c4d760/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:10.614767+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:11.614932+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6d000/0x0/0x1bfc00000, data 0x5c4d760/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:12.615142+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.179527283s of 18.225175858s, submitted: 11
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:13.615318+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:14.615529+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2632427 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:15.615666+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:16.615810+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183943168 unmapped: 3858432 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:17.615985+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183951360 unmapped: 3850240 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea6d000/0x0/0x1bfc00000, data 0x5c4d72f/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:18.616132+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183951360 unmapped: 3850240 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:19.616311+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183951360 unmapped: 3850240 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2629665 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:20.616519+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183951360 unmapped: 3850240 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:21.616665+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183951360 unmapped: 3850240 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:22.616868+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea72000/0x0/0x1bfc00000, data 0x5c4d502/0x5e3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:23.617048+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:24.617244+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2628285 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.734083176s of 11.791581154s, submitted: 13
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:25.617423+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:26.617603+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:27.617753+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea73000/0x0/0x1bfc00000, data 0x5c4d4d2/0x5e3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:28.617906+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:29.618037+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2626215 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:30.618272+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:31.618421+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 3842048 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:32.618589+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183967744 unmapped: 3833856 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:33.618737+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:34.618902+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2626215 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:35.619140+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:36.619429+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:37.619612+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:38.619767+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:39.619915+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2626391 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:40.620162+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183975936 unmapped: 3825664 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:41.620339+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183984128 unmapped: 3817472 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:42.620545+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:43.620679+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:44.620882+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2626391 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:45.621044+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:46.621226+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:47.621415+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:48.621628+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea75000/0x0/0x1bfc00000, data 0x5c4d407/0x5e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 183992320 unmapped: 3809280 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:49.621798+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184000512 unmapped: 3801088 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2626391 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:50.621954+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184000512 unmapped: 3801088 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:51.622140+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 26.509159088s of 26.526023865s, submitted: 4
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184000512 unmapped: 3801088 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:52.622290+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184000512 unmapped: 3801088 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea74000/0x0/0x1bfc00000, data 0x5c4d4a2/0x5e3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:53.622423+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184000512 unmapped: 3801088 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:54.622762+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184000512 unmapped: 3801088 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2628159 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:55.622894+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184016896 unmapped: 3784704 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea73000/0x0/0x1bfc00000, data 0x5c4d53d/0x5e3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:56.623042+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184016896 unmapped: 3784704 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:57.623187+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184025088 unmapped: 3776512 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:58.623353+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184025088 unmapped: 3776512 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:14:59.623515+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2631343 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:00.623739+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:01.623901+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea72000/0x0/0x1bfc00000, data 0x5c4d5d8/0x5e3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:02.624059+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea72000/0x0/0x1bfc00000, data 0x5c4d5d8/0x5e3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:03.624215+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.586763382s of 12.612449646s, submitted: 4
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:04.624347+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2629399 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:05.624497+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea73000/0x0/0x1bfc00000, data 0x5c4d53d/0x5e3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184033280 unmapped: 3768320 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea5a000/0x0/0x1bfc00000, data 0x5c656c2/0x5e54000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:06.624685+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea5a000/0x0/0x1bfc00000, data 0x5c656c2/0x5e54000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184049664 unmapped: 3751936 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea54000/0x0/0x1bfc00000, data 0x5c6bc7c/0x5e5a000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:07.624827+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184066048 unmapped: 3735552 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:08.624975+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184147968 unmapped: 3653632 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:09.625114+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184303616 unmapped: 3497984 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2638047 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:10.625314+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184303616 unmapped: 3497984 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:11.625507+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1aea15000/0x0/0x1bfc00000, data 0x5cab518/0x5e99000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184279040 unmapped: 3522560 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:12.625648+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184434688 unmapped: 3366912 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:13.625790+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184434688 unmapped: 3366912 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:14.625918+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184434688 unmapped: 3366912 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.736823082s of 10.829957962s, submitted: 17
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2638327 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1ae9ce000/0x0/0x1bfc00000, data 0x5cf3d1d/0x5ee0000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:15.626054+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:16.626219+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:17.626371+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1ae9ce000/0x0/0x1bfc00000, data 0x5cf3d1d/0x5ee0000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:18.626516+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:19.626720+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2638631 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:20.626921+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1ae9b9000/0x0/0x1bfc00000, data 0x5d086d1/0x5ef5000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:21.627196+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 3252224 heap: 187801600 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:22.627425+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 4300800 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:23.627650+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 4300800 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:24.627829+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184549376 unmapped: 4300800 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.947232246s of 10.000511169s, submitted: 12
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2642383 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1ae99f000/0x0/0x1bfc00000, data 0x5d233b7/0x5f0f000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:25.628139+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 4292608 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:26.628409+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 4292608 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:27.629668+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 4292608 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:28.629803+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 4292608 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:29.629951+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184647680 unmapped: 4202496 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2647019 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:30.630382+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1ae938000/0x0/0x1bfc00000, data 0x5d8992c/0x5f76000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184664064 unmapped: 4186112 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:31.631234+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184664064 unmapped: 4186112 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:32.632039+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184664064 unmapped: 4186112 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:33.633721+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 184901632 unmapped: 3948544 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 heartbeat osd_stat(store_statfs(0x1ae8f8000/0x0/0x1bfc00000, data 0x5dc9529/0x5fb6000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:34.633880+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185950208 unmapped: 2899968 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.905405998s of 10.000433922s, submitted: 17
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2648475 data_alloc: 285212672 data_used: 3682304
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:35.634100+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 290 handle_osd_map epochs [290,291], i have 290, src has [1,291]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185950208 unmapped: 2899968 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:36.634252+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:37.634406+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:38.634569+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 291 heartbeat osd_stat(store_statfs(0x1ae8e7000/0x0/0x1bfc00000, data 0x5dd81bf/0x5fc6000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:39.634721+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2654557 data_alloc: 285212672 data_used: 3694592
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:40.635164+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:41.635689+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:42.636126+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185966592 unmapped: 2883584 heap: 188850176 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:43.636277+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185974784 unmapped: 3923968 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:44.636442+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.683199883s of 10.000146866s, submitted: 48
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185974784 unmapped: 3923968 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 291 heartbeat osd_stat(store_statfs(0x1ae8a0000/0x0/0x1bfc00000, data 0x5e1fd52/0x600e000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2655021 data_alloc: 285212672 data_used: 3694592
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:45.636638+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185974784 unmapped: 3923968 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:46.636857+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185974784 unmapped: 3923968 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:47.637030+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 291 heartbeat osd_stat(store_statfs(0x1ae887000/0x0/0x1bfc00000, data 0x5e3971e/0x6027000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 291 handle_osd_map epochs [292,292], i have 291, src has [1,292]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 291 handle_osd_map epochs [292,292], i have 292, src has [1,292]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 291 handle_osd_map epochs [292,292], i have 292, src has [1,292]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 292 heartbeat osd_stat(store_statfs(0x1ae882000/0x0/0x1bfc00000, data 0x5e3b9c2/0x602b000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185999360 unmapped: 3899392 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:48.637196+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 292 heartbeat osd_stat(store_statfs(0x1ae882000/0x0/0x1bfc00000, data 0x5e3b9c2/0x602b000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185999360 unmapped: 3899392 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:49.637390+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 185999360 unmapped: 3899392 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2662839 data_alloc: 285212672 data_used: 3706880
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:50.637677+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186007552 unmapped: 3891200 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:51.637870+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 292 heartbeat osd_stat(store_statfs(0x1ae838000/0x0/0x1bfc00000, data 0x5e855c8/0x6076000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186007552 unmapped: 3891200 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:52.638019+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186015744 unmapped: 3883008 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:53.638148+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186261504 unmapped: 3637248 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:54.638400+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186261504 unmapped: 3637248 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.994967461s of 10.124125481s, submitted: 37
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2666047 data_alloc: 285212672 data_used: 3706880
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:55.638681+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186277888 unmapped: 3620864 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:56.638826+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _renew_subs
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _send_mon_message to mon.np0005538514 at v2:172.18.0.104:3300/0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 292 handle_osd_map epochs [293,293], i have 292, src has [1,293]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186286080 unmapped: 3612672 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:57.639040+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 293 heartbeat osd_stat(store_statfs(0x1ae7ea000/0x0/0x1bfc00000, data 0x5ed130d/0x60c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186286080 unmapped: 3612672 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:58.639216+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 293 heartbeat osd_stat(store_statfs(0x1ae7ea000/0x0/0x1bfc00000, data 0x5ed130d/0x60c3000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186286080 unmapped: 3612672 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:15:59.639447+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 3481600 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673325 data_alloc: 285212672 data_used: 3719168
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:00.639638+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 3481600 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:01.639778+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 3481600 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:02.639965+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 293 heartbeat osd_stat(store_statfs(0x1ae7ab000/0x0/0x1bfc00000, data 0x5f11ae2/0x6103000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186474496 unmapped: 3424256 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:03.640110+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186474496 unmapped: 3424256 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:04.640275+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186474496 unmapped: 3424256 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673325 data_alloc: 285212672 data_used: 3719168
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:05.640504+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 186474496 unmapped: 3424256 heap: 189898752 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:06.640748+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 293 handle_osd_map epochs [293,294], i have 293, src has [1,294]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.600185394s of 11.673746109s, submitted: 45
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187531264 unmapped: 3416064 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:07.640968+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187531264 unmapped: 3416064 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:08.641169+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:09.641359+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:10.641549+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:11.641688+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:12.641952+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:13.642197+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 ms_handle_reset con 0x55ab8df90c00 session 0x55ab8c81d4a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8edefc00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 ms_handle_reset con 0x55ab8e29c800 session 0x55ab8cbead20
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: handle_auth_request added challenge on 0x55ab8df90c00
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:14.642386+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:15.642598+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187539456 unmapped: 3407872 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:16.642765+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:17.642946+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:18.643150+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:19.643351+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:20.643530+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:21.643648+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:22.643769+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:23.644002+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187547648 unmapped: 3399680 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:24.644158+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:25.644300+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:26.644449+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:27.644592+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:28.644757+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:29.644989+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:30.645159+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:31.645328+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187555840 unmapped: 3391488 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:32.645558+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187564032 unmapped: 3383296 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:33.645782+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187564032 unmapped: 3383296 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:34.646204+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187564032 unmapped: 3383296 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:35.646845+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187564032 unmapped: 3383296 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:36.647181+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187564032 unmapped: 3383296 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:37.647754+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187564032 unmapped: 3383296 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:38.647942+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187572224 unmapped: 3375104 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:39.648384+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187572224 unmapped: 3375104 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:40.648677+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:41.649059+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:42.649366+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:43.649692+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:44.649923+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:45.650227+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:46.650401+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:47.650666+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:48.650860+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:49.651193+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:50.655389+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187588608 unmapped: 3358720 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:51.655953+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187596800 unmapped: 3350528 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:52.656345+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187596800 unmapped: 3350528 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:53.656510+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187596800 unmapped: 3350528 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:54.656687+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187596800 unmapped: 3350528 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:55.656851+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187596800 unmapped: 3350528 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:56.656998+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187604992 unmapped: 3342336 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:57.657164+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187604992 unmapped: 3342336 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:58.657530+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187604992 unmapped: 3342336 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:16:59.657715+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187604992 unmapped: 3342336 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:00.657998+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2674295 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187604992 unmapped: 3342336 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:01.658189+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187604992 unmapped: 3342336 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:02.658388+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187613184 unmapped: 3334144 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:03.658597+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187613184 unmapped: 3334144 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:04.658736+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:05.658965+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187621376 unmapped: 3325952 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 58.627788544s of 58.669982910s, submitted: 30
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 ms_handle_reset con 0x55ab8b28a400 session 0x55ab8dd074a0
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673767 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a6000/0x0/0x1bfc00000, data 0x5f13d86/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [1,0,1])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:06.659222+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188071936 unmapped: 2875392 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Got map version 53
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3214120196,v1:172.18.0.108:6811/3214120196]
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:07.659370+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:08.659555+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:09.659908+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:10.660125+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:11.660245+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:12.660393+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:13.660568+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:14.660768+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:15.660946+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:16.661111+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:17.661258+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:18.661452+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:19.661659+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:20.661923+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188219392 unmapped: 2727936 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:21.662120+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:22.662320+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:23.662538+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:24.662701+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:25.662848+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:26.663009+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:27.663151+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:28.663313+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188227584 unmapped: 2719744 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:29.663495+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:30.663692+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:31.663848+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:32.664188+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:33.664399+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:34.664564+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:35.664708+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:36.664810+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:37.664953+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:38.665092+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:39.665624+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:40.666491+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:41.666708+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:42.667579+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:43.668041+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 2695168 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:44.668476+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188260352 unmapped: 2686976 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:45.668668+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188260352 unmapped: 2686976 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:46.669057+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188260352 unmapped: 2686976 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:47.669441+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188260352 unmapped: 2686976 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:48.669758+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188260352 unmapped: 2686976 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:49.669951+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188268544 unmapped: 2678784 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:50.670133+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188268544 unmapped: 2678784 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:51.670268+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188268544 unmapped: 2678784 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:52.670511+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188268544 unmapped: 2678784 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:53.677253+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:54.677405+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:55.677555+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:56.677901+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:57.678115+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:58.678231+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:17:59.678341+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:00.678532+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188276736 unmapped: 2670592 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:01.678696+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188284928 unmapped: 2662400 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:02.678811+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188284928 unmapped: 2662400 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:03.679025+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188284928 unmapped: 2662400 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 22K writes, 83K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.01 MB/s
                                                          Cumulative WAL: 22K writes, 7722 syncs, 2.86 writes per sync, written: 0.09 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 8846 writes, 31K keys, 8846 commit groups, 1.0 writes per commit group, ingest: 44.37 MB, 0.07 MB/s
                                                          Interval WAL: 8846 writes, 3528 syncs, 2.51 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:04.679156+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188284928 unmapped: 2662400 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:05.679280+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188334080 unmapped: 2613248 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: bluestore.MempoolThread(0x55ab8a4fbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2673815 data_alloc: 285212672 data_used: 3731456
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: osd.1 294 heartbeat osd_stat(store_statfs(0x1ae7a7000/0x0/0x1bfc00000, data 0x5f13f99/0x6107000, compress 0x0/0x0/0x0, omap 0x649, meta 0xb34f9b7), peers [0,2,3,4,5] op hist [])
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'config diff' '{prefix=config diff}'
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'config show' '{prefix=config show}'
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:06.679406+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'counter dump' '{prefix=counter dump}'
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'counter schema' '{prefix=counter schema}'
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188317696 unmapped: 2629632 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:07.679593+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 187850752 unmapped: 3096576 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: tick
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_tickets
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-11-28T10:18:08.679724+0000)
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: prioritycache tune_memory target: 5709084876 mapped: 188243968 unmapped: 2703360 heap: 190947328 old mem: 4047415775 new mem: 4047415775
Nov 28 10:18:39 np0005538515.localdomain ceph-osd[32393]: do_command 'log dump' '{prefix=log dump}'
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1500572328' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1135811599' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: pgmap v821: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3667633455' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/4176004123' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1087429981' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1301512143' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1212911426' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1939566436' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3300551025' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2340776439' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1467414868' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3245672563' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1500572328' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1135811599' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3956784888' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2059175748' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1832128950' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:39 np0005538515.localdomain rsyslogd[758]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Nov 28 10:18:39 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2432312567' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49641 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd utilization"} v 0)
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1527008646' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v822: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1580882676' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49653 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:40 np0005538515.localdomain sudo[328895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69884 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:40 np0005538515.localdomain sudo[328895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:40 np0005538515.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.
Nov 28 10:18:40 np0005538515.localdomain sudo[328895]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49659 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:40 np0005538515.localdomain sudo[328929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Nov 28 10:18:40 np0005538515.localdomain sudo[328929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1176939805' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1832128950' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/4055277002' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3622627622' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2432312567' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2924469151' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1527008646' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3302747548' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/12388310' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1580882676' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1816962051' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Nov 28 10:18:40 np0005538515.localdomain podman[328921]: 2025-11-28 10:18:40.67144307 +0000 UTC m=+0.084318079 container health_status 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, vcs-type=git)
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69890 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:40 np0005538515.localdomain podman[328921]: 2025-11-28 10:18:40.689515346 +0000 UTC m=+0.102390385 container exec_died 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 10:18:40 np0005538515.localdomain systemd[1]: 6c250c88fddb84b6d409058ed0d298c2f5484527eb3f5d819767f1d36839a4bf.service: Deactivated successfully.
Nov 28 10:18:40 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49665 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69902 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49671 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59887 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69914 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59896 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49680 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:41.367 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59908 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain podman[329144]: 2025-11-28 10:18:41.438356159 +0000 UTC m=+0.143861411 container exec 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69926 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain podman[329144]: 2025-11-28 10:18:41.539589147 +0000 UTC m=+0.245094449 container exec_died 98f7091a3e2ea0e9ed1e630f1e98c8fad1fd276cf7448473db6afc3c103ea45d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538515, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_CLEAN=True, release=553, version=7, RELEASE=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main)
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59914 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.49641 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: pgmap v822: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.49653 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.69884 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.49659 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1502712426' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.69890 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1699203364' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49692 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59920 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain systemd[1]: Starting Hostname Service...
Nov 28 10:18:41 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69944 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:41 np0005538515.localdomain systemd[1]: Started Hostname Service.
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4004329074' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:41 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49704 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v823: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59938 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 10:18:42 np0005538515.localdomain sudo[328929]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 10:18:42 np0005538515.localdomain sudo[329373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Nov 28 10:18:42 np0005538515.localdomain sudo[329373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69959 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain sudo[329373]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:42 np0005538515.localdomain sudo[329396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Nov 28 10:18:42 np0005538515.localdomain sudo[329396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "versions"} v 0)
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2192705607' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49719 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59959 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69977 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.49665 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.69902 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.49671 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.59887 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.69914 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.59896 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.49680 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.59908 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.69926 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.59914 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.49692 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.59920 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2583937855' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/4004329074' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2050263147' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/2192705607' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3570554908' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Nov 28 10:18:42 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4269415353' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:42 np0005538515.localdomain sudo[329396]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59974 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.69998 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm INFO root] Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [INF] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] update: starting ev 982980e4-f1a8-42d4-987b-c67300c2c4f3 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] complete: finished ev 982980e4-f1a8-42d4-987b-c67300c2c4f3 (Updating node-proxy deployment (+3 -> 3))
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Completed event 982980e4-f1a8-42d4-987b-c67300c2c4f3 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/327747030' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain sudo[329547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Nov 28 10:18:43 np0005538515.localdomain sudo[329547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Nov 28 10:18:43 np0005538515.localdomain sudo[329547]: pam_unix(sudo:session): session closed for user root
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/663276669' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.59992 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.69944 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.49704 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: pgmap v823: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.59938 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.69959 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.49719 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.59959 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.69977 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2928277428' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/4269415353' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2018896911' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/233060911' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/327747030' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/663276669' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:43 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:43 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:43.809 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v824: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "config dump"} v 0)
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/313248130' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49779 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.70061 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.59974 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.69998 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.59992 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3969960704' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/1533518817' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/313248130' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3322486298' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2196629395' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Nov 28 10:18:44 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4007421546' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.60064 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "df"} v 0)
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1868358059' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: pgmap v824: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.49779 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.70061 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/4007421546' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/3575363919' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1868358059' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/3429851138' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/2831850694' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "fs dump"} v 0)
Nov 28 10:18:45 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1835945986' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(cluster) log [DBG] : pgmap v825: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Nov 28 10:18:46 np0005538515.localdomain ceph-mgr[286188]: [progress INFO root] Writing back 50 completed events
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "fs ls"} v 0)
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/84128671' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain nova_compute[280168]: 2025-11-28 10:18:46.403 280172 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 10:18:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.49812 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:46 np0005538515.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 28 10:18:46 np0005538515.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 28 10:18:46 np0005538515.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 28 10:18:46 np0005538515.localdomain kernel: cfg80211: failed to load regulatory.db
Nov 28 10:18:46 np0005538515.localdomain ceph-mgr[286188]: log_channel(audit) log [DBG] : from='client.70112 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='client.60064 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/1835945986' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/2490676851' entity='client.admin' cmd={"prefix": "df"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.106:0/754854177' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.108:0/84128671' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: from='client.? 172.18.0.107:0/1333046251' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Nov 28 10:18:46 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.153361) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127153413, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 1044, "num_deletes": 253, "total_data_size": 1189878, "memory_usage": 1216896, "flush_reason": "Manual Compaction"}
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127160799, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 781055, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37429, "largest_seqno": 38468, "table_properties": {"data_size": 775772, "index_size": 2498, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14794, "raw_average_key_size": 21, "raw_value_size": 763937, "raw_average_value_size": 1113, "num_data_blocks": 102, "num_entries": 686, "num_filter_entries": 686, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764325092, "oldest_key_time": 1764325092, "file_creation_time": 1764325127, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "75e61b0e-4f73-4b03-b096-8587ecbe7a9f", "db_session_id": "7KM5GJAJPD54H6HSLJHG", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 7465 microseconds, and 2253 cpu microseconds.
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.160833) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 781055 bytes OK
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.160849) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.162494) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.162519) EVENT_LOG_v1 {"time_micros": 1764325127162512, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.162541) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 1183913, prev total WAL file size 1183913, number of live WAL files 2.
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538515/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: (Original Log Time 2025/11/28-10:18:47.163182) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353438' seq:72057594037927935, type:22 .. '6B760031383032' seq:0, type:0; will stop at (end)
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(762KB)], [60(18MB)]
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764325127163227, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 20115815, "oldest_snapshot_seqno": -1}
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: mon.np0005538515@2(peon) e15 handle_command mon_command({"prefix": "mds stat"} v 0)
Nov 28 10:18:47 np0005538515.localdomain ceph-mon[301134]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1639533449' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
